Open Test Framework  
Unit Tests with OTX

With the OTX development environment (OTF) you can program in OTX. In order to ensure the quality of the programmed OTX code, so-called unit tests can be programmed in OTX. This is the task of the OTX UnitTest Extension.

Note: The task of unit tests is to safeguard the programmed test logic by checking it against the expected behavior.

The following main functions are supported by the OTX UnitTest Extension inside OTF:

  • Tests can be executed inside and outside the OTF: inside the OTF in the Test Explorer, and outside with a stand-alone application with or without a GUI (console)
  • Tests can be run in parallel
  • Support of data driven tests
  • Supports selective running of test cases inside OTF (Test Explorer)
  • Creation and comparison of human-readable test reports inside OTF
  • Enables implementing test-driven development
  • Allows writing & running independent tests
  • Supports the complete simulation of the environment, including the vehicle communication (No hardware needed)
  • Supports parametrizable Test Procedures
  • Supports easy to write TestCases for the expected behavior
  • Checking of expected exceptions
  • Support of Assertions, Assumptions, etc.

This section gives a brief overview of the fundamentals of software testing. For a deeper insight into the topic of "software testing", please refer to the relevant specialist literature, e.g. the ISO/IEC/IEEE 29119 series on software testing.

  1. Basics of Software Testing
  2. Writing a Unit Test in OTX
  3. Checking Expected Behavior
  4. Environment Simulation
  5. PDU Simulation
  6. Test Execution


Note: The OTX UnitTest Extension was developed by EMOTIVE and is not part of the OTX standard. However, this has no impact on the standard compliance of the delivered OTX sequences, as the unit test cases are not saved together with the OTX test logic.

Note: The UnitTest Extension has no influence on the normal OTX test logic and is stored separately from it.

Basics of Software Testing

The so-called "test pyramid" shows the different types of software verification.

There are essentially the following types of tests. The higher you get in the pyramid, the more expensive the tests become and the longer they take.

  1. Unit Test
    The unit tests test the smallest possible, independent component.
  2. Integration tests
    Integration tests test the interaction between several components in a system, usually based on specific use cases such as flashing or coding.
  3. End-to-end tests (E2E)
    End-to-end tests test the entire system in an environment that is as realistic as possible.

A Unit Test tests the functionality of the smallest units of a software. The aim is to check each individual component independently, isolated from other data and external influences, after each change, and thus to ensure the quality of the software.

Note: Unit tests are also known as regression tests.

A good unit test follows the so-called FIRST principle, see table below. It has to be fast and independent. It must always lead to the same result when repeated. It must be self-validating, meaning it must either pass or fail, and it is best to write the test before the implementation.

Requirement Description
F-ast Fast test execution to run as often as possible
I-ndependent Independent to parallelize test execution
R-epeatable Must always lead to the same result when repeated
S-elf-Validating A test must either pass or fail
T-imely It is best to write the test before implementing the code (Test Driven Development)

Note: Before writing a test, think about what really should be tested! Don't write senseless tests!
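
The FIRST rules are language-independent. As an illustrative sketch outside of OTX (Python is used here purely as an analogy; all names are hypothetical), a repeatable, self-validating test removes every source of non-determinism, e.g. by seeding randomness:

```python
import random

def shuffle_deck(seed):
    """Toy function under test: returns a shuffled deck of 52 cards."""
    deck = list(range(52))
    rng = random.Random(seed)  # isolated RNG instead of global state
    rng.shuffle(deck)
    return deck

def test_shuffle_is_repeatable():
    # R-epeatable: a fixed seed yields the same result on every run
    assert shuffle_deck(seed=42) == shuffle_deck(seed=42)

def test_shuffle_keeps_all_cards():
    # S-elf-validating: the assertion either passes or fails,
    # no human interpretation of the output is needed
    assert sorted(shuffle_deck(seed=42)) == list(range(52))

test_shuffle_is_repeatable()
test_shuffle_keeps_all_cards()
```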

Test Methods for Unit Tests

There are essentially two test methods for unit tests:

  1. Black Box Test
    The specification-oriented test method is also known as “black box test”, since it always tests against the specification. Black box tests are robust against changes and offer good protection against regression, but they always have to be created manually.
  2. White Box Test
    The white box test always looks at the internal structure of the code. In contrast to black box tests, white box tests can be generated automatically. But they are very vulnerable to changes.

Equivalence Classes

In OTX, mostly black box tests are used. To write a good unit test, the following two steps should be taken into account:

  1. Equivalence Classes
    Create classes of value ranges for the parameters for which the same behavior is expected, the so-called equivalence classes. Then a representative for each class must be found and a test case must be written for it. The goal is high test coverage with as few test cases as possible.
  2. Border Analysis
    The border analysis tests the borders of the equivalence classes, since errors occur more often at the borders.
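
The two steps can be sketched in code (an illustrative Python analogy, not OTX; the int_div helper, which emulates OTX's truncating integer division, is an assumption):

```python
# Illustrative sketch (not OTX): choosing test inputs from equivalence
# classes plus the borders of the value range (boundary analysis).

INT64_MAX = 2**63 - 1   # assumed 64-bit Integer range borders
INT64_MIN = -2**63

def int_div(dividend, divisor):
    # Hypothetical function under test; emulates truncating integer
    # division (the remainder is always cut off, rounding toward zero).
    quotient = abs(dividend) // abs(divisor)
    return quotient if (dividend >= 0) == (divisor >= 0) else -quotient

# One representative value per equivalence class ...
equivalence_classes = {
    "normal division":         (10, 2, 5),
    "negative division":       (-10, 2, -5),
    "division with remainder": (11, 3, 3),
    "division from zero":      (0, 7, 0),
}
# ... plus the borders, where errors occur more often.
border_cases = {
    "max dividend": (INT64_MAX, 2, 4611686018427387903),
    "min dividend": (INT64_MIN, 2, -4611686018427387904),
}

for name, (a, b, expected) in {**equivalence_classes, **border_cases}.items():
    assert int_div(a, b) == expected, name
```

High coverage is reached with a handful of representative values instead of testing every possible input.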

Writing a Unit Test in OTX

To write a test, the UnitTest extension provides the following elements, among others:

Element  Description
TestProcedure A TestProcedure is the main element of a test and represents a single test step
parallelizable The parallelizable attribute marks the TestProcedure as "can be executed in parallel"
repeat The repeat attribute specifies how many additional times a TestProcedure should be executed (Default = 0)
retry The retry attribute specifies how many times a TestProcedure should be retried in case of a failure (Default = 0)
timeout The timeout attribute specifies a timeout for the TestProcedure execution (Default = no timeout)
TestCase The TestCase specifies the selected parameters and the expected results of a TestProcedure
expected With expected the expected value of a certain out parameter can be set inside a TestCase
exception With exception the TestCase expects a certain Exception during test execution. Exception inheritance will be taken into account.
ValueList With the ValueList a TestCase parameter can be set to an arbitrary list of possible values. For each value in the list an implicit TestCase will be created, see sequential attribute.
Range With a Range a TestCase parameter can be set to a range of possible values (MinValue, MaxValue, Step). For each value in the range an implicit TestCase will be created, see sequential attribute.
ignored The ignored attribute marks the TestCase as ignored for the TestProtocol, see Test Protocol Explorer (Default = not ignored)
sequential A TestCase marked as sequential reduces the number of implicitly generated TestCases, see ValueList and Range. No combinations of TestCases are generated if multiple ValueList or Range elements are present in a TestCase (Default = not sequential).
errorMessage With errorMessage a certain message can be assigned to a TestCase, which will be included in the report if the TestCase fails (Default = empty)
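
The expansion of ValueList and Range parameters into implicit TestCases, and the effect of the sequential attribute, can be sketched as follows (an illustrative Python analogy, not the OTX runtime; the expand helper is hypothetical):

```python
# Illustrative sketch (not OTX): expanding parameter value lists into
# implicit test cases, with and without the `sequential` attribute.
from itertools import product

def expand(params, sequential=False):
    """params maps a parameter name to its list of candidate values."""
    names = list(params)
    if sequential:
        # sequential: values are paired position by position,
        # no cross-combinations are generated
        return [dict(zip(names, combo)) for combo in zip(*params.values())]
    # default: one implicit test case per combination of all lists
    return [dict(zip(names, combo)) for combo in product(*params.values())]

value_lists = {"dividend": [10, -10, 0], "divisor": [1, -1]}
assert len(expand(value_lists)) == 6                 # 3 x 2 combinations
assert len(expand({"dividend": [10, -10], "divisor": [1, -1]},
                  sequential=True)) == 2             # paired, not combined
```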

The following example shows the test of an integer division with the OTX UnitTest extension in OTL syntax.

// Mark the following procedure as test procedure
[Test]
// Test case: Normal Division
[TestCase(dividend = 10, divisor = 2, expected quotient = 5)]
// Test case: Division with a negative sign
[TestCase(dividend = -10, divisor = -2, expected quotient = 5)]
[TestCase(dividend = -10, divisor = 2, expected quotient = -5)]
[TestCase(dividend = 10, divisor = -2, expected quotient = -5)]
// Test case: Division with remainder
[TestCase(dividend = 11, divisor = 3, expected quotient = 3)]
// Test case: Division at range borders
[TestCase(dividend = 9223372036854775807, divisor = 2, expected quotient = 4611686018427387903)]
[TestCase(dividend = -9223372036854775808, divisor = 2, expected quotient = -4611686018427387904)]
// Test case: Division from zero
[TestCase(dividend = 0, divisor = ValueList(1, -1, -9223372036854775808), expected quotient = 0)]
// Test case: Division with zero
[TestCase(dividend = ValueList(10, -10, 0), divisor = 0, exception ArithmeticException)]
// Ignored test case
[TestCase(sequential, ignored, dividend = Range(-10, 10, 1), divisor = 0, exception ArithmeticException, errorMessage = "Ignored")]
// The test will be repeated one time (Default = 0)
[Repeat(1)]
// In case of an error the test will be retried one time (Default = 0)
[Retry(1)]
// The test procedure must be finished within 1 second (Default = no timeout)
[Timeout(1000)]
// Can be executed in parallel
[Parallelizable]
// Test procedure with some parameters
IntegerDivision(in Integer dividend, in Integer divisor, out Integer quotient = 0)
{
// Calling the to-be-tested OTX procedure
DoSomething(dividend, divisor, out quotient);
}
// Example for very complicated OTX test logic, here performing a division
private procedure DoSomething(in Integer dividend, in Integer divisor, out Integer quotient = 0)
{
quotient = dividend / divisor;
}

First, an OTX procedure with the input parameters Dividend and Divisor and the output parameter Quotient must be marked as a Test Procedure inside a Unit Test Project. In the OTL code above, this mark is expressed with [Test]. Within the procedure, the division is simply performed.

In the first test case (see [TestCase(...)] in the OTL code above) a normal division is described: 10 divided by 2 with the expected result 5. This corresponds to the Normal Division equivalence class. The second equivalence class is the Negative Division, for which three test cases are needed. The next equivalence class is the Division with Remainder: 11 divided by 3 gives 3 and not 4, since the remainder is always cut off. The next test cases go to the range borders. Then the Division from Zero is checked, which must always result in 0. Finally, it is checked whether an ArithmeticException is thrown when Dividing by Zero.

The attribute [Parallelizable] marks the test case as independent and thus parallelizable.

Note: A Test Procedure is like a normal Procedure and can contain any OTX code.

Checking Expected Behavior

In the previous example, the expected behavior was checked implicitly in the TestCase using the keyword expected. However, the expected behavior can also be tested explicitly within the OTX logic. The following activities are available for this:

Activity  Description
Assert This is an Action. It evaluates a condition. If the condition evaluates to false, the execution of the TestCase is stopped and the test result is set to FAILED.
Assume This is an Action. It evaluates a condition. If the condition of the assumption returns false, the test result of the Test Procedure is set to INCONCLUSIVE. The execution of the test procedure containing this assumption returns to the caller. No node after this node will be executed.
Warning This is an Action. It evaluates a condition. If the condition of the warning returns false, a warning for this Test Procedure is given back to the caller. The warning is written into the test protocol. The execution of the test procedure containing this warning continues with the next node after the warning.
AssertThrows This is a CompoundNode. If the given exception is not thrown inside the flow, the result of the Test Procedure is set to FAILED. The execution of the test procedure containing this assertion returns to the caller. No node after this node will be executed.
Fail This is an EndNode. The test result of the Test Procedure is set to FAILED. The execution of the test procedure returns to the caller. No node after this node will be executed.
Ignore This is an EndNode. The test result of the Test Procedure is set to IGNORED. The execution of the test procedure returns to the caller. No node after this node will be executed.
Inconclusive This is an EndNode. The test result of the Test Procedure is set to INCONCLUSIVE. The execution of the test procedure returns to the caller. No node after this node will be executed.
Pass This is an EndNode. The test result of the Test Procedure is set to PASSED. The execution of the test procedure returns to the caller. No node after this node will be executed.

The following example shows the explicit checking of expected behavior via Assert and AssertThrows:

[Test]
// Test procedure with one TestCase (If no TestCase is specified, the test procedure will be executed once)
TestProcedure1()
{
String response;
// The OTX procedure "DoSomething" should be tested
DoSomething("Hello World", out response);
// Checks the right response to the given request
Assertion.Assert(response == "Sky is blue", "Wrong response");
DoSomething("?", out response);
// Checks the right response to the given request
Assertion.Assert(response == "Don't understand", "Wrong response");
// Checks if the UserException will be thrown
UnitTest.AssertThrows(UserException)
{
DoSomething("", out response);
}
}
// Simple OTX procedure "DoSomething" to show the fundamental behavior
private procedure DoSomething(in String request, out String response) throws UserException
{
if (request == "Hello World")
{
response = "Sky is blue";
}
else if (request == "")
{
throw UserExceptionCreate("WrongRequest", "I can only understand \"Hello World\"");
}
else
{
response = "Don't understand";
}
}

Test Results

Each TestCase execution results in one of the following test results. All test results are written into the test protocol, see Test Protocol Explorer.

Test Result  Description
DISABLED The test was disabled because the TestProcedure or the TestCase is marked as DISABLED. No subsequent elements will be executed.
Use case: For incomplete or wrong test cases or test procedures.
PASSED The test was passed. The test procedure execution was successful. This only happens if all expected results of all test cases are fulfilled, all Assert and Assume activities fulfill their conditions, or a Pass end node was executed.
FAILED The test execution failed. This can only happen if an expected result of a test case does not match, an Assert activity does not fulfill its condition, or a Fail end node was executed.
IGNORED The test was ignored. The default result of a test procedure is IGNORED if it is not explicitly changed via a test case, an action, an end node, or a disabled attribute. A test case can be explicitly set to ignored if the TestCase is marked as IGNORED or an Ignore end node was executed.
Conditions:
- Corner Case - This should be used if a known problem occurs, but only under test conditions which are not relevant for practical use cases
- Wrong Specification - If the test case is right but the specification is wrong
- Unstable Test Case - The test case passes when executed stand-alone, but when executed inside the entire test suite it can sometimes (not every time!) fail. This can happen, e.g., if the environment is unstable and the underlying problem is unknown.
Note: A test case which is marked as IGNORED will be executed, but if it fails, the result is IGNORED and not FAILED.
Note: This result has no influence on the overall result of the parent element.
Note: An IGNORED test case should contain an understandable message explaining why it was ignored.
INCONCLUSIVE The test execution was inconclusive. This can only happen if an Assume activity does not fulfill its condition or an Inconclusive end node was executed. The behavior is similar to FAILED, except that an assumption, not an expected result, does not match.
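
The rule that an IGNORED (or DISABLED) result has no influence on the parent's overall result can be sketched like this (an illustrative Python analogy; the precedence FAILED > INCONCLUSIVE > PASSED is an assumption of this sketch, not taken from the documentation):

```python
# Illustrative sketch (not OTX): aggregating child test results into an
# overall result, where IGNORED and DISABLED children are left out.
from enum import Enum

class Result(Enum):
    PASSED = "PASSED"
    FAILED = "FAILED"
    INCONCLUSIVE = "INCONCLUSIVE"
    IGNORED = "IGNORED"
    DISABLED = "DISABLED"

def overall(children):
    # IGNORED/DISABLED children have no influence on the parent result
    relevant = [r for r in children
                if r not in (Result.IGNORED, Result.DISABLED)]
    if not relevant:
        return Result.IGNORED          # nothing relevant was executed
    if Result.FAILED in relevant:
        return Result.FAILED           # assumed precedence: worst wins
    if Result.INCONCLUSIVE in relevant:
        return Result.INCONCLUSIVE
    return Result.PASSED

assert overall([Result.PASSED, Result.IGNORED]) == Result.PASSED
assert overall([Result.PASSED, Result.FAILED]) == Result.FAILED
```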

Environment Simulation

A TestProcedure should, if possible, be self-contained and executable independently of the current environment. The following so-called CallbackProcedures are supported for this purpose, see table below. If a CallbackProcedure is defined for an activity, for example the ConfirmDialog of the HMI extension, then the ConfirmDialog CallbackProcedure, and not the actual activity, is called when simulation is switched on (see StartSimulation and StopSimulation).

Note: Inside the CallbackProcedure, the desired behavior can be simulated using any OTX code.

The following table lists all activities that can be used for environment simulation:

Activity  Type Description
ContextVariableGetValue CallbackProcedure The reading of a ContextVariable can be simulated
StateVariableSetValue CallbackProcedure The setting of a StateVariable can be simulated
ConfirmDialog CallbackProcedure The ConfirmDialog action of the HMI extension can be simulated
InputDialog CallbackProcedure The InputDialog action of the HMI extension can be simulated
ChoiceDialog CallbackProcedure The ChoiceDialog action of the HMI extension can be simulated
ShowDocumentDialog CallbackProcedure The ShowDocumentDialog action of the HMI extension can be simulated
OpenScreen CallbackProcedure The OpenScreen action of the HMI extension can be simulated
CloseScreen CallbackProcedure The CloseScreen action of the HMI extension can be simulated
HighlightScreen CallbackProcedure The HighlightScreen action of the HMI extension can be simulated
ExecuteDeviceService CallbackProcedure The ExecuteDeviceService action of the Measure extension can be simulated
CreateProvider CallbackProcedure The CreateProvider action of the ExternalServiceProvider extension can be simulated
GetProperty CallbackProcedure The GetProperty action of the ExternalServiceProvider extension can be simulated
SetProperty CallbackProcedure The SetProperty action of the ExternalServiceProvider extension can be simulated
ExecuteService CallbackProcedure The ExecuteService action of the ExternalServiceProvider extension can be simulated
GetServiceProviderEventValues CallbackProcedure The GetServiceProviderEventValues action of the ExternalServiceProvider extension can be simulated
TerminateService CallbackProcedure The TerminateService action of the ExternalServiceProvider extension can be simulated
DisposeProvider CallbackProcedure The DisposeProvider action of the ExternalServiceProvider extension can be simulated
ParameterChanged CallbackProcedure The change of a parameter (e.g. ScreenParameter or DeviceServiceParameter) can be simulated
RaiseScreenClosedEvent Action Raises an Event inside the related ScreenClosedEventSource
RaiseDeviceEventSource Action Raises an Event inside the related DeviceEventSource
RaiseServiceExecutionFinished Action Raises an Event inside the related ServiceExecutionFinishedEventSource
RaiseServiceProviderEventSource Action Raises an Event inside the related ServiceProviderEventSource
SetCallbackParameterValue Action Used inside the CallbackProcedure: Sets the value of a signature parameter (like a ScreenSignature, DeviceServiceSignature etc.)
GetCallbackParameterValueAsString Term Used inside the CallbackProcedure: Reads the value of a signature parameter (like a ScreenSignature, DeviceServiceSignature etc.)

Note: The simulation of the environment must be started via StartSimulation. The OTX runtime will then call the CallbackProcedure for the related actions. If no CallbackProcedure for the related action exists, a NOP operation will be performed. The simulation can be stopped via StopSimulation.
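
Conceptually, the switch performed by StartSimulation/StopSimulation can be sketched as a callback dispatch table (an illustrative Python analogy, not the actual OTX runtime; all names are hypothetical):

```python
# Illustrative sketch (not OTX): with simulation switched on, a
# registered CallbackProcedure replaces the real activity; if no
# callback exists for an activity, a NOP is performed.

class RuntimeSketch:
    def __init__(self):
        self.simulating = False
        self.callbacks = {}            # activity name -> callable

    def register(self, activity, callback):
        self.callbacks[activity] = callback

    def execute(self, activity, real_impl, *args):
        if self.simulating:
            callback = self.callbacks.get(activity)
            return callback(*args) if callback else None   # NOP
        return real_impl(*args)        # normal activity behavior

rt = RuntimeSketch()
rt.register("ConfirmDialog", lambda message: "YES")  # simulated answer
rt.simulating = True                                 # StartSimulation
assert rt.execute("ConfirmDialog", lambda m: "REAL", "Continue?") == "YES"
assert rt.execute("InputDialog", lambda m: "REAL", "Name?") is None  # NOP
rt.simulating = False                                # StopSimulation
assert rt.execute("ConfirmDialog", lambda m: "REAL", "Continue?") == "REAL"
```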

The following example shows the simulation of the ConfirmDialog action of the HMI extension:

[Setup]
SetupProcedure1()
{
UnitTest.StartSimulation();
}
[Test]
[TestCase(title = "", message = "", messageType = @MessageType:INFO, expected result = @ConfirmationType:YES)]
[TestCase(title = "Test", message = "", messageType = @MessageType:INFO, expected result = @ConfirmationType:CANCEL)]
[TestCase(title = "", message = "Test", messageType = @MessageType:INFO, expected result = @ConfirmationType:CANCEL)]
[TestCase(title = "", message = "Test", messageType = @MessageType:ERROR, expected result = @ConfirmationType:NO)]
[Parallelizable]
TestProcedure1(in String title, in String message, in HMI.MessageType messageType, out HMI.ConfirmationType result)
{
HMI.ConfirmDialog(message, title, messageType, result);
}
[Callback]
confirmDialog ConfirmDialog1(inParam String title, inParam String message, inParam HMI.MessageType messageType, resultParam HMI.ConfirmationType result)
{
Integer sum;
sum = (StringUtil.LengthOfString(title) + StringUtil.LengthOfString(message) + StringUtil.LengthOfString(ToString(messageType))) % 3;
if (sum == 0)
{
result = @ConfirmationType:NO;
}
else if (sum == 1)
{
result = @ConfirmationType:YES;
}
else
{
result = @ConfirmationType:CANCEL;
}
Logging.WriteLog(@SeverityLevel:INFO, NULL, StringUtil.StringConcatenate({"Title = \"", title, "\", Message = \"", message, "\", MessageType = ", ToString(messageType), ", ConfirmationType = ", ToString(result)}));
}
[TearDown]
TearDownProcedure1()
{
UnitTest.StopSimulation();
}

The following more complex example shows the simulation of the OpenScreen action of the HMI extension, together with the detection of parameter changes and the raising of events.

// Screen signature which describes the screen
private HMI.ScreenSignature ScreenSignature1(
term String title,
term String message,
ref Integer progress,
out Boolean result
);
// Global variable
private Integer Progress;
[Setup]
SetupProcedure1()
{
// Starts the simulation. The related callback procedures will be called.
UnitTest.StartSimulation();
}
[Test]
[TestCase(title = "Sky is blue", message = "Hello World", expected result = true)]
[TestCase(title = "Sky is blues", message = "Hello World", expected result = false)]
TestProcedure1(in String title, in String message, out Boolean result)
{
Integer progress;
EventHandling.EventSource screenClosedEventSource;
HMI.Screen screen;
Boolean isRunning;
// Opens a screen
HMI.OpenScreen(screen, ScreenSignature1, {title = title, message = message, progress = progress, result = result}, false);
// Register the screen closed event
screenClosedEventSource = HMI.ScreenClosedEventSource(screen);
// Parallel changing the progress screen parameter and waiting for closing the screen
parallel
{
lane
{
isRunning = true;
// Increment the progress and wait 10 ms
while (isRunning) : WhileLoop1
{
progress = progress + 1;
EventHandling.Sleep(10);
}
}
lane
{
// Waiting until screen is closed
EventHandling.WaitForEvent({screenClosedEventSource});
// After screen closed event received stopping the while loop above
isRunning = false;
}
}
}
[Callback]
openScreen OpenScreen1(inParam String handle, inParam String screenSignatureName, inParam Boolean isModal)
{
String title;
String message;
Boolean result;
Integer progress;
// Gets the screen parameter values
title = UnitTest.GetCallbackParameterValueAsString(handle, "title");
message = UnitTest.GetCallbackParameterValueAsString(handle, "message");
progress = ToInteger(UnitTest.GetCallbackParameterValueAsString(handle, "progress"));
result = ((StringUtil.LengthOfString(title) + StringUtil.LengthOfString(message)) % 2) == 0;
// Polling every 10 ms until progress exceeds the value 9
// Instead of polling the ParameterChanged CallbackProcedure can be also used, see below
while (progress < 10) : WhileLoop1
{
// Reads the value of the progress
progress = ToInteger(UnitTest.GetCallbackParameterValueAsString(handle, "progress"));
// Write into log file
Logging.WriteLog(@SeverityLevel:INFO, NULL, StringUtil.StringConcatenate({"handle = ", handle, ", ScreenSignatureName = ", screenSignatureName, ", title = ", title, ", message = ", message, ", progress = ", ToString(progress), ", result = ", ToString(result)}));
EventHandling.Sleep(10);
}
// Sets the out parameter to a certain value
UnitTest.SetCallbackParameterValue(handle, "result", result);
// Raises the screen closed event
UnitTest.RaiseScreenClosedEvent(handle);
}
[Callback]
parameterChanged ParameterChanged1(inParam String handle, {inParam String name}, {inParam String valueAsString})
{
// The changing of the global variable can be detected by EventHandling.MonitorChangeEventSource
Progress = ToInteger(valueAsString);
Logging.WriteLog(@SeverityLevel:INFO, NULL, StringUtil.StringConcatenate({"handle = ", handle, ", name = ", name, ", valueAsString = ", valueAsString}));
}
[TearDown]
TearDownProcedure1()
{
// Stops the simulation and switches back to the normal activity behaviour
UnitTest.StopSimulation();
}

PDU Simulation

PDU Simulation means simulation of the diagnostic communication, see picture below. The OTF supports a built-in PDU Simulation. For a complete simulation, the diagnostic communication of the DiagCom extension must be simulated independently of the environment. This is the task of the PduSimulation extension. The PduSimulation extension extends OTX to simulate the diagnostic communication at the level of a PDU. PDU means ProtocolDataUnit and denotes the HexByteField of a Request or a Response. This makes it possible to influence the diagnostic communication individually for each test case or to generate errors explicitly.

Note: With the PduSimulation extension a TestCase can be written and executed independently of the underlying vehicle communication. No hardware is necessary!


The following main functions are supported:

  • Supports CAN and DoIP
  • Supports physical and functional addressing
  • Supports VariantIdentification and VariantSelection
  • Automatic generation of TesterPresent
  • BatteryVoltage (KL30) and IgnitionState (KL15) can be set
  • Processes simulation files from DiagRA-S
  • Can import a simulation from a DiagLogging file
  • A simulation can be loaded from a file, modified or completely rebuilt manually within a TestProcedure
  • Wildcards and placeholders can be used (e.g. Request = "36 X1 ..", Response = "76 X1")
  • Multiple responses can be assigned to a request. They are processed in a rolling manner.

Note: The system requirement for using the extension is that the used D-PDU API supports the simulation option.

Note: The extension may only be used within UnitTest projects.
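
The wildcard, placeholder, and rolling-response rules listed above can be sketched as follows (an illustrative Python analogy of the matching idea, not the actual D-PDU API simulation; all names are hypothetical, and ".." is treated here as a single-byte wildcard):

```python
# Illustrative sketch (not OTX): matching a request PDU against a
# pattern with wildcards (".." = any byte) and placeholders ("X1" =
# capture a byte and reuse it in the response).
from itertools import cycle

def match(pattern, request):
    """Returns the placeholder bindings, or None if no match."""
    pat, req = pattern.split(), request.split()
    if len(pat) != len(req):
        return None
    bindings = {}
    for pb, rb in zip(pat, req):
        if pb == "..":                              # wildcard byte
            continue
        if pb.startswith("X"):                      # placeholder byte
            if bindings.setdefault(pb, rb) != rb:
                return None                         # inconsistent reuse
            continue
        if pb != rb:                                # literal byte
            return None
    return bindings

def fill(response, bindings):
    return " ".join(bindings.get(b, b) for b in response.split())

# The block counter X1 is captured from the request and echoed back
bindings = match("36 X1 ..", "36 01 AB")
assert fill("76 X1", bindings) == "76 01"

# Multiple responses for one request are processed in a rolling manner
responses = cycle(["7F 36 78", "76 01"])
assert [next(responses) for _ in range(3)] == ["7F 36 78", "76 01", "7F 36 78"]
```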

The following table lists all related activities:

Activity  Type Description
LoadSimulation Action Loads or imports a PDU simulation from a file
ResetSimulation Action Resets the simulation to the state after loading
Start Action Starts the PDU simulation
Stop Action Stops the PDU simulation
SetVariantName Action Sets the value of ecuVariantName
SetDoIPLogicalAddress Action Sets the logical address of the gateway or DoIP-Edge-Node to the given byte field
SetDoIPVin Action Sets the given string as the VIN for DoIP communication
SetBatteryVoltage Action Sets the current battery voltage in millivolts as a Float value
SetIgnitionState Action Sets the current state of the ignition clamp
SetPdus Action Sets one or more responses to a certain request (can contain wildcards and placeholders)
IsStarted Term Checks whether the simulation is started or not

The following OTL code example shows the simulation of the diagnostic communication via the PduSimulation extension.

[Setup]
SetupProcedure1()
{
if (PduSimulation.IsStarted())
{
PduSimulation.Stop();
}
PduSimulation.ResetSimulation();
PduSimulation.SetVariantName("LL_AirbaUDS", "EV_AirbaAU10BPAAU736_005");
PduSimulation.SetPdus("LL_AirbaUDS", "22 F1 9E", {"62 F1 9E 45 56 5F 41 69 72 62 61 56 57 33 31 53 4D 45 41 55 36 35 78 00"});
PduSimulation.SetPdus("LL_AirbaUDS", "22 F1 A2", {"62 F1 A2 30 30 31 30 33 34"});
PduSimulation.SetPdus(
"LL_AirbaUDS",
"22 F1 AF",
{"62 F1 AF 00 01 00 0B 04 02 50 00 0A 00 08 05 00 00 00 0B 00 08 00 09 03 00 0C 00 1E 04 21 08 00 15 00 08 06 01 00 00 1D 00 1E 02 21 03 00 1F 00 1E 04 20 03 00 20 00 1E 03 14 03 00 23 00 1E 01 27 03 00 24 00 1E 04 06 01 00 32 00 1E 03 24 04 00 33 00 1E 03 19 07 00 34 00 1E 01 05 03 00 35 00 08 01 01 00 00 3C 00 1E 04 19 08 00 3D 00 1E 03 11 04 00 3E 00 1E 03 12 07 00 50 00 1E 03 07 02 00 51 00 1E 03 19 10 00 52 00 1E 03 03 00 00 8D 00 1E 01 14 03 00 CF 00 2E 01 03 01 00 2A 00 1E 01 05 03 01 00 00 2F 01 07 01 12 00 00 01 01 09 00 00 6E 00 01 01 12 00 00 8E 00 1E 01 15 04 00 8C 00 1E 01 18 08 D3 00 00 1E 02 16 03 00 0F 00 1E 06 01 00"}
);
PduSimulation.SetPdus("LL_AirbaUDS", "22 F1 87", {"62 F1 87 39 39 32 39 35 39 36 35 35 44 21"});
PduSimulation.SetBatteryVoltage(12000.0);
PduSimulation.SetIgnitionState(true);
PduSimulation.Start();
}
[Test]
TestProcedure1()
{
String Param_DataRecor1;
DiagCom.ComChannel comChannel1;
String EcuVariantName;
PduSimulation.SetIgnitionState(false);
comChannel1 = DiagCom.GetComChannel("LL_AirbaUDS", NULL, true);
EcuVariantName = DiagCom.GetComChannelEcuVariantName(comChannel1);
DiagCom.ExecuteDiagService(DiagCom.CreateDiagServiceByName(DiagCom.GetComChannel("LL_AirbaUDS", NULL, true), "AUTOSAR_Identification_Read"), {}, {}, NULL, NULL, false, false);
DiagCom.ExecuteDiagService(
DiagCom.CreateDiagServiceByName(DiagCom.GetComChannel("LL_AirbaUDS", NULL, true), "DiagnServi_ReadDataByIdentVWSparePartNumbe"),
{},
{PR_DiagnServi_ReadDataByIdentVWSparePartNumbe.DataRecord.SparePartNumbe_Param_DataRecor = Param_DataRecor1},
NULL,
NULL,
false,
false
);
}
[TearDown]
TearDownProcedure1()
{
PduSimulation.Stop();
}

Test Execution

Once the Unit Test Project has been created, it can be executed in various ways: within the OTF in the Test Explorer, or outside using the included stand-alone application Unit Test Execution. The tests can be integrated into your own CI/CD chain using the Unit Test Execution Console Application.

Using different Test Contexts, the same Unit Test Project can be executed under different environment settings.
