This paper explores the use of evolutionary algorithms to systematically guide the search for test data within equivalence classes. By leveraging the concept of equivalence classes in test case design, we aim to enhance the test methodology, improving test coverage while reducing the number of defined test cases. The proposed solution focuses on automating the derivation of test data and systematically exposing errors. Key applications include automated requirement violation detection and optimization of test data. The resulting methodology addresses complexity challenges and promotes efficient testing practices.
Using evolutionary algorithms to guide search for test data within equivalence classes Felix Lindlar
Tester: "…accelerate the car to autobahn speed, wait for a short period of time and then…" Autobahn speed = 240 km/h? 300 km/h? 350 km/h?
Problem Description • Testers often think in equivalence classes when designing test cases • Selection of appropriate test cases is based on experience • Often not systematic • Loss of information • Simple examples (from industrial test specifications):
Idea • Using equivalence classes instead of sample values for test case specifications Advantages • Higher test coverage of requirements • Extension of the test methodology • Definition of test cases similar to (higher-level, i.e. weaker) requirements • Fewer test cases have to be defined (higher level of abstraction) Disadvantages • Methodology and tools for handling equivalence classes not yet developed; a new approach is required
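The shift from sample values to equivalence classes can be sketched as follows. This is a minimal illustration only; the class name, bounds, and API are hypothetical and not taken from TPT or the paper:

```python
import random

# Hypothetical sketch: a test parameter modelled as a value range
# (equivalence class) instead of a fixed list of sample points.
class EquivalenceClass:
    def __init__(self, name, low, high):
        self.name = name
        self.low = low
        self.high = high

    def contains(self, value):
        # True if the value is a member of this equivalence class.
        return self.low <= value <= self.high

    def sample(self, rng=random):
        # Any point in the range is a valid representative of the class.
        return rng.uniform(self.low, self.high)

# "Autobahn speed" as a class instead of a single hard-coded value.
autobahn_speed = EquivalenceClass("autobahn speed", 240.0, 350.0)
representative = autobahn_speed.sample()
assert autobahn_speed.contains(representative)
```

Defining the test case against the class rather than one sample preserves the information that any speed in the range is acceptable, which is exactly what a search engine can later exploit.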
Proposed Solution • Idea • Extending the test methodology to cover equivalence classes • Using evolutionary algorithms to guide the search within equivalence classes • Enhancing existing test cases • Advantages • Higher level of automation • Higher data coverage of requirements • Systematic exposure of errors • Systematic search within equivalence classes • Makes equivalence classes manageable • Challenges • Finding use cases (complex/simple functions) • Dealing with complexity (multi-dimensionality) • Generating an objective function from existing analysis mechanisms • Test termination criteria
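A minimal sketch of how an evolutionary search could be guided within equivalence-class bounds. The (1+1) selection scheme, the bounds, and the stand-in objective function are assumptions chosen for illustration, not the author's actual engine:

```python
import random

# Candidates are tuned only WITHIN the equivalence-class bounds; the
# objective rewards getting closer to a requirement violation.
BOUNDS = [(240.0, 350.0), (0.5, 10.0)]  # e.g. speed [km/h], wait time [s]

def objective(candidate):
    # Illustrative stand-in for a real test assessment: smaller means
    # "closer to violating the requirement" (target values invented).
    speed, wait = candidate
    return abs(speed - 333.0) + abs(wait - 2.0)

def mutate(candidate, rng, step=5.0):
    child = []
    for value, (low, high) in zip(candidate, BOUNDS):
        value += rng.gauss(0.0, step)
        child.append(min(max(value, low), high))  # stay inside the class
    return child

def evolve(generations=200, seed=1):
    rng = random.Random(seed)
    best = [rng.uniform(lo, hi) for lo, hi in BOUNDS]
    best_fit = objective(best)
    for _ in range(generations):
        child = mutate(best, rng)
        fit = objective(child)
        if fit < best_fit:  # (1+1) selection: keep the better candidate
            best, best_fit = child, fit
    return best, best_fit
```

Clipping each mutated value back into its class bounds is what keeps the search systematic: every candidate the engine evaluates is still a valid member of the specified equivalence classes.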
Field of Application • Automated search for requirement violations (counter-example generation): no errors found with the traditional approach, but parameter tuning within equivalence classes uncovers faults • Test data optimization (selection of representatives, increasing the quality of test cases): not only violations of requirements but also extreme values (e.g. runtime) are of interest • Generating test data for subsequent test stages: using valuable HiL testing time efficiently, reuse of critical test data identified by the evolutionary engine (MiL test) • Increasing test coverage of requirements • Integrating the idea with TPT, a tool used at Daimler
TPT – A quick introduction • Piketec GmbH • Embedded systems testing • Test modeling • Test automation • Abstraction • Platform independence (MiL, SiL, PiL, HiL)
[Diagram: ECU under test with signals such as gas pedal, brake pedal, and wheel speed, executable on the MiL/SiL/PiL/HiL platforms]
TPT Test Process • Test modeling: the tester creates scenario descriptions and assessment descriptions • Test execution: the TPT Virtual Machine runs scenarios on the SUT via a platform-specific platform adapter, producing data logs • Test assessment: fully automated evaluation of the data logs against the assessment description • Test documentation: test results are reported back to the tester
Structural Test of TPT-Assessments • Trying to break requirements (red squares in the CFG of a TPT assessment) • Examples: "Temperature( PartXY ) > 300° Celsius"; "Operation NOT finished after 10s" • Evolutionary tuning of parameters within equivalence classes to cover desired branches
[Diagram: control flow graph (CFG) of a TPT assessment, structured in levels 1–3]
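A branch-distance style objective of the kind such structural tuning could use, shown for an assessment like "Temperature( PartXY ) > 300° Celsius". This is a hedged sketch; the function name and threshold handling are illustrative and not TPT's actual mechanism:

```python
# Branch-distance sketch (assumed, for illustration): the closer the
# observed temperature gets to crossing the 300 °C threshold, the
# smaller the fitness, so an evolutionary engine is guided toward the
# requirement-violating branch of the assessment.
THRESHOLD_C = 300.0

def branch_distance(max_temperature_c):
    # 0.0 once the violating branch is reached, positive otherwise.
    return max(0.0, THRESHOLD_C - max_temperature_c)
```

A run that peaks at 250 °C yields a distance of 50.0, while any run reaching the threshold yields 0.0; minimizing this value over the equivalence-class parameters drives the search toward covering the desired branch.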