Test Automation for Verifying Software’s Detectability for Rule Violations

Name: Zhishuai Yao

Supervisor: Prof. Jukka Manner

Place: Varian Medical Systems Finland Oy

Outline
  • Overview and background
  • Objectives of the thesis
  • Design and implementation
  • Results and conclusions
  • Q & A
Overview and Background
  • This thesis was done at a company that develops software for radiation therapy in cancer treatment
  • Automated tests are created to verify the error-detecting mechanism (“checking functions”) in the software
Overview and Background
  • Radiation therapy
    • Uses a radiation beam to irradiate the tumor
    • Requires high accuracy in tumor positioning and treatment dosing
Overview and Background
  • Treatment planning system (TPS)
    • A computerized application used to simulate the dose distribution on CT images
    • The variety of inputs to the TPS increases the risk in radiation therapy
Objectives of the Thesis

Implementing the tests aims to:

  • High level: reduce the risk in radiation therapy
  • Low level: eliminate errors in the TPS by verifying the “checking functions” in the application
Design and Implementation
  • Testing target: “checking functions”
    • For each specific rule violation, a checking function throws an error or warning message to notify the user
  • Testing method: “black-box testing”
    • Generate faulty cases that violate every predefined rule and check whether the correct error or warning message is thrown by the “checking function”
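The black-box method described above can be sketched roughly as follows. The rule table, message strings, and `run_checking_function` are hypothetical stand-ins for the actual TPS interfaces, which the thesis does not detail:

```python
# Hypothetical sketch of black-box testing of "checking functions":
# for each predefined rule, build a faulty case that violates it and
# assert that the expected error or warning message is reported.

# Each rule maps to a faulty input and the message the software should raise.
RULES = {
    "dose_above_limit": ({"dose_gy": 999}, "Dose exceeds configured limit"),
    "missing_ct_image": ({"ct_image": None}, "CT image is required"),
}

def run_checking_function(case):
    """Stand-in for the TPS checking functions (treated as a black box)."""
    messages = []
    if case.get("dose_gy", 0) > 100:
        messages.append("Dose exceeds configured limit")
    if case.get("ct_image", "present") is None:
        messages.append("CT image is required")
    return messages

def verify_rule(name):
    """Return True if violating the rule produces the expected message."""
    faulty_case, expected_message = RULES[name]
    return expected_message in run_checking_function(faulty_case)
```

Each predefined rule is exercised by one generated faulty case, mirroring the one-rule-one-violation approach described above.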
Design and Implementation
  • Challenges
    • Understanding each rule (requirement) and finding the proper parameter to violate it
    • Setting the pass/fail criteria for each test
    • Keeping execution time short and the tests reusable (e.g. for regression testing)
Design and Implementation
  • Test procedure:
    • Import prerequisite data
    • Run the checking function on the original data
      • No error or warning should be thrown
    • Modify a specific parameter to violate a rule
    • Run the checking function again
      • The expected error or warning should be thrown
    • Log the result
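The steps above can be sketched as a single test routine. Here `check`, the parameter names, and the message string are hypothetical placeholders for the real TPS checking function and treatment data:

```python
# Hypothetical sketch of the test procedure: run the checking function on
# valid baseline data (no messages expected), modify one parameter to
# violate a rule, run again (the expected message must appear), log result.

def check(data):
    """Stand-in for the TPS checking function; returns raised messages."""
    messages = []
    if data.get("field_size_cm", 10) > 40:
        messages.append("Field size out of range")
    return messages

def run_test_case(original, parameter, faulty_value, expected_message, log):
    # Steps 1-2: baseline run on the original (valid) data.
    if check(original):
        log.append(("FAIL", "baseline data already triggers messages"))
        return False
    # Step 3: modify the specific parameter to violate the rule.
    modified = dict(original, **{parameter: faulty_value})
    # Step 4: run the checking function again and look for the message.
    passed = expected_message in check(modified)
    # Step 5: log the result.
    log.append(("PASS" if passed else "FAIL", expected_message))
    return passed

log = []
result = run_test_case({"field_size_cm": 10}, "field_size_cm", 50,
                       "Field size out of range", log)
```

The baseline run guards against false positives: if the original data already triggers a message, the later violation step would prove nothing.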
Results and Conclusions
  • The automated tests covered 93 rules (requirements) by the time the thesis was finalized (currently more than 120)
Results and Conclusions
  • Defects found by the tests include:
    • The associated warning or error is not shown
    • A non-related warning or error is shown in addition to the correct message
    • Corruption in a data-model dependency rule
    • Some mandatory attributes are not correctly configured in the system
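The first two failure modes above amount to comparing the messages observed from the checking function against the single expected message. A minimal classifier, with hypothetical message strings, might look like:

```python
# Hypothetical sketch: classify a test outcome by comparing the messages
# observed from the checking function with the single expected message.

def classify(expected_message, observed_messages):
    if expected_message not in observed_messages:
        return "missing expected message"        # failure mode 1 above
    if any(m != expected_message for m in observed_messages):
        return "non-related message also shown"  # failure mode 2 above
    return "pass"
```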