
Virtis-Opis Beta Testing


Presentation Transcript


  1. Virtis-Opis Beta Testing Todd S. Thompson, PE South Dakota DOT Office of Bridge Design August 3, 2011

  2. How does this work? • So you want to be a Beta Tester? • Or maybe you are a Beta Tester? • How do I fit in the development process? • What is Beta testing? • What is Alpha testing? • Am I in the wrong presentation?

  3. AASHTOWARE • Standards and Guidelines http://www.aashtoware.org/Documents/07012010%20SnG%20Notebook.pdf • Policies, Guidelines and Procedures http://www.aashtoware.org/Documents/PGP-October2009final.pdf

  4. Lifecycle

  5. Lifecycle

  6. Testing & More Testing • Contractor Testing • Unit • Build • System • Alpha • Task Force Testing (Beta TAG) • Beta

  7. Contractor Testing • Unit testing • Is performed on each class/object or program module to reduce the complexity of testing the entire system. • Build testing • Is the means used for testing a component that is made up of lesser components (units or other builds).
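
A rough illustration of what a unit test looks like in practice: the sketch below exercises one small, hypothetical helper function in isolation. The function name and values are illustrative only, not Virtis/Opis code.

    # Minimal unit-test sketch (hypothetical function and values, not Virtis/Opis code).
    # A unit test exercises one small piece of the program in isolation.

    def distribution_factor(spacing_ft: float, denominator_ft: float = 5.5) -> float:
        """Hypothetical helper: an S/5.5-style wheel-load distribution factor."""
        return spacing_ft / denominator_ft

    def test_distribution_factor():
        # One focused assertion per behavior keeps failures easy to localize.
        assert abs(distribution_factor(7.0) - 7.0 / 5.5) < 1e-9

    if __name__ == "__main__":
        test_distribution_factor()
        print("unit test passed")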

  8. Contractor Testing • System testing • Is used to test the system as a whole and to ensure that the integration is completed and that the system performs as required. • Ensures that all user and system requirements are met • Allows the contractor to prepare for alpha testing and ensures that the product is complete and ready for formal alpha testing and the subsequent acceptance.

  9. Contractor Testing • Alpha testing • Covers the same system and system components as System Testing. • The emphasis is, however, on breaking the system, checking the user and system requirements, and reviewing all documentation for completeness by using the application as if it were in production.

  10. Contractor Testing • Alpha testing complete • Contractor documents that everything is complete, tested, and meets all requirements • Task Force accepts and approves the start of Beta Testing

  11. Beta Testing • Beta TAG role • Confirm operational and functionality requirements are satisfied • Operates in users’ environment • Review of documentation • Confirm ready to implement

  12. Beta Testing • Beta TAG Chair/Contractor • Confirm all the various user environments are covered • Operating Systems • Databases • Design vs Rating
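
One simple way to make sure the combinations are covered is to enumerate them as a matrix and assign a tester to each cell. A minimal sketch with placeholder environment names (not an official supported-platform list):

    # Sketch of a beta-test coverage matrix (placeholder environment names).
    from itertools import product

    operating_systems = ["Windows XP", "Windows 7"]           # placeholders
    databases = ["Oracle", "SQL Server", "Sybase", "Access"]  # placeholders
    modes = ["Design", "Rating"]

    for os_name, db, mode in product(operating_systems, databases, modes):
        # In practice each combination would be assigned to a beta tester.
        print(f"{os_name:12s} | {db:10s} | {mode}")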

  13. Beta Testing Cycle • Installation problems are reported • Contractor resolves installation issues and re-distributes material, as necessary • Testers perform their work and report problems • Contractor validates and fixes problems

  14. Beta Testing Process • Test using YOUR bridges • Report bugs to the Support Center • Confirm bugs were fixed in subsequent release(s) • Recommend (or not) that the Acceptance Release is ready for production

  15. What to test? • Test a variety of your typical bridges • Migrate your database • A copy of your production DB? • And/or a copy of your “testing DB”? • Test just as if you were going to migrate to the next version in production

  16. What to test? • Confirm bridges with valid results are still valid • Review new enhancements and confirm they work

  17. Test or Check? • Is there a difference? • What is the difference between testing software and checking software?

  18. Checking Software • Is confirming existing beliefs • Is a validation or verification • Is often done by programmers as they work – making sure they don’t break something • Is focusing on making sure the program doesn’t fail

  19. Testing Software • Something done with the motivation to learn something new • A process of exploration, discovery, investigation, and learning • Trying to find limits • Trying to find everything about the program that works and does NOT work

  20. Checking vs Testing • Checking provides a yes or no • Checking answers the question “Does this assertion pass or fail?” • Testing is open ended • Testing is “Is there a problem here?”

  21. Checking vs Testing • Checking doesn’t let us know if the program works – it only lets us know it’s working within the scope of our expectation(s) • Testing is, in part, finding out if our checks are good enough

  22. Checking vs Testing • Testing involves some checking, but if it only involves checking, it will be poor testing • Testing involves curiosity and exploration and can’t be automated – it involves human intervention

  23. Checking vs Testing • If we find a problem in testing, the checks need to be revised so that the problem doesn’t happen again • Automated testing is really automated checking • Any “testing” that is looking for a yes/no or an expected result is actually checking
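
To make the distinction concrete, an automated "check" is just a fixed assertion against an expected value; it answers pass/fail for the one expectation we encoded and nothing more. A minimal sketch, using a hypothetical expected rating factor:

    # An automated "check": a fixed assertion against an expected result.
    # It can only answer "pass or fail" for the expectation we encoded;
    # it cannot notice a problem we did not think to assert about.

    EXPECTED_INV_RF = 1.09   # hypothetical expected inventory rating factor
    TOLERANCE = 0.02         # acceptable relative difference

    def check_rating_factor(computed_rf: float) -> bool:
        return abs(computed_rf - EXPECTED_INV_RF) / EXPECTED_INV_RF <= TOLERANCE

    print(check_rating_factor(1.087))   # True  -> the check passes
    print(check_rating_factor(0.95))    # False -> the check fails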

  24. Find a bug? • Document any and all errors or bugs you find

  25. Documenting • Confirm your problem/issue • Provide the bridge XML • Explain how it happens • Screenshots of error or problem • Explain what it should be • Engine, DB, OS, etc
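
One way to keep reports consistent is to capture these items in a fixed structure before sending them to the Support Center. The field names below are illustrative only, not the Support Center's actual form:

    # Illustrative bug-report structure; field names are hypothetical.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class BetaBugReport:
        summary: str                 # what happened vs. what it should be
        steps_to_reproduce: str      # explain how it happens
        bridge_xml_file: str         # path to the exported bridge XML
        screenshots: List[str] = field(default_factory=list)
        engine: str = ""             # analysis engine used
        database: str = ""           # e.g. Oracle / SQL Server
        operating_system: str = ""   # e.g. Windows 7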

  26. Test New Features • Review new Enhancements • Plan out what you can test • E.g. if you don’t have floor trusses – don’t plan on testing that portion.

  27. Global Testing • Check Final results of various bridges • I have various groups of bridges that have “accepted” results • Compare those RF’s with new release(s) • Use of spreadsheet(s) to quickly compare results • Look at the “final” result and compare
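
The spreadsheet comparison can also be scripted. The sketch below assumes the rating factors from each release have been exported to CSV files (hypothetical file and column names) and flags any bridge whose rating factor moves by more than a chosen tolerance:

    # Sketch: compare rating factors exported from two releases.
    # Assumes CSV files with hypothetical columns "bridge_id" and "inv_rf".
    import csv

    def load_rfs(path: str) -> dict:
        with open(path, newline="") as f:
            return {row["bridge_id"]: float(row["inv_rf"]) for row in csv.DictReader(f)}

    def compare(old_path: str, new_path: str, tol: float = 0.02) -> None:
        old, new = load_rfs(old_path), load_rfs(new_path)
        for bridge_id in sorted(old.keys() & new.keys()):
            diff = abs(new[bridge_id] - old[bridge_id]) / old[bridge_id]
            flag = "CHECK" if diff > tol else "ok"
            print(f"{bridge_id}: {old[bridge_id]:.3f} -> {new[bridge_id]:.3f} ({diff:.1%}) {flag}")

    # compare("rf_v62.csv", "rf_v63.csv")   # hypothetical file names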

  28. Global Testing • Quick and fairly easy way to feel comfortable with new release • Your bridges • Your results • Good comparison • Identify potential problems

  29. Global Testing • Compare final LF rating results with the previous version • Brass LF 6.2 vs AASHTO LF 6.3 • HS 20 • INV RF = 1.108 (Version 6.2) • INV RF = 1.087 (Version 6.3) • Only a 2% difference – say OK

  30. Global Testing • Life is easy when everything matches • But what to do when you get differences? • Confirm same bridge data (same input) • Confirm bridge properties • Confirm same load effects (DL and LL) • Confirm capacities • Confirm rating factors (if load rating)

  31. Global Testing • When you get differences – you need your detective hat • Think about what input affects the results and where the difference could be found • Spec Change? • Corrected previous error/bug?

  32. Global Testing • The more data you can confirm, the better you can isolate the difference • This greatly reduces the time the developer needs to figure out what and where the problem is and deliver a successful fix.
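
One way to confirm the data systematically is to compare stage by stage (dead load, live load, capacity, rating factor) and stop at the first mismatch, since everything downstream of it is suspect. A minimal sketch with hypothetical stage values:

    # Sketch: isolate where two releases first diverge by comparing stage by stage.
    # Stage names and values are hypothetical.
    from typing import Optional

    def first_divergence(old: dict, new: dict, tol: float = 1e-3) -> Optional[str]:
        for stage in ["dead_load_moment", "live_load_moment", "capacity", "rating_factor"]:
            if abs(old[stage] - new[stage]) > tol * max(abs(old[stage]), 1.0):
                return stage
        return None

    old = {"dead_load_moment": 512.0, "live_load_moment": 884.0, "capacity": 2450.0, "rating_factor": 1.108}
    new = {"dead_load_moment": 512.0, "live_load_moment": 901.0, "capacity": 2450.0, "rating_factor": 1.087}
    print(first_divergence(old, new))   # -> "live_load_moment": look at LL distribution or spec changes first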

  33. “Local” Testing • Looking into the deep and dirty details of the analysis • Comparing Moments, Shears, Deflections • Comparing M, V, & Service Capacities • Reviewing Specifications and their details • Reviewing other details as necessary • A very time-consuming effort at times

  34. Enhancement Testing • Testing out new features in the application • FY 2011 Work Plan • FP1 User Enhancements • Truss LL Distribution Factor

  35.–41. Truss LL DF (figure-only slides)

  42. Truss LL DF • Compare LL DF’s • Hand Calcs vs Virtis Calcs • Single Lane M LL DF • 0.865 vs 0.87 • Multi Lane M LL DF • 1.7725 vs 1.77
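
The hand calculations carry more digits than the program reports, so the comparison has to allow for display rounding. A short sketch using the values from this slide and a 1% relative tolerance:

    # Compare hand-calculated LL distribution factors with the program's reported values,
    # allowing for display rounding (values taken from the slide above).
    import math

    pairs = [
        ("Single-lane moment LL DF", 0.865,  0.87),
        ("Multi-lane moment LL DF",  1.7725, 1.77),
    ]

    for label, hand, program in pairs:
        ok = math.isclose(hand, program, rel_tol=0.01)   # 1% relative tolerance
        print(f"{label}: hand {hand} vs program {program} -> {'agree' if ok else 'CHECK'}")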

  43. Truss LL DF • Hand Calcs vs Virtis Calcs • Both used AASHTO 3.12.1 for multi lane reduction factors • Both used AASHTO 3.6.2 for 12 ft lanes • Both used 2 ft from curb line to first Wheel • Everything agrees with hand calcs • Accept this Enhancement as correct
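
For reference, the 3.12.1 multi-lane reduction scales the live load by the number of loaded lanes. The sketch below encodes the percentages as commonly cited (100% for one or two lanes, 90% for three, 75% for four or more); verify against the specification before relying on them:

    # Multi-lane reduction factors per AASHTO Standard Specifications 3.12.1
    # (as commonly cited: 1-2 lanes 100%, 3 lanes 90%, 4+ lanes 75%; verify against the spec).

    def multilane_reduction(loaded_lanes: int) -> float:
        if loaded_lanes <= 2:
            return 1.00
        if loaded_lanes == 3:
            return 0.90
        return 0.75

    for lanes in range(1, 5):
        print(lanes, multilane_reduction(lanes))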

  44. Other testing areas • Reports • Data Correct • Labels match data • Missing data • Import/Export • Bridge and Library XML files • Conversion works ok, if necessary • Bridge rates ok after migration – same results
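
A quick sanity check on export/import is to diff the exported bridge XML before and after the round trip. The sketch below does a coarse element-by-element comparison with Python's ElementTree and hypothetical file names; a reported difference is a prompt to investigate, not proof of a bug:

    # Coarse comparison of two exported bridge XML files (hypothetical file names).
    import xml.etree.ElementTree as ET

    def elements(path: str):
        # Flatten the tree into (tag, attributes, text) tuples for a simple comparison.
        return [(el.tag, tuple(sorted(el.attrib.items())), (el.text or "").strip())
                for el in ET.parse(path).iter()]

    def compare_xml(before: str, after: str) -> None:
        a, b = elements(before), elements(after)
        if len(a) != len(b):
            print(f"Element count differs: {len(a)} vs {len(b)}")
        for old, new in zip(a, b):
            if old != new:
                print("differs:", old, "->", new)

    # compare_xml("bridge_pre_migration.xml", "bridge_post_migration.xml")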

  45. Other testing areas • Administration • Adding/Removing users • Revising their permissions (groups) • Other Configuration revisions • Libraries
