
Trns•port Test Suite Project


Presentation Transcript


  1. Trns•port Test Suite Project Tony Compton, Texas DOT Charles Engelke, Info Tech

  2. Topics • Charlie will provide overview • Genesis of project • Strategic plans for testing • Tony will give the details • Just completed Phase 1 • Phase 2 underway • Next steps

  3. What is the Trns•port Test Suite? The elements needed to perform thorough and affordable testing of all of Trns•port: • Plans – a uniform, integrated test plan for all Trns•port modules (today there are many separate test plans) • Data – a single comprehensive test database covering the entire Trns•port project lifecycle • Tools – enabling automated testing both during development and at agency sites

  4. Project Genesis – the 2001 Strategic Directions Presentation • Theme: Focus on Processes • Process changes are key to support Trns•port’s growing scope without breaking the bank • First Target: Testing • Traditionally, each Trns•port component had developed its own test database and procedures • We now need a single unified test database and methodology • The benefits of this are very large, and surprisingly broad

  5. Original Goals • Develop and package a standard test environment • Database(s), templates and interface files • Tools to reset all to initial states • Create automated regression tests • GUI is hardest, so do it last • Add testing to the installation procedures • Use testing environment for all regular builds during development • Provide tools for agency acceptance testing
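The "tools to reset all to initial states" goal can be pictured with a small sketch. It assumes the packaged test environment (database snapshot, templates, and interface files) ships as a pristine seed directory that is copied into a working area before each run; the directory names and the file-copy approach are illustrative assumptions, not the actual Trns•port reset tooling, which would normally rely on the database's own backup/restore facilities.

```python
"""Sketch of resetting the packaged test environment to its initial state.

Assumes a pristine seed copy of the test database snapshot, templates, and
interface files is shipped with the suite and copied into a working area
before every run. Paths are illustrative, not actual Trns•port locations.
"""
import shutil
from pathlib import Path

SEED_DIR = Path("test_env_seed")   # pristine environment shipped with the suite
WORK_DIR = Path("test_env_work")   # copy that the tests actually run against


def reset_environment() -> None:
    """Discard the working copy and recreate it from the seed."""
    if WORK_DIR.exists():
        shutil.rmtree(WORK_DIR)
    shutil.copytree(SEED_DIR, WORK_DIR)


if __name__ == "__main__":
    reset_environment()
    print("Test environment restored to its initial state.")
```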

  6. Record/Replay Test Method • Prior testing was mostly task driven • Task analyzed, script written, user follows script • Expensive to design and perform • Recording actual use is easier • Looks at sessions, not tasks • User starts the application, performs many typical tasks, recorded by new tools • Replaying is easier still • Tool replays every action in the session, compares results to recorded responses
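A minimal sketch of the replay-and-compare step described on this slide, assuming recorded sessions are stored as JSON lists of request/response entries and that the application under test answers over plain HTTP. The file format, field names, and function name are illustrative, not the actual Trns•port recording format.

```python
"""Replay a recorded session and note where the replay differs.

Assumes each recording is a JSON file holding a list of entries shaped like
{"method", "path", "body", "response_status", "response_body"}; these names
are illustrative, not the actual Trns•port recording format.
"""
import json
import urllib.error
import urllib.request


def replay_session(recording_path, base_url):
    """Re-send every recorded request and collect the steps that deviate."""
    with open(recording_path, encoding="utf-8") as f:
        session = json.load(f)

    deviations = []
    for step, entry in enumerate(session, start=1):
        data = entry["body"].encode() if entry.get("body") else None
        req = urllib.request.Request(base_url + entry["path"],
                                     data=data, method=entry["method"])
        try:
            with urllib.request.urlopen(req) as resp:
                status = resp.status
                body = resp.read().decode(errors="replace")
        except urllib.error.HTTPError as err:
            status = err.code
            body = err.read().decode(errors="replace")

        # Any mismatch is logged for later inspection rather than failing fast.
        if status != entry["response_status"] or body != entry["response_body"]:
            deviations.append({"step": step, "path": entry["path"],
                               "recorded_status": entry["response_status"],
                               "replayed_status": status})
    return deviations
```

A non-empty deviation list is not automatically a failure; as later slides note, a person may still need to judge whether the differences matter.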

  7. Some Potential Applications • Database platform validation • Use proxy to record database traffic • Replay proxy recording to new database • Continual regression testing • Maintain a large library of recordings • After every new build, replay entire library • Agency acceptance tests • Replay recordings made by Info Tech at your own site with a delivered test database • Make your own recordings under old release, replay against new installation

  8. Start with the Servers • Windows programs are hard to record and replay • The user can “talk” to the program in lots of ways • But servers are generally driven by a single connection point to the client • ODBC for databases, HTTP for all others • This is a well defined target for recording • And proxies, a common tool for other purposes, can record as they proxy
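The record-as-you-proxy idea can be sketched as a tiny HTTP reverse proxy that forwards each request to the real server and appends the request/response pair to a log. The upstream address, log path, and log format below are assumptions for illustration; a production recorder would also capture headers, handle HTTPS, and cover ODBC traffic separately.

```python
"""Minimal HTTP recording proxy sketch: forward each request to the real
server and append the request/response pair to a JSON-lines log.
UPSTREAM and LOG_PATH are illustrative assumptions, not Trns•port settings.
"""
import json
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

UPSTREAM = "http://localhost:8080"   # the real application server
LOG_PATH = "session_recording.jsonl"


class RecordingProxy(BaseHTTPRequestHandler):
    def _forward(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length) if length else None

        req = urllib.request.Request(UPSTREAM + self.path, data=body,
                                     method=self.command)
        try:
            with urllib.request.urlopen(req) as resp:
                status, payload = resp.status, resp.read()
        except urllib.error.HTTPError as err:
            status, payload = err.code, err.read()

        # Record the exchange before relaying the response to the client.
        with open(LOG_PATH, "a", encoding="utf-8") as log:
            log.write(json.dumps({
                "method": self.command, "path": self.path,
                "body": body.decode(errors="replace") if body else None,
                "response_status": status,
                "response_body": payload.decode(errors="replace"),
            }) + "\n")

        self.send_response(status)
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    do_GET = do_POST = _forward


if __name__ == "__main__":
    HTTPServer(("localhost", 8000), RecordingProxy).serve_forever()
```

Because everything flows through the server's single connection point, the recorder never needs to know how the user drove the Windows client.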

  9. 2001 Recommendation • Develop phased work plan • Create basic test database, develop test scripts, create ODBC proxy and recorder, create HTTP proxy and recorder, develop automated build and test environment, package end-user test tools, expand test database and scripts • Proceed as aggressively as possible • Benefits will start to accrue very early

  10. Motivation – Improved Quality • Much more extensive regression testing can be performed much more often than would otherwise be practical • Automation ensures repeatability of procedures, documentation of results • Results are immediately visible to all interested parties • Potentially including licensees via web • Likely to improve unit testing before code check-in

  11. Be Optimistic, but Also Realistic • Test database requires deep analysis • GUI automation requires a proxy for a human being – hard to do! • Test data and scripts require careful maintenance • Source management procedures must be uniform to automate • Response to replays can have many minor variations, so test logs require human interpretation

  12. Some Things The Test Suite Won’t Do • Run standard tests against agency data • Recordings can only be replayed against the exact same database to be meaningful • Migrate recordings to new releases • Some recordings will remain valid in new releases, some won’t. New recordings would have to be made in those cases • Always give a “Pass” or “Fail” on a test • Unless replay is identical in every way to the original, a person will have to examine deviations for significance
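The last point, that a strict "Pass" or "Fail" cannot always be given automatically, can be illustrated with a short sketch that filters out known-volatile fragments (timestamps, generated ids) before flagging the remaining differences for a person to review. The volatile patterns here are made-up examples, not the actual rules used by the suite.

```python
"""Classify replay deviations instead of forcing a pass/fail verdict.

Differences caused by known-volatile values (dates, generated ids) are
filtered out; anything left is flagged for a person to examine.
The patterns below are illustrative, not the actual Trns•port rules.
"""
import re

# Fragments that legitimately differ between runs (assumed examples).
VOLATILE_PATTERNS = [
    re.compile(r"\d{2}/\d{2}/\d{4} \d{2}:\d{2}:\d{2}"),   # timestamps
    re.compile(r"session_id=[0-9a-f]+"),                  # session tokens
]


def normalize(text):
    """Blank out fragments that are expected to change on every replay."""
    for pattern in VOLATILE_PATTERNS:
        text = pattern.sub("<VOLATILE>", text)
    return text


def classify(recorded, replayed):
    """Return 'match', or 'needs-review' when a human must judge the diff."""
    if recorded == replayed:
        return "match"
    if normalize(recorded) == normalize(replayed):
        return "match"          # only volatile fields differed
    return "needs-review"       # a person decides whether this is a failure


if __name__ == "__main__":
    old = "Contract 42 updated 01/15/2003 10:22:07"
    new = "Contract 42 updated 06/30/2003 14:05:11"
    print(classify(old, new))   # "match": only the timestamp differs
```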

  13. Progress on the Test Suite • Components • Phases • Processes • Lessons Learned

  14. Trns•port Test Suite Components • Test Plan • Test processes • Test cases and Test Case Repository • Test Database • Spanning all platforms, all modules • Test Automation Framework • Automated execution of regression tests • Learning and replaying tests
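One plausible shape for an entry in the Test Case Repository mentioned above, shown as a sketch; the fields, identifiers, and example rows are assumptions for illustration, not the actual repository schema.

```python
"""Possible shape of a Test Case Repository entry (illustrative sketch)."""
from dataclasses import dataclass


@dataclass
class TestCase:
    case_id: str            # e.g. "PES-0001" (hypothetical numbering)
    module: str             # Trns•port module exercised, e.g. "PES" or "LAS"
    description: str        # what the recorded session covers
    recording_path: str     # recorded session to replay
    database_snapshot: str  # initial test database state to restore first
    automated: bool         # replayable by the framework, or manual-only


REPOSITORY = [
    TestCase("PES-0001", "PES", "Create a project and add items",
             "recordings/pes_0001.jsonl", "snapshots/baseline", True),
    TestCase("LAS-0003", "LAS", "Open a letting and load proposals",
             "recordings/las_0003.jsonl", "snapshots/baseline", True),
]
```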

  15. Multiple Phases Needed • Phase 1 – “Low Hanging Fruit” • General framework • Automated batch testing • HTTP proxy record and playback • Phase 2 – Expand Scope • More of everything in Phase 1 • Investigate ODBC and GUI issues • Phase 3 – Complete Toolset • Covers all planned areas • Not “finished”, always maintain and expand

  16. Phase 1 – October 2002 to June 2003 • Focused scope • Covers only PES and LAS • One contract, five functions per module • Oracle 8i on Windows • Results • Test plan and database • Testing framework • Automated batch testing • HTTP record and playback tools • Accepted by TTF

  17. Phase 2 – April 2003 to December 2004 • Broaden scope • Finish PES/LAS test plans • Cover all other client-server modules • 20 contracts, 10 functions per module • Some high volume test cases • Cover other databases • Results • ODBC proxy prototype • Agency testing tools (verify behavior matches release tests at Info Tech)

  18. Phase 3 • All of Trns•port • Add Estimator, SitePad, etc. • More test data and volume testing • Results • Test database representative of three years of production use • ODBC and GUI testing • Binary data covered • More Agency testing tools (record/playback)

  19. TRT and ITI Process • Several teleconferences per phase • Regular e-mail with attachments • Multiple iterations of draft deliverable documents for review and approval • ITI’s workplan, processes and methodologies also reviewed by TRT • Thorough and detailed documentation delivered • Learning experience

  20. Trns•port Test Suite Project Tony Compton, Texas DOT Charles Engelke, Info Tech
