Kanbay Incorporated



  1. Kanbay Incorporated Testing a Large-Scale Software Re-engineering Project using an Onsite/Offshore Model

  2. Re-engineering Project • Tax and Financial Services Application • Large and Complex modules • 20+ years old • Maintenance issues • Time factors • Risk Factors • Concurrent Development Conflicts • Increased Testing Requirements • Possible Data Issues

  3. Tax Engine Restructure Project • Generate Program Complexity Metrics • Identify Programs that need Cleanup, Restructure and Modularization (CRM) • Automate the CRM process • Perform White-box testing • Understand Test Coverage • Technology: Cobol, JCL, IDEAL and CA-Datacom/DB
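The complexity-metrics step can be illustrated with a small sketch. This is not the ASG tooling the project actually used: the keyword list, column handling, and CRM threshold below are invented for illustration, approximating a decision-point count over fixed-format COBOL source.

```python
# Illustrative sketch only (the project generated its metrics with ASG tools):
# approximate a complexity score for a fixed-format COBOL program by counting
# decision keywords, then flag programs whose score exceeds a hypothetical
# CRM threshold.
DECISION_KEYWORDS = ("IF ", "EVALUATE ", "PERFORM UNTIL", "WHEN ")

def complexity_score(source: str) -> int:
    score = 1  # one base path through the program
    for line in source.upper().splitlines():
        code = line[6:72]  # areas A/B; skip sequence and indicator columns
        if code.startswith("*"):
            continue  # '*' in column 7 marks a comment line
        score += sum(code.count(kw) for kw in DECISION_KEYWORDS)
    return score

def needs_crm(source: str, threshold: int = 50) -> bool:
    """Flag a program for Cleanup/Restructure/Modularization review."""
    return complexity_score(source) >= threshold
```

A real metrics pass would also weight nesting depth and fall-through PERFORM ranges; this sketch only shows the shape of the triage.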

  4. Project Scope Defined • Scope • Modules changed after Jan 2004 • Modules identified through Metrics generated by ASG Tools • Total Modules = 1133 • Cleanup required for 1133 Modules • Restructure required for 149 Modules • Modularization required for 323 Modules

  5. Testing Due Diligence • Two-week Time Period to conduct Interviews and Review Documentation • Obtain information to optimize the testing timeline and effort • Identify items that could affect the project in either a positive or negative manner • Identify any current Testing practices in use at the site that can be used to expedite testing of the Re-engineered code • Identify any issues that need additional planning or clarification from a testing standpoint; this could involve Documentation issues, Test Data issues, additional Technical issues, etc. • Identify and use Best Practices

  6. Testing Due Diligence Questionnaire • Questionnaire Sections with sample questions • General: Identify the age of the Application • Technical: Identify databases used. Examples: DB2, Oracle, etc. • Documentation: Identify whether System Flow Diagrams or Job Stream Flow Diagrams are available. • Activity: Identify the number of Maintenance requests per month. • Application Support Systems: How are user IDs requested and authorized for access to servers, databases and other systems? • Total of approximately 60 questions

  7. Engagement Objectives • Repeatable processes • Best practices • Automation tools and techniques • Cost effective delivery model

  8. Kanbay and ASG Solution for Tax Restructure Project • Project Goals • Reduce application maintenance costs • Sustain competitive advantage • Reassume application ownership • Testing Objectives • Outsource testing effort • Common solution approach • Use ASG tools • Repeatable Processes • Deliver by Feb 9th • Testing Challenges • Efficient and repeatable process • Collaborative QA + Development Approach • Regression testing • Large Volume of Test Data

  9. ADP Due Diligence was performed to identify current practices that can be used on the Tax Restructure Project, as well as gaps that need a new approach. • Focus Areas • Repeatable processes • Use of ASG tools • Development/QA shared responsibilities • Automate wherever possible

  10. Solution Differentiators • Testing solution approach for the Tax Restructure project • Scalability and flexibility of solution • Lowest cost per test deliverable • Expertise in implementing Testing Solutions for large Financial Institutions • Partnership with tools vendor

  11. Testing Methodology and Project Management • Mix of white-box and black-box testing • Regression Testing using Baseline code • Robust change control to minimize retrofits and ensure their accuracy • Test coverage is ensured using tools and techniques: ASG-SmartTest and ASG-SmartTest TCA • Exploring the use of Smartware to identify pair-wise test data • Exploring the use of ASG-StarTool for editing Test data • Test environment for various types and levels of testing will be established by leveraging Client's Technical Support • Involve all stakeholders to facilitate defect management in terms of process, workflow, and reporting • All test artifacts, plans, and deliverables are signed off by the client • Metrics providing transparency into the quality of the application/product are published at regular intervals • Centralized management, monitoring, and reporting procedures will be jointly established with the client.
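The baseline regression step described above can be sketched as a record-by-record comparison. The function name and data shapes are assumptions for illustration; on the project itself the comparisons ran against mainframe files and CA-Datacom tables using ASG tools.

```python
# Hedged sketch of baseline regression comparison: re-engineered output
# must match the baseline exactly, record for record.
def compare_to_baseline(baseline_records, candidate_records):
    """Return a list of mismatches; an empty list means a 100% match."""
    mismatches = []
    for i, (b, c) in enumerate(zip(baseline_records, candidate_records), start=1):
        if b != c:
            mismatches.append((i, b, c))  # (record number, expected, actual)
    if len(baseline_records) != len(candidate_records):
        # Extra or missing records are also a regression.
        mismatches.append(("record-count", len(baseline_records),
                           len(candidate_records)))
    return mismatches
```

In a batch environment the same idea applies per output file and per updated table: any non-empty mismatch list blocks promotion of the re-engineered program.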

  12. QA Verification Process

  13. Testing for Restructured/Modularized Code

  14. Roles and Responsibilities – Test Manager • Providing test support to the project as a whole • Manage cost, time, quality, risk, scope and resources • Review test strategy / Project plan and Final summary report • High level effort estimation and review of detailed estimation • High level scheduling and work assignments • Review and publish Test Metrics • Timely escalation of issues to the client

  15. Roles and Responsibilities – Test Lead • Test Strategy / Test plan preparation • Task allocation and monitoring • Co-ordination and management of the test team • Review of Test conditions and test cases • Communication with client and query resolution • Preparation of test data guidelines and requirements • Reviewing and guiding test engineers in test execution • Preparation and maintenance of test-ware (Test Cases, Test Data, Test Stubs/Harnesses) in auditable form • Planning and validating tests • Identifying re-tests • Defect tracking and analysis • Test metrics collation • Preparation of final summary report

  16. Roles and Responsibilities – QA Tech Support • Build test environment • Extract data for files and databases • Create, modify and execute JCL • Troubleshoot job-related problems • Promote and demote programs within Changeman • Set up jobs in the scheduler (Control-M) • Back up and restore data from files and tables

  17. Roles and Responsibilities – QA Testers • Prepare test cases and test conditions • Execute Test Cases and Test Scripts • Maintain test execution logs • Log incidents • Report Results • Log individual issues and downtime • Produce and analyze Test Coverage Analysis reports in the Baseline environment • Collaborative testing with the developers

  18. Phase 1 Staff Planning • Phase 1: 06 MAR 2006 through 30 JUN 2006 • QA verification for 330 cleanup-only programs: • 8 hours per program • 8 hours x 330 programs = 2640 hours for Phase 1 QA verifications • 2640 hours / (17 weeks x 40 hours) = 4 QA Testers offshore • Testing for re-engineered code (30 programs): • Estimate 2 programs per job (proc) • 15 jobs x 100 hours for Baseline setup of each job, including determining input data = 1500 hours • 30 programs x 40 hours for Baseline execution, test coverage, saving files = 1200 hours • 30 programs x 40 hours for Re-engineered execution and comparison to Baseline, plus retesting fixes = 1200 hours • 1500 + 1200 + 1200 = 3900 hours • 3900 hours / (17 weeks x 40 hours) = 6 QA Testers • QA/Tech Support staffed at 1 person per environment = 4 QA/Tech Support personnel • QA Team Lead staffed at 1 person per 10 QA Testers • QA Manager staffed at 1 person per 15 QA/Test personnel • Phase 1 will be used to validate estimates, look for ways of doing things better and faster (continuous improvement), and help determine whether more testers can be located offshore for a cost benefit to the client in future phases.

  19. Project Status as of 05/22/2006 • 269 programs in production • Cleaned-up 381 programs • Restructured 1 program • Modularized 47 programs • QA verified 349 programs

  20. Updated Plan for Fiscal 2006 (391 cleanups, 5 restructures, 62 modularizations; March 06 through June 06) • Milestone 1 (03/31/06): Dev: 71 cleanups, 12 modularizations; QA: 51 verifications • Milestone 2 (05/05/06): Dev: 130 cleanups, 15 modularizations, 2 restructures; QA: 150 verifications • Milestone 3 (05/30/06): Dev: 100 cleanups, 14 modularizations, 1 restructure; QA: 100 verifications, 5 tested • Milestone 4 (06/30/06): Dev: 76 cleanups, 22 modularizations, 2 restructures; QA: 76 verifications, 20 tested

  21. Cleanup Verification Highlights • Unique QA Verification Process introduced using ASG tools • Standardized definition of "Cleaned Up Only" code • Successfully completed Proof of Concept for the QA Verification Process • Detailed QA Verification process implemented and refined • Ahead of schedule: Planned 301, Actual 349 • 4 problems reported Post-Production from 269 programs • 1.5% defect rate (an outstanding rate for QA testing) • Reduced time from QA to Production by at least a factor of 10

  22. Enhanced Test Execution Methodology • Data-Driven Testing • 1. Tests are driven by the selection and use of data • 2. Data is selected for testing based on previous ADP experience, with the help of SMEs • 3. A Baseline is created and used throughout for comparison purposes • Verification of Test Coverage • 1. Generate initial TCA reports using baseline code • 2. Review TCA reports with SMEs for adequate coverage • 3. Add/change data and parms to improve coverage • 4. Re-run baseline tests • 5. Get SME approval on test coverage • Execute-After Test • Run the modularized code using baselined input files • Require a 100% match on output files and updated tables • Verify all new COBOL statements in the modularized code are executed, using TCA • Verify any un-executed code is the same between the baseline and modularized code
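The two TCA checks at the end of the slide reduce to set comparisons over per-statement coverage flags. The data shapes below are invented for illustration; the real reports come from ASG-SmartTest TCA.

```python
# Hedged sketch of the execute-after-test checks: coverage is modeled as
# a mapping of statement -> executed flag (True/False), one map per run.
def verify_modularized(baseline_cov: dict, modular_cov: dict) -> list:
    """Return human-readable findings; an empty list means verified."""
    findings = []
    # Check 1: every statement new in the modularized code must execute.
    new_stmts = set(modular_cov) - set(baseline_cov)
    for stmt in sorted(new_stmts):
        if not modular_cov[stmt]:
            findings.append(f"new statement not executed: {stmt}")
    # Check 2: statements common to both versions must keep the same
    # executed/un-executed status, so un-executed code matches the baseline.
    for stmt in sorted(set(baseline_cov) & set(modular_cov)):
        if baseline_cov[stmt] != modular_cov[stmt]:
            findings.append(f"coverage differs from baseline: {stmt}")
    return findings
```

Combined with the 100%-match rule on output files and tables, an empty findings list is what lets a modularized program move forward.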

  23. Pros and Cons of Approach • Pros: • 1. More effective testing using an integrated approach • File/Database comparisons to Baseline (black-box) • TCA Reports used in establishing the Baseline • QA reviews of 'after test' results at the code level (white-box) • 2. Adds more certainty to "code not covered" by data • 3. Reduces risk in moving to Production • Cons: • Requires manual testers with knowledge of COBOL • Requires testers to have TCA familiarity at the COBOL statement level

  24. Project Status as of 09/01/2006 • 647 Cleaned-up programs in production • 29 Modularized programs in production • Cleaned-up 702 programs • Modularized 41 programs • QA verified 690 programs • QA tested 32 programs • Note: The statistics reported above are for a rolling four-week period. Totals to date represent the number of modules completed since the beginning of the project.

  25. Development & QA Effectiveness Metrics

  26. Net Results • Near Zero Defect Rate • Satisfied Client • Increased Testing Presence • E-Filing Projects (Testing new development) • Proposal for Regression Testing Partnership • SAP Testing Assessment • Win-Win Outcome

  27. Kanbay WORLDWIDE HEADQUARTERS 6400 SHAFER COURT ROSEMONT, ILLINOIS USA 60018 Tel. 847.384.6100 Fax 847.384.0500 WWW.KANBAY.COM
