
Testing and Validation of the NPACKage


Presentation Transcript


  1. Testing and Validation of the NPACKage • NPACI All-Hands Meeting, March 19, 2003 • Marty Humphrey, Assistant Professor, Computer Science Department, University of Virginia

  2. NPACKage Project Structure • Participation across the partnership • UCSB, SDSC, UVA, ISI • Dedicated staff at each site • Software contributors • Activities focused under current NPACI allocation • NPACKage specific activities • Packaging team • Builds team • Testing team • Resource leads • Responsible for local deployment • User Services • Executive Committee • Frequent review of progress

  3. What’s So Tough about Software Test? • Software has historically had poor quality • Even a simple program can have a huge space of possible executions • “Testing is not innovation but rather verification”

  4. Why Does Software Have Bugs? • miscommunication • software complexity • programming errors • changing requirements • time pressures • poorly documented code • bugs in software development tools

  5. Basics of Software Testing • Unsatisfied goals • Find cases where the program does not do what it is supposed to do • Unwanted side effects • Find cases where the program does things it is not supposed to do

  6. Methodology • Define the expected output or result. • Don't test your own programs. • Inspect the results of each test completely. • Include test cases for invalid or unexpected conditions. • Test the program to see if it does what it is not supposed to do as well as what it is supposed to do. • Avoid disposable test cases unless the program itself is disposable. • Do not plan tests assuming that no errors will be found. • The probability of locating more errors in any one module is directly proportional to the number of errors already found in that module.
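The points above translate directly into executable checks. Below is a minimal illustrative sketch using Python's unittest module; the parse_host_entry function is hypothetical (it is not part of the NPACKage), but it shows two of the rules in practice: the expected output is defined before the test runs, and invalid or unexpected inputs get their own test cases.

```python
import unittest

# Hypothetical component under test: a parser for host entries of the form
# "hostname:port" in a testbed inventory file. Invented for illustration
# only; it is not part of the NPACKage software.
def parse_host_entry(entry):
    host, sep, port = entry.partition(":")
    if not host or not sep or not port.isdigit():
        raise ValueError("malformed host entry: %r" % entry)
    return host, int(port)

class TestParseHostEntry(unittest.TestCase):
    def test_expected_output_defined_in_advance(self):
        # Define the expected result before running the test.
        self.assertEqual(parse_host_entry("giis.npaci.edu:2135"),
                         ("giis.npaci.edu", 2135))

    def test_invalid_and_unexpected_conditions(self):
        # Check what the program is *not* supposed to accept.
        for bad in ("", "no-port", "host:", ":2135", "host:abc"):
            with self.assertRaises(ValueError):
                parse_host_entry(bad)

if __name__ == "__main__":
    unittest.main()
```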

  7. Terminology • White box testing • Based on knowledge of internal logic/source code • Black box testing • Not white box testing :^) • Unit testing • To test particular functions or code modules • Integration testing • Testing of combined parts

  8. Terminology (cont.) • Performance / Stress testing • Good for testing scalability • Regression testing • Re-testing after modifications • Acceptance testing • Performed by end-users • Completion criteria • How do you know when you’re done?

  9. Don’t the individual projects test their software? • Yes, of course, but we can do more • Value of centralized testing • Complex interaction tests can be better achieved organizationally, not in isolation • Incomplete testing of any component reflects poorly on the organization (NPACI) • Independent testing can greatly improve software quality

  10. Value Added at UVa • Work closely with other NPACKagers • NPACKage packagers (Larry Miller, UCSB; Bill Link, SDSC) • NPACKage “Builders” and “Testbed Deployers” (Mats Rynge, USC/ISI) • UVa: RH Linux 7.3 (8.0?), AIX 4.3, [Solaris] • Combination of “black box” and “white box” • Use Drivers and Stubs provided by NPACKage contributors • Provide feedback to NPACKage contributors • Goal: Testing Deployment • “Given a Deployment, does it work?” • NPACI Testbed and NPACI Production • Goal: Testing Deployability • Broader impact in community
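As an illustration of the "drivers and stubs" approach, the sketch below replaces a network-facing client with a stub that returns canned values, so the logic that consumes it can be exercised in isolation; the test class plays the role of the driver. All names are hypothetical and are not actual NPACKage or NWS interfaces.

```python
import unittest

# Hypothetical interface for a load-forecast service (loosely modeled on the
# idea of querying NWS); both classes are invented for illustration only.
class ForecastClient:
    def cpu_load(self, host):
        raise NotImplementedError  # a real client would contact the service

class StubForecastClient(ForecastClient):
    """Stub supplied in place of the real client so the consuming logic
    can be unit-tested without a running deployment."""
    def __init__(self, canned):
        self.canned = canned
    def cpu_load(self, host):
        return self.canned[host]

def pick_least_loaded(client, hosts):
    # Component under test: choose the host with the lowest reported load.
    return min(hosts, key=client.cpu_load)

class DriverTest(unittest.TestCase):
    # The test class acts as the "driver": it invokes the component under
    # test with the stub standing in for the real dependency.
    def test_picks_least_loaded_host(self):
        stub = StubForecastClient({"a.uva.edu": 0.9, "b.uva.edu": 0.1})
        self.assertEqual(
            pick_least_loaded(stub, ["a.uva.edu", "b.uva.edu"]),
            "b.uva.edu")

if __name__ == "__main__":
    unittest.main()
```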

  11. Software for the NPACKage • NMI • Globus, Condor-G, NWS, GSI-OpenSSH, KX.509, GridConfig, (MyProxy, MPICH-G2) • DataCutter • Storage Resource Broker – SRB • GridPort • Ganglia • APST • GridSolve • LAPACK For Clusters – LFC

  12. Testing of NMI: SURA • UVa: part of SURA NMI Testbed • Globus • Condor-G • NWS • GSI-OpenSSH • UVa co-PI on MyProxy NMI grant

  13. Testing of DataCutter • Last extracted from CVS on March 11, 2003 • Tests from developers: (a) make sure the daemons start, (b) buffer test, (c) instance test, (d) returns test, (e) crash test, (f) endian test, (g) placement test, (h) layout test, (i) cluster test, (j) directory test, (k) grid array averager • Thanks to Shannon Hastings

  14. Testing of SRB • Just received SRB 2.0.0/2.0.1 early this week • Plan • Exercise unit tests in SRB (awaiting descriptions) • Interactions with other NPACKage components • DataCutter • GridPort • GridFTP • GSI • Many clients, server • Thanks to Reagan Moore, Wayne Schroeder

  15. Testing of GridPort • Last extracted (v 2.2) from CVS on March 14, 2003 • Run test Perl scripts from distribution • We use in our Alpha Project • “Protein Folding on the Grid” (with C. Brooks, M. Crowley of TSRI) • Thanks to Maytal Dahan, Mary Thomas

  16. Testing of Ganglia • Last extracted from CVS on March 11, 2003 • Currently executing on 9 nodes on UVa’s Centurion • Tests • Start a ‘gmond’ and pull its XML tree (“telnet localhost 8649”); verify with “xmllint --verify out.xml” • Planned • The ganglia-python package from SDSC's Rocks group includes an MDS provider that obtains data from ganglia. • Is the information correct? • Thanks to Federico Sacerdoti, Phil Papadopoulos
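A scripted form of the gmond check above might look like the following sketch. It assumes gmond answers on TCP port 8649 and returns its XML dump on connect, and that the tree contains HOST elements; the parse step plays the role of the xmllint well-formedness check.

```python
import socket
import xml.etree.ElementTree as ET

def pull_gmond_tree(host="localhost", port=8649):
    """Connect to a gmond daemon and read the XML it emits
    (the programmatic equivalent of "telnet localhost 8649")."""
    chunks = []
    with socket.create_connection((host, port), timeout=10) as sock:
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks)

if __name__ == "__main__":
    raw = pull_gmond_tree()
    # Parsing fails loudly if the tree is not well-formed XML.
    root = ET.fromstring(raw)
    hosts = root.findall(".//HOST")
    print("gmond reported %d host(s)" % len(hosts))
```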

  17. NPACKage Testbed • Machine contributions from resource sites • Many thanks for the access to the systems for development and testing, especially: • U.Mich • 6 AIX4.3, 1 ia64 RH7.1 • SDSC • Cluster: 4 RH7.3, 2 AIX5.1 • UCSB • 1 SUSE 8.1 • USC/ISI • 1 RH7.2 • Point your favorite LDAP browser at: • giis.npaci.edu:2135
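For scripted access rather than an interactive LDAP browser, a query along these lines could be used. This is a sketch using the python-ldap package; the base DN shown is the conventional Globus MDS 2.x default ("Mds-Vo-name=local, o=grid"), which is an assumption about how this GIIS is configured.

```python
import ldap  # python-ldap

# A scripted version of "point your LDAP browser at giis.npaci.edu:2135".
GIIS_URI = "ldap://giis.npaci.edu:2135"
BASE_DN = "Mds-Vo-name=local, o=grid"  # assumed MDS 2.x default VO name

conn = ldap.initialize(GIIS_URI)
conn.simple_bind_s()  # GIIS queries are typically anonymous
results = conn.search_s(BASE_DN, ldap.SCOPE_SUBTREE, "(objectclass=*)")
for dn, attrs in results[:10]:
    print(dn)  # spot-check the DNs the index server is publishing
conn.unbind_s()
```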

  18. Testing NPACKage Stage-1 • Resource discovery and monitoring infrastructure • Hierarchical discovery and monitoring cache/index (MDS) • Host resource information (MDS) • Cluster resource information (Ganglia) • Binaries • ia64-redhat-7.1 • power3-aix-4.3 (testing...) • power3-aix-5L • x86-redhat-7.2 • x86-redhat-7.3 (tested) • x86-suse-8.1 • Is the information correct?

  19. Automation • Some tools exist • Dart • Tinderbox / TreeFire (Mats) • “Inca Build System” (TeraGrid) • Issue: Determining Cause/Blame • Builder • Deployer • Packager • Resource Provider • Software Developer
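One way automation can help with the cause/blame question is to run the build, deploy, and test stages as separate steps and report the first stage that fails, together with the role responsible for it. The sketch below is purely illustrative; the stage commands and role names are placeholders, not NPACKage, Dart, Tinderbox, or Inca tooling.

```python
import subprocess
import sys

# Hypothetical nightly driver: run each stage in order and attribute a
# failure to the stage (and role) where it first appeared.
STAGES = [
    ("build",  "Builder",  ["make", "all"]),
    ("deploy", "Deployer", ["./deploy.sh"]),
    ("test",   "Tester",   ["python", "-m", "unittest", "discover"]),
]

def run_stages():
    for name, owner, cmd in STAGES:
        proc = subprocess.run(cmd, capture_output=True, text=True)
        if proc.returncode != 0:
            print("FAIL at stage '%s' -- first point of contact: %s"
                  % (name, owner))
            print(proc.stderr[-2000:])  # tail of the log for the report
            return 1
        print("stage '%s' OK" % name)
    return 0

if __name__ == "__main__":
    sys.exit(run_stages())
```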

  20. Bottom Line • NPACI software infrastructure is transitioning to production • Evolving plan for testing and validation: providers’ tests → tests at UVa → tests on NPACI testbed → deployment to NPACI sites • Integrate with and influence NMI testing and TeraGrid testing • Next: APST v2.0, GridSolve, LFC • NPACKage: infrastructure and (soon) integration with Applications (e.g., CHARMM Portal)
