
Verification Practices for Code Development Teams

Greg Weirs, Computational Shock and Multiphysics Department, Sandia National Laboratories, 25 May 2010.


Presentation Transcript


  1. Verification Practices for Code Development Teams Greg Weirs Computational Shock and Multiphysics Department Sandia National Laboratories 25 May 2010 Sandia is a multiprogram laboratory operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Company, for the United States Department of Energy’s National Nuclear Security Administration under contract DE-AC04-94AL85000. Approved for unlimited release as SAND 2010-3325 C

  2. My Experience • Computational Fluid Dynamics (CFD) of reacting flows – aerospace engineering • U. of Chicago FLASH code – astrophysics • Sandia ALEGRA code – Solid Dynamics + Magneto-Hydro-Dynamics (MHD). Most examples I will show are from ALEGRA. Your mileage may vary.

  3. Expectation of Quality Expectations of the accuracy of scientific simulations vary. Who are you trying to convince? Code developers, analysts, the customer, the public. The evidence offered ranges, roughly from strongest to weakest: • Uncertainty Quantification • Error bars on simulation results • Result converges with refinement • Mesh refinement • Eyeball norm • Trends are reasonable • Result is plausible • Result is not ridiculous • Code returns a result. And how much would you stake on it? I’d bet X on the result; X = my house, my job, the company, your house, some money.

  4. Context What makes engineering physics modeling and simulation software different? • Our simulations provide approximate solutions to problems for which we do not know the exact solution. This leads to two more questions: • How good are the approximations? • How do you test the software?

  5. What is code verification? Code verification and SQE address the question: how good is your code? Algorithms (FEM, ALE, AMG, etc.) take you from the governing equations (PDEs) to the discrete equations; the implementation (C++, Linux, MPI, etc.) takes you from the discrete equations to numerical solutions. Code verification asks two questions along that chain: How can the algorithm be improved? Has the algorithm been correctly implemented?
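
To make "has the algorithm been correctly implemented?" concrete, here is a minimal, hypothetical sketch in Python (not from ALEGRA): a second-order central difference is applied to a problem with a known exact solution on a sequence of grids, and the observed order of accuracy is compared to the formal order.

    import numpy as np

    def second_derivative(u, h):
        """Second-order central difference of u on a uniform grid with spacing h."""
        d2u = np.empty_like(u)
        d2u[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / h**2
        d2u[0] = d2u[-1] = np.nan          # boundary treatment not verified here
        return d2u

    errors, spacings = [], []
    for n in (32, 64, 128, 256):
        x = np.linspace(0.0, 1.0, n + 1)
        h = x[1] - x[0]
        exact = -np.pi**2 * np.sin(np.pi * x)           # exact d2/dx2 of sin(pi x)
        approx = second_derivative(np.sin(np.pi * x), h)
        err = np.sqrt(np.nanmean((approx - exact)**2))  # discrete L2 norm of the error
        errors.append(err)
        spacings.append(h)

    # Observed order of accuracy between successive grids; expect ~2.0 for this scheme.
    for k in range(1, len(errors)):
        p = np.log(errors[k-1] / errors[k]) / np.log(spacings[k-1] / spacings[k])
        print(f"h = {spacings[k]:.4e}  error = {errors[k]:.3e}  observed order = {p:.2f}")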

  6. What else do I need? Code verification and SQE tell you how good your code is; SA/UQ, validation, and solution verification tell you how good your simulation is. Along the same chain from governing equations (PDEs) to discrete equations to numerical solutions, validation asks: are these equations adequate? Solution verification asks: how large is the numerical error?
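
For "how large is the numerical error?", one common solution verification technique (shown here as a generic sketch with illustrative numbers, not an ALEGRA utility) is Richardson extrapolation: given a quantity of interest computed on a sequence of grids, estimate the discretization error on the finest grid.

    import math

    def richardson_error_estimate(f_coarse, f_fine, refinement_ratio, order):
        """Estimate the discretization error in f_fine from two grid levels,
        assuming the quantity converges at the given order."""
        return (f_fine - f_coarse) / (refinement_ratio**order - 1.0)

    def observed_order(f_coarse, f_medium, f_fine, refinement_ratio):
        """Estimate the observed convergence order from three grid levels
        (constant refinement ratio assumed)."""
        return math.log(abs(f_medium - f_coarse) / abs(f_fine - f_medium)) / math.log(refinement_ratio)

    # Example: a scalar quantity of interest from three simulations (values are illustrative).
    qoi = {"coarse": 0.9720, "medium": 0.9931, "fine": 0.9983}
    p = observed_order(qoi["coarse"], qoi["medium"], qoi["fine"], refinement_ratio=2.0)
    err = richardson_error_estimate(qoi["medium"], qoi["fine"], refinement_ratio=2.0, order=p)
    print(f"observed order ~ {p:.2f}, estimated error in fine-grid QoI ~ {err:.2e}")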

  7. Uncertain Outputs from Uncertain Inputs Inputs are parameters to the governing equations (PDEs), algorithms, and discrete equations; outputs are the metrics of interest extracted from the numerical solutions. Uncertainty Quantification and code verification are nearly orthogonal: UQ propagates input uncertainty through the chain to the outputs, while code verification checks the chain itself.
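
As a minimal illustration of propagating uncertain inputs to uncertain outputs, here is a generic Monte Carlo sketch; the model, parameter names, and distributions are made up for illustration and stand in for an expensive simulation.

    import numpy as np

    rng = np.random.default_rng(seed=0)

    def run_model(density, yield_strength):
        """Stand-in for an expensive simulation: returns a single output metric.
        This algebraic surrogate is purely illustrative."""
        return 0.5 * density**0.8 + 2.0 / yield_strength

    # Uncertain inputs: parameters of the governing/discrete equations.
    n_samples = 1000
    density = rng.normal(loc=2.7, scale=0.05, size=n_samples)         # e.g. g/cm^3
    yield_strength = rng.uniform(low=0.25, high=0.35, size=n_samples) # e.g. GPa

    # Uncertain output: distribution of the metric of interest.
    outputs = run_model(density, yield_strength)
    print(f"mean = {outputs.mean():.3f}, std = {outputs.std():.3f}, "
          f"95% interval = [{np.percentile(outputs, 2.5):.3f}, {np.percentile(outputs, 97.5):.3f}]")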

  8. Informal Definitions An ingredients list for predictive simulation, not a menu.

  9. ALEGRA Test Suite Test categories: • Unit tests • Verification • Performance (Scaling) • Regression • Validation • Prototypes • SA/UQ • SQE • Apps

  10. The Gauntlet “Prototype” simulations contributed by ALEGRA users • Have no exact solution • User-defined success criteria for each case • Might represent a capability we want to maintain • Might agitate our current capabilities The Usual Suspects…
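
One hypothetical way to express a prototype case with its own user-defined success criterion (purely a sketch; the case names, paths, and result keys are not ALEGRA's):

    # Each prototype case supplies its own notion of "success", since there is no exact solution.
    prototype_cases = [
        {
            "name": "shaped_charge_jet",               # hypothetical case name
            "input_deck": "decks/shaped_charge.inp",   # hypothetical path
            "success": lambda results: results["jet_tip_velocity"] > 0.0
                                       and results["final_time_reached"],
        },
        {
            "name": "mhd_z_pinch",
            "input_deck": "decks/z_pinch.inp",
            "success": lambda results: abs(results["peak_current_error"]) < 0.05,
        },
    ]

    def evaluate(case, results):
        """Apply the user-defined criterion to a case's extracted results."""
        return "pass" if case["success"](results) else "fail"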

  11. The Production Line Over time, the ALEGRA code team has developed a software infrastructure to support testing • Ability to run a large number of simulations repeatedly • Ability to compare results with a baseline • Ability to report pass/diff/fail • Run on platforms of interest at regular intervals
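
A minimal sketch of the baseline-comparison piece of such an infrastructure (generic Python with hypothetical file names, metric keys, and tolerances; not the actual ALEGRA tooling):

    import json

    def load_metrics(path):
        """Load a flat {metric_name: value} dictionary from a JSON file."""
        with open(path) as f:
            return json.load(f)

    def compare_to_baseline(results_path, baseline_path, rel_tol=1e-8):
        """Compare current results to a stored baseline and report pass/diff/fail."""
        try:
            results = load_metrics(results_path)
            baseline = load_metrics(baseline_path)
        except (OSError, json.JSONDecodeError):
            return "fail"      # run crashed or produced unreadable output
        for key, expected in baseline.items():
            actual = results.get(key)
            if actual is None:
                return "fail"  # a tracked metric disappeared
            if abs(actual - expected) > rel_tol * max(abs(expected), 1e-30):
                return "diff"  # ran to completion but no longer matches the baseline
        return "pass"

    # Hypothetical usage over a list of tests:
    # for test in ("noh_2d", "sedov_3d"):
    #     print(test, compare_to_baseline(f"{test}/results.json", f"baselines/{test}.json"))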

  12. The Production Line Code verification tests further require: • Ability to handle different types of reference solutions • Ability to compute error norms and convergence rates • Flexibility for computing specialized metrics for specific test problems (Some of these capabilities are helpful for solution verification.)
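
The reference-solution handling and error-norm/convergence-rate utilities such tests need can be as simple as the following sketch (generic Python, assuming uniform grids; the test names and reference file paths are hypothetical):

    import numpy as np

    def reference_solution(test, x, t):
        """Return the reference for a given test: analytic where available,
        otherwise a stored high-resolution or tabulated solution."""
        if test == "linear_advection":
            return np.sin(2.0 * np.pi * (x - t))   # exact analytic solution
        return np.interp(x, *np.loadtxt(f"references/{test}.dat", unpack=True))

    def l2_error(numerical, reference, dx):
        """Discrete L2 norm of the error on a uniform grid."""
        return np.sqrt(dx * np.sum((np.asarray(numerical) - np.asarray(reference)) ** 2))

    def convergence_rate(err_coarse, err_fine, h_coarse, h_fine):
        """Observed rate from errors at two resolutions; compare against the formal order."""
        return np.log(err_coarse / err_fine) / np.log(h_coarse / h_fine)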

  13. Verification Is Not Free Principal Costs: • Infrastructure development • Test development Recurring Costs – A tax on development: • Maintenance of existing tests • Code development becomes a very deliberate process

  14. Verification As A Continuous Process • Setting up a verification problem for the first time takes significant effort – the learning curve is steep and the infrastructure is not yet in place • Rerunning a verification analysis you have maintained takes minimal work • Without regular, automated verification testing, verification results go stale quickly – they no longer reflect the current state of the code

  15. Code Verification Identifies Algorithmic Weaknesses One purpose of code verification is to find bugs. • Code verification often finds bugs that are subtle and otherwise difficult to identify. • The eyeball norm finds most obvious bugs quickly. Perhaps a better use of code verification is to guide code development. • Some bugs are algorithmic and conceptual. • Code verification identifies algorithmic weaknesses. • Large errors are a weakness.

  16. Code Developers Do Code Verification • Code developers best understand the numerical methods they are using • Code developers are best able to use the results of code verification (and other forms of assessment) to improve the algorithms they use • Code verification as an accreditation exercise has no lasting impact on code quality

  17. Verification Testing Must Be a Team Ethic • Discipline is required to keep a “clean” test suite – to keep all tests passing; “stop-the-line” mentality • If only part of the team values verification, that part is always carrying the other developers • Maintaining an automated verification test suite is probably necessary but definitely not sufficient • Developers should be using verification tests interactively

  18. Costs In Context • Code developers retain practices that justify their expense. • Consider version control software (CVS, SVN, git…): Do you remember a time before you used it? Would you consider going back? • Verification should become just as indispensable.

  19. Sustainable Verification Sustainable Verification: The benefits of verification outweigh the costs. • Continuous verification is a necessary practice for code quality • Code verification is the foundation for other forms of assessment • Verification guides development
