How Well Does MM5 Resist Software Defects?
Dongping Xu (1), Daniel Berleant (1,3), Gene Takle (1,4), Zaitao Pan (2)
(1) Iowa State University, (2) St. Louis University, (3) berleant@iastate.edu, (4) gstakle@iastate.edu

Introduction: We investigate the impact of bugs in MM5. In Study #1, different source files were compared to see which are most susceptible to undetected bugs. In Study #2, we compare the effects of bugs on sensitivity analysis to their effects on forecasting. The findings help fill a gap in knowledge about the dependability of MM5, leading to both new understanding and further questions.

Motivating Question: How well can MM5 run despite software defects?

Study #1: Effects of Bugs on Forecasts

Objective
• In which source files are undetected bugs most dangerous?
• Which files are most important to debug carefully?

Method
Use mutation analysis to statistically understand the effects of bugs on MM5. We tested 12 common bug types, 13 source files, and 10,893 one-bug mutations.

Procedure for applying bugs to source code: given one bug in one file, classify the resulting mutation into one of three categories:
• runs successfully, results affected (Rc)
• runs successfully, results not affected
• fails to complete (Fc)

Calculating the dependability of source file c:
• Rc = number of mutations in the "results affected" category
• Fc = number of mutations in the "fail to complete" category
• Dependability metric: Dc = Fc / (Rc + Fc) (a worked sketch follows this study's discussion)

A low Dc for a file suggests a need for extra care in testing and debugging that file.

Evaluation: We picked a representative subset of the final outputs for analysis; a change in any of these counts as a change in simulation results. (Poster table: raw data for the effects of several thousand bugs on the different source code files.)

Results
(Poster chart: values of Dc for a number of important files in MM5.) Of the files tested, bugs in exmoiss.F, hadv.F, init.f, mrfpbl.F, param.f, and vadv.F are more likely than bugs in the others to have insidious, rather than obvious, effects.

Discussion and Future Work
• The amount of change that a bug causes (not just whether it changed anything) should also be analyzed.
• Could the differences observed across source code files be due in part to differences in the bugs applied rather than to the files themselves?
• How well do the effects of the tested bugs reflect the effects of real bugs introduced by programmers?
• A limitation of the study is its reliance on a single weather forecasting scenario. How general are the results across other scenarios?
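As a concrete illustration of Study #1's dependability metric, here is a minimal sketch in Python; the per-file mutation counts below are hypothetical placeholders, not the study's measured data:

    # Minimal sketch: Dc = Fc / (Rc + Fc) per source file.
    # All counts are hypothetical, not the study's data.

    def dependability(rc: int, fc: int) -> float:
        """Fraction of result-changing mutations that fail loudly (crash)
        rather than silently altering the simulation results."""
        if rc + fc == 0:
            raise ValueError("no result-affecting mutations for this file")
        return fc / (rc + fc)

    # (Rc: results affected, Fc: failed to complete) per file, hypothetical
    mutation_counts = {
        "hadv.F": (120, 45),
        "vadv.F": (130, 50),
        "exmoiss.F": (90, 25),
    }

    for src_file, (rc, fc) in sorted(mutation_counts.items()):
        print(f"{src_file}: Dc = {dependability(rc, fc):.2f}")
    # A low Dc flags a file whose bugs tend to corrupt results without
    # crashing, i.e. a file that needs extra care in testing and debugging.

The metric deliberately ignores mutations whose results were unaffected: it measures, among the bugs that matter, what fraction announce themselves by failing to complete.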
Study #2: Comparing Effects on Forecasts to Effects on Sensitivity Analyses

Motivation
Sensitivity analysis: how much perturbing the initial conditions perturbs the outputs.
• Sensitivity analyses answer questions like "if we change CO2 output by ∆, how much will that affect global warming?"
• Ensemble forecasting provides understanding by perturbing initial conditions.
(Graphic from: www.stalbans.gov.uk/living/energy/global-warming.gif)

Objective
Answer the question, "Does sensitivity analysis resist bugs better than point prediction?"
• Sensitivity of the original, unmutated software: S = ∆O/∆I (O is output, I is input).
• Sensitivity of the software as modified by mutation m: S' = ∆Om/∆I.

Magnitude of mutation m's effect (a worked sketch follows this study's discussion):
• on forecasting: Fm = |Om − O| / O
• on sensitivity: Sm = |S' − S| / S

Relation:
• Fm > Sm suggests MM5 resists the deleterious effects of bugs on sensitivity analysis better than it resists their effects on forecasting.
• Fm < Sm suggests the opposite.

Method
• The set of 24-hour forecasts produced by mutated variants of MM5 for a region of the U.S. Midwest, with a time step of 4 minutes, forms a typical usage scenario.
• 10,893 mutations × 8 output parameters = 87,144 data points.
• Perturbation of the input conditions: the prognostic 3-D variables (UA, UB, VA, VB, TA, TB, QVA, QVB) in file init.f were changed by 0.0001%.

Results
• Most of the 87,144 data points were unaffected by their mutation.
• Sensitivity analysis was more affected than forecasting (i.e., Fm < Sm) for 12,512 data points.
• Sensitivity analysis was less affected (i.e., Fm > Sm) for 323 data points.
Hence, MM5 resists bugs better for forecasting than for sensitivity analyses.

Discussion and Future Work
• What are the wider implications?
• Would different input perturbations lead to significantly different results?
• Would different input scenarios lead to significantly different results?
• Would the results generalize to climate change forecasting vs. prediction of the effects of changes in CO2 production?
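To make the Fm/Sm comparison concrete, here is a minimal sketch in Python for a single scalar output parameter. The numeric values and the finite-difference form of the sensitivities are illustrative assumptions, not the study's actual data or code:

    # Minimal sketch of Study #2's per-data-point comparison.
    # Hypothetical numbers throughout; each pair stands in for one MM5
    # output parameter from a base run and a perturbed run.

    delta_I = 1e-6                  # relative input perturbation (0.0001%)

    O, O_pert = 285.0, 285.3        # original software: base and perturbed outputs
    Om, Om_pert = 284.9, 285.6      # mutated software: base and perturbed outputs

    S = (O_pert - O) / delta_I          # sensitivity of the original software
    S_prime = (Om_pert - Om) / delta_I  # sensitivity of the mutated software

    Fm = abs(Om - O) / abs(O)           # magnitude of effect on forecasting
    Sm = abs(S_prime - S) / abs(S)      # magnitude of effect on sensitivity

    if Fm > Sm:
        print("forecast more affected than sensitivity analysis")
    elif Fm < Sm:
        print("sensitivity analysis more affected than forecast")
    else:
        print("data point equally (or not at all) affected")

Repeating this comparison over all 10,893 mutations and 8 output parameters yields the 87,144 data points tallied in the Results above.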
