Software Engineering


Presentation Transcript


  1. Software Engineering Week 11 INFM 603

  2. Agenda • Software Development • Software Process Models • Testing • Note: Software Engineering material based on slides from cmsc132 (Dept of CS, UMCP)

  3. Modern Software Development • Why do we want to study the software development process? • To understand • Software development problems • Why software projects fail • Impact of software failures • How to develop better software

  4. Software Engineering • Definition from Wikipedia • Field that creates and maintains software applications by applying technologies and practices from computer science, project management, engineering, application domains, and other fields

  5. Software Development Problems • Software is • Expensive • Cost per line of code growing (unlike hardware) • Frequently late • Schedule overruns • Example: ARIS (Achievement Reporting and Innovation System) is an $80 million data and information system for New York City public schools which was not ready by Sept 2008 (due date). Complete article at: • http://www.nytimes.com/2008/10/24/education/24aris.html

  6. Software Development Problems • Software is • More expensive than projected • Cost overruns • Difficult to use & understand • Missing features • Too slow

  7. Software Projects Fail • Anywhere from 25-50% of custom software fails • Example (FBI Virtual Case File) • Began Jan 2001 • Officially scrapped Jan 2005 • LA Times (Jan 13, 2005) • “A new FBI computer program designed to help agents share information to ward off terrorist attacks may have to be scrapped… Sources said about $100 million would be essentially lost if the FBI were to scrap the software…”

  8. Software Projects Fail • Reasons for failure of FBI Virtual Case File • Poor specification • 800-page requirement document • Repeated changes in specification • New requirements continually added • Poor management • Repeated management turnover • Micromanagement of software developers • FBI personnel with no software experience

  9. Impact of Software Failures Increasing • Software becoming part of basic infrastructure • Software in cars, appliances • Business transactions moving online • Computers becoming increasingly connected • Failures can propagate through internet • Internet worms • Failures can be exploited by others • Viruses • Spyware

  10. Software Contributes to Real Failures • Bugs in software may cause real-world failures • Example – Air Force F-22A Raptor • Stealth fighter costing $300 million each • 1.7 million lines of code for plane’s avionics

  11. Software Contributes to Real Failures • Air Force F-22A Raptor software fails midair • DefenseNews.com (March 5, 2007) • “When a dozen Raptors en route from Hawaii to Japan crossed the International Date Line for the first time, the jets’ Global Positioning System navigation avionics went haywire, forcing the pilots to turn around.” • GPS software unable to handle change in longitude from W 179.99° to E 180° • Raptor pilots visually followed refueling tankers back to Hawaii

  12. Software Contributes to Real Failures • Happy ending for Raptor? • Lockheed-Martin provided software fix in 48 hours • For “operational security reasons” the USAF declined to elaborate, saying only that the F-22A “experienced a software problem involving the navigation system” • Tough being a Raptor test pilot • DefenseNews.com (March 5, 2007) • “When the plane was in developmental stages … pilots flying the Raptor would often have to reboot the onboard computers that controlled the jet’s high-end functions”

  13. Other Famous Software Failures • 1990 AT&T long distance calls fail for 9 hours • Wrong location for C break statement • 1996 Ariane rocket explodes on launch • Overflow converting 64-bit float to 16-bit integer • 1999 Mars Climate Orbiter crashes on Mars • Missing conversion of English units to metric units • Other Failures available at: • http://www.sundoginteractive.com/sunblog/posts/top-ten-most-infamous-software-bugs-of-all-time/ • http://www.net-security.org/secworld.php?id=10354

  14. Why Is Software So Difficult? • Complexity • Software becoming much larger • Millions of lines of code • Hundreds of developers • Many more interacting pieces • Length of use • Software stays in use longer • Features & requirements change • Data sets increase • Can outlast its creators

  15. Software Life Cycle • Coding is only part of software development • Software engineering requires • Preparation before writing code • Follow-up work after coding is complete • Software life cycle • List of essential operations / tasks • Needed for developing good software • No universal agreement on details

  16. Components of Software Life Cycle • Problem specification • Program design • Algorithms and data structures • Coding and debugging • Testing and verification • Documentation and support • Maintenance

  17. Software Development • Coding is small part of software development • Estimated % of time • 35% Specification, design • 20% Coding, debugging • 30% Testing, reviewing, fixing • 15% Documentation, support

  18. Problem Specification • Goal • Create complete, accurate, and unambiguous statement of problem to be solved • Example • Specification of input & output of program • Problems • Description may be inaccurate or change over time • Difficult to specify behavior for all inputs
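
To make the idea of a specification concrete, here is a hypothetical sketch (the averageScore method and its contract are invented for illustration, not taken from the slides): the comment pins down exactly which inputs are legal and what output is promised, before any code is designed.

    public class ScoreUtils {
        /**
         * Returns the arithmetic mean of the given exam scores.
         *
         * Input:  scores -- an array with at least one element,
         *                   each value in the range 0..100
         * Output: the mean of the scores, as a double
         *
         * @throws IllegalArgumentException if scores is null, empty,
         *         or contains a value outside 0..100
         */
        public static double averageScore(int[] scores) {
            if (scores == null || scores.length == 0) {
                throw new IllegalArgumentException("no scores given");
            }
            int sum = 0;
            for (int s : scores) {
                if (s < 0 || s > 100) {
                    throw new IllegalArgumentException("score out of range: " + s);
                }
                sum += s;
            }
            return (double) sum / scores.length;
        }
    }

Even in this tiny case the slide's problems appear: the specification must decide whether an empty array is an error or should return 0, and that decision may change as the customer's needs change.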

  19. Program Design • Goal • Break software into integrated set of components that work together to satisfy the problem specification • Example • Problems • Methods for decomposing problem • How components work together

  20. Algorithms and Data Structures • Goal • Select algorithms and data structures to implement each component • Problems • Functionality • Provides desired abilities • Efficiency • Provides desired performance • Correctness • Provides desired results

  21. Algorithms and Data Structures • Example • Implement list as array or linked list
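
A minimal Java sketch of this choice, using only the standard library (the song strings are placeholder data): both ArrayList and LinkedList implement the same List interface, so code written against List keeps working if the underlying structure is swapped later.

    import java.util.ArrayList;
    import java.util.LinkedList;
    import java.util.List;

    public class ListChoiceDemo {
        public static void main(String[] args) {
            // Backed by a resizable array: fast random access with get(i),
            // but inserting at the front shifts every element.
            List<String> arrayBacked = new ArrayList<>();

            // Backed by a doubly-linked list: cheap insertion and removal
            // at the ends, but get(i) must walk the links.
            List<String> linkBacked = new LinkedList<>();

            arrayBacked.add("song A");
            linkBacked.add(0, "song B");   // cheap for a linked list

            printAll(arrayBacked);
            printAll(linkBacked);
        }

        // Works with either implementation, because it only uses List.
        private static void printAll(List<String> items) {
            for (String item : items) {
                System.out.println(item);
            }
        }
    }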

  22. Coding and Debugging • Goal • Write actual code and ensure code works • Problems • Choosing programming language • Procedural design • Fortran, BASIC, Pascal, C • Object-oriented design • Smalltalk, C++, Java • Using language features • Exceptions, streams, threads
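
A small sketch of the "using language features" point, assuming Java as the chosen language (input.txt is a placeholder file name): exception handling, a buffered input stream, and a thread in a few lines.

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;

    public class FeatureDemo {
        public static void main(String[] args) {
            // Exceptions + streams: try-with-resources closes the reader
            // even if an I/O error occurs while reading.
            try (BufferedReader in = new BufferedReader(new FileReader("input.txt"))) {
                String line;
                while ((line = in.readLine()) != null) {
                    System.out.println(line);
                }
            } catch (IOException e) {
                System.err.println("Could not read input.txt: " + e.getMessage());
            }

            // Threads: run a task concurrently with the main thread.
            Thread worker = new Thread(() -> System.out.println("background task"));
            worker.start();
        }
    }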

  23. Testing and Verification • Goal • Demonstrate software correctly matches the specification • Problem • Program verification • Formal proof of correctness • Difficult / impossible for large programs • Empirical testing • Verify using test cases • Unit tests, integration tests, alpha / beta tests • Used in majority of cases in practice

  24. Documentation and Support • Goal • Provide information needed by users and technical maintenance • Problems • User documentation • Help users understand how to use software • Technical documentation • Help coders understand how to modify, maintain software

  25. Maintenance • Goal • Keep software working over time • Problems • Fix errors • Improve features • Meet changing specification • Add new functionality

  26. Software Process Models • Software methodology • Codified set of practices • Repeatable process for producing quality software • Software process model • Methodology for organizing software life cycle • Major approaches • Waterfall model • Iterative development • Unified model • Agile software development • Extreme programming (XP), a prominent example • Formal methods

  27. Waterfall Model • Approach • Perform steps in order • Begin new step only when previous step is complete • Results of each step flow into the next step

  28. Waterfall Model • Advantages • Simple • Predictable results (emphasizes predictability) • Software follows specifications • Reasonable for small projects • Problems • In real life • May need to return to previous step • Steps may be more integrated • Steps may occur at same time • Unworkable for large projects

  29. Iterative Software Development • Approach • Iteratively add incremental improvements • Take advantage of what was learned from earlier versions of the system • Use working prototypes to refine specifications

  30. Iterative Software Development • Goals • Emphasize adaptability instead of predictability • Respond to changes in customer requirements • Examples • Unified model • Agile software development • Extreme programming (XP)

  31. Unified Model • Development divided into phases (iterations) • Inception • Elaboration • Construction • Transition • During each phase • Multiple iterations of software development • Development treated as mini-waterfalls • Emphasis gradually shifts from specification to testing

  32. Unified Software Life Cycle Model

  33. Agile Software Development • Agile approach • Based on iterative development • Short iterations (timeboxes) lasting 1-4 weeks • Working software as principal measure of progress • Produced at end of each iteration • Adds a more people-centric viewpoint • Face-to-face communication preferred • Co-locate programmers, testers, “customers” • Relies on adapting to feedback rather than planning as the primary control mechanism • Less specification & documentation

  34. Agile Methods

  35. Extreme Programming (XP) • Prominent example of Agile methodology • Iterative, adaptive software development • Describes set of day-to-day practices • Followed by managers & programmers • Intended to encourage a set of values • Appropriate for environments with • Small teams • Rapidly-changing requirements

  36. Extreme Programming Values • Communication • Rapidly building & disseminating institutional knowledge among programming team • Simplicity • Implement simplest code needed by customer without emphasis on future versions • Feedback • From testing, team members, customers • Courage • Willingness to rewrite / refactor software to add or change features

  37. Extreme Programming Practices • Pair programming • Pairs of programmers combine software development efforts at one computer • Especially useful for novice programmers • Test-driven development • Tests are designed first, before writing software • Continuous integration • Tests performed throughout development process • On-site customer • Customer available at all times to answer questions
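
A sketch of the test-driven development practice using JUnit 4 (the Counter class is invented for illustration): the test is written first and fails until the simplest code that satisfies it is added, which also reflects XP's simplicity value.

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    public class CounterTest {
        // Written first: this test cannot pass until Counter exists
        // and behaves as the customer asked.
        @Test
        public void incrementAddsOne() {
            Counter c = new Counter();
            c.increment();
            assertEquals(1, c.getValue());
        }
    }

    // The simplest implementation that makes the test pass --
    // no speculative features for future versions.
    class Counter {
        private int value = 0;
        public void increment() { value++; }
        public int getValue()   { return value; }
    }

In a pair-programming setting, one common arrangement is for one partner to write the failing test and the other to write the code that makes it pass.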

  38. The Rapid Prototyping Model • Goal: explore requirements • Without building the complete product • Start with part of the functionality • That will (hopefully) yield significant insight • Build a prototype • Focus on core functionality, not on efficiency • Use the prototype to refine the requirements • Repeat the process, expanding functionality

  39. Objectives of Rapid Prototyping • Quality • Build systems that satisfy the real requirements by focusing on requirements discovery • Affordability • Minimize development costs by building the right thing the first time • Schedule • Minimize schedule risk by reducing the chance of requirements discovery during coding

  40. Formal Methods • Mathematically-based techniques for • Specification, development, and verification • Software and hardware systems • Intended for high-integrity systems • Safety • Security • Levels • 0 – Informal implementation of formal specifications • 1 – Formal code development & verification • 2 – Theorem prover to ensure correctness
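
As a small illustration of the flavor of formal specification (a generic textbook example, not material from the slides), a Hoare triple {P} S {Q} asserts that if precondition P holds before statement S runs and S terminates, then postcondition Q holds afterwards. For a swap using a temporary variable:

    \[
      \{\, x = a \wedge y = b \,\}\quad
      t := x;\; x := y;\; y := t\quad
      \{\, x = b \wedge y = a \,\}
    \]

Level 1 and level 2 development then amount to proving assertions like this about the actual code, with a theorem prover checking the proof mechanically at level 2.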

  41. Program Testing • Empirical testing • Test software with selected test cases • More scalable than verification • Test failures frequently indicate software errors • Absence of failures doesn’t prove software correct • If code isn’t exercised by any test, hard to have confidence in it • Even if it has been “formally verified”

  42. Kinds of Testing • Automated testing • The software is tested by a completely automatic process • e.g., JUnit or submit-server testing • Can be expensive or difficult to construct, but fairly cheap to repeat • Manual testing • A person uses the software, perhaps guided by a script, and notes bugs • Often easier to conduct than writing test cases, but very expensive to repeat

  43. Types of Testing • Clear box testing • Allowed to examine code • Attempt to improve thoroughness of tests • Black box testing • No knowledge of code • Treat program as “black box” • Test behavior in response to inputs
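
A hedged illustration of the difference, with a hypothetical isLeapYear method (not from the slides): the black-box test is chosen from the stated behavior alone, while the clear-box test is chosen after reading the code so that otherwise-unexercised branches are covered.

    import org.junit.Test;
    import static org.junit.Assert.assertFalse;
    import static org.junit.Assert.assertTrue;

    public class LeapYearTest {

        // Code under test, shown here so the branches are visible.
        static boolean isLeapYear(int year) {
            if (year % 400 == 0) return true;   // branch 1
            if (year % 100 == 0) return false;  // branch 2
            return year % 4 == 0;               // branch 3
        }

        // Black box: inputs picked from the specification
        // ("years divisible by 4 are leap years") without reading the code.
        @Test
        public void ordinaryYears() {
            assertTrue(isLeapYear(2024));
            assertFalse(isLeapYear(2023));
        }

        // Clear box: inputs picked after reading the code, to force
        // branches 1 and 2, which the tests above never reach.
        @Test
        public void centuryYears() {
            assertTrue(isLeapYear(2000));   // hits branch 1
            assertFalse(isLeapYear(1900));  // hits branch 2
        }
    }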

  44. Testing Terminology • Test case • Individual test • Test suite • Collection of test cases • Test harness • Program that executes a series of test cases • Test framework • Software that facilitates writing & running tests • Example: JUnit • Let's see an example using Java (sketch below) • Regression Testing
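
Here is the promised small Java example, using JUnit 4 (the class and method names are placeholders): each @Test method is a test case, the @Suite annotation groups them into a test suite, and the JUnit runner that executes the suite plays the role of the test harness.

    import org.junit.Test;
    import org.junit.runner.RunWith;
    import org.junit.runners.Suite;
    import static org.junit.Assert.assertEquals;
    import static org.junit.Assert.assertTrue;

    // Test suite: a collection of test cases run together by JUnit.
    @RunWith(Suite.class)
    @Suite.SuiteClasses({ AllTests.MathTests.class, AllTests.StringTests.class })
    public class AllTests {

        public static class MathTests {
            // One test case.
            @Test
            public void additionWorks() {
                assertEquals(4, 2 + 2);
            }
        }

        public static class StringTests {
            // Another test case.
            @Test
            public void concatenationWorks() {
                assertTrue("foo".concat("bar").equals("foobar"));
            }
        }
    }

Re-running this same suite after every change to the code is exactly the regression testing discussed on the next slide.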

  45. Why Regression Test? • Running regression tests gives developers much more freedom to change existing code • “I need to rewrite this component to support new functionality – I wonder if anything might be depending on the details of how it works now?” • This freedom is key to agile development, and important even in more structured development methodologies

  46. Bug Tracking • Dilbert Comic - http://dilbert.com/strips/comic/1995-11-13/ • Even with good processes, (alleged) bugs will still turn up in system-level products, both in development and in deployment • Tools for managing, tracking, and gathering statistics on such bugs and vulnerabilities are essential, particularly on large projects • Tools • Bugzilla http://www.bugzilla.org/ • Jira http://www.atlassian.com/software/jira/overview

  47. Bug Counting • How good a metric of software quality is “number of outstanding bugs”? • Are there other reasons you (as a manager) might want to introduce it as a metric? • What would you expect to be the most immediate effect if you introduced it as a metric (and tied programmer appraisal to it)?

  48. Debugging is harder than coding! “Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it” – Brian W. Kernighan and P. J. Plauger, The Elements of Programming Style

  49. Miscellaneous • Generating documentation using javadoc
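
A minimal sketch of how this works (the Rectangle class and the docs output directory are placeholders): Javadoc comments on a public class and its public members are turned into linked HTML pages by the javadoc tool that ships with the JDK.

    /**
     * An immutable rectangle with a width and a height.
     */
    public class Rectangle {
        private final double width;
        private final double height;

        /**
         * Creates a rectangle.
         *
         * @param width  the width, in arbitrary units
         * @param height the height, in the same units
         */
        public Rectangle(double width, double height) {
            this.width = width;
            this.height = height;
        }

        /**
         * Returns the area of this rectangle.
         *
         * @return width times height
         */
        public double area() {
            return width * height;
        }
    }

Generating the HTML (run in the directory containing Rectangle.java):

    javadoc -d docs Rectangle.java

The -d option names the output directory; opening docs/index.html shows the generated pages.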
