
As Simple As Possible, But No Simpler


Presentation Transcript


  1. As Simple As Possible, But No Simpler. Sam Guckenheimer, http://lab.msdn.microsoft.com/vs2005/teamsystem/, samgu@microsoft.com

  2. Simple Project Management: Quality, Functionality, Resources, Time. “The Iron Triangle” (err… tetrahedron)

  3. 21st Century Mantra
  Do more with less! But if your only variables are:
  • Functionality
  • Quality
  • Resources
  • Time
  …then how are you going to do that?

  4. An Older Truth Все счастливые семьи похожи друг на друга, каждая несчастливая семья несчастлива по-своему. Happy families are all alike; every unhappy family is unhappy in its own way. Tolstoy, Anna Karenina

  5. 13 Symptoms of Unhappiness
  • It’s the code, stupid!
  • Actually it’s the requirements!
  • No, the problem is that you neglected the architecture!
  • Architecture, schmarchitecture. I just want a working build.
  • What good is that the way we mix up versions?!
  • Not code versions, but the environments, don’t you get it?
  • Ever heard of security?!
  • Yeah, but you ignored performance, duh!
  • So what if it worked in the lab -- it’s still unmanageable!
  • Oh, and did we mention testing?
  • Since you’re not measuring it, you can’t manage it anyway!
  • With a process like that, what do you expect?
  • It’s our culture – you’ll never change that.

  6. 13 Symptoms of Unhappiness (recurring agenda slide; full list on slide 5)

  7. Some Why-Nots
  • Use managed code
  • Use modern frameworks
  • Use service-oriented architecture
  • Use available tools
  Transparency: responsible costing, visible results.
  Available tools: unit tests, code coverage, static analysis, performance profiling, source control, work item tracking, build automation.

  8. Unit Tests and Code Coverage (screenshot: code under test, with lines not covered during the test run highlighted, alongside the unit test results)
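The screenshot shows VS 2005 tooling; as a language-neutral analogue of the same idea (not the tooling on the slide), here is a minimal sketch in Python. The module, function, and test names are invented for illustration:

```python
# price.py -- hypothetical module under test, plus its unit tests, in one file.
import unittest

def discounted_price(price, quantity):
    """Return the order total, with 10% off for bulk orders of 10 or more."""
    if quantity >= 10:            # the branch a coverage run flags if no test reaches it
        return price * quantity * 0.9
    return price * quantity

class DiscountTests(unittest.TestCase):
    def test_small_order(self):
        self.assertEqual(discounted_price(2.0, 3), 6.0)

    def test_bulk_order(self):    # without this test, the discount branch stays uncovered
        self.assertAlmostEqual(discounted_price(2.0, 10), 18.0)

if __name__ == "__main__":
    unittest.main()
```

Running this under the third-party coverage.py tool (`coverage run price.py` then `coverage report -m`) lists the lines never executed, the same "not covered during the test run" signal the slide's screenshot highlights.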

  9. Code Analysis (screenshot: Code Analysis recommendations surfaced as build warnings, with direct jump to code from the warning) http://blogs.msdn.com/jason_anderson/archive/2004/09/05/225798.aspx
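The slide's tooling is FxCop-style managed-code analysis. As a hedged analogue of how such a rule works (not the VS mechanism), a tiny static check can be written against Python's stdlib ast module; the rule here, flagging bare `except:` clauses, is just an example:

```python
import ast

def find_bare_excepts(source, filename="<string>"):
    """Return (line, message) pairs for 'except:' handlers that catch everything."""
    findings = []
    for node in ast.walk(ast.parse(source, filename)):
        if isinstance(node, ast.ExceptHandler) and node.type is None:
            findings.append((node.lineno, "warning: bare 'except:' hides errors"))
    return findings

sample = """
try:
    risky()
except:
    pass
"""
# Emit build-warning-style output an IDE could jump to, as on the slide.
for line, msg in find_bare_excepts(sample):
    print(f"sample.py({line}): {msg}")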

  10. 13 Symptoms of Unhappiness (recurring agenda slide; full list on slide 5)

  11. Product Definition
  • Personas and scenarios
  • Qualities of service
  • Capture implicit requirements
  • Kano analysis
  • Stack ranking
  Continually challenge your assumptions!

  12. Personas and Scenarios (one scenario spanning four roles: Project Management, Architect, Developer, Test)
  CEO Signs Contract → PM Starts New Portfolio Project → PM Enumerates Requirements in Excel → PM Schedules Work in MS Project → Architect Updates Design → Architect Adds Tasks & Checks In → Dev Writes Code → Dev Writes & Runs Unit Tests → Dev Reviews Work → Dev Runs Code Analysis → Dev Writes Load Tests → Dev Checks In Work → PM Monitors Project Status → Tester Checks Build Status → Tester Runs Load Test → Tester Reports Bug → PM Reviews Project Status → PM Promotes For Deployment → Dev Diagnoses & Fixes → Dev Checks In Work

  13. Qualities of Service
  • Performance: responsiveness, concurrency, efficiency, fault tolerance, scalability
  • Trustworthiness: security, privacy, conformance to standards, interoperability
  • Usability: accessibility, attractiveness, compatibility, discoverability, ease of use, localizability
  • Manageability: availability, reliability, installability and uninstallability, maintainability, monitorability, recoverability, testability, supportability

  14. Kano Analysis. Kano et al., “Attractive Quality and Must-Be Quality,” Hinshitsu (Quality), The Journal of the Japanese Society for Quality Control, XIV:2, pp. 39-48, April 1984
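Kano analysis classifies a feature by pairing a "functional" survey answer (how do you feel if the feature is present?) with a "dysfunctional" one (how do you feel if it is absent?). A minimal sketch of the standard published classification table follows; the scale and lookup are the commonly cited form, not taken from this deck:

```python
# The classic 5-point Kano answer scale, best to worst.
SCALE = ["like", "expect", "neutral", "tolerate", "dislike"]

def kano_category(functional, dysfunctional):
    """Classify one respondent's answer pair per the standard Kano table."""
    f, d = SCALE.index(functional), SCALE.index(dysfunctional)
    if (f, d) in ((0, 0), (4, 4)):
        return "questionable"       # contradictory answers
    if f == 0:                      # likes having the feature...
        return "one-dimensional" if d == 4 else "attractive"
    if f == 4 or d == 0:            # prefers the feature absent
        return "reverse"
    if d == 4:                      # absence actively dissatisfies
        return "must-be"
    return "indifferent"

print(kano_category("like", "dislike"))     # one-dimensional: more is better
print(kano_category("neutral", "dislike"))  # must-be: only absence is noticed
print(kano_category("like", "neutral"))     # attractive: an unexpected delighter
```

Must-be qualities are exactly the "implicit requirements" the previous slide warns about: customers never ask for them, but their absence makes the product a failure.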

  15. Challenging Assumptions (photos: the customer’s actual desktop vs. the customer in the usability lab)

  16. 13 Symptoms of Unhappiness (recurring agenda slide; full list on slide 5)

  17. Architecture
  • Service-Oriented Architecture
  • Infrastructure Architecture
  • Legacy

  18. Service Orientation
  Build systems using autonomous services that adhere to the four tenets of service orientation:
  • Boundaries are explicit
  • Services are autonomous
  • Services share schema and contract, not class
  • Service compatibility is determined based on policy
  http://msdn.microsoft.com/msdnmag/issues/04/01/Indigo/default.aspx
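"Share schema and contract, not class" means the agreed wire format, not an object type, is the integration point. A minimal illustrative sketch of that tenet (the schema, field names, and message are all invented; this is not the WCF/"Indigo" API):

```python
import json

# The explicit contract: a schema both sides agree on, independent of either side's classes.
ORDER_SCHEMA = {"order_id": int, "sku": str, "quantity": int}

def validate_message(raw_bytes, schema=ORDER_SCHEMA):
    """Accept a message only if it satisfies the shared schema (the explicit boundary)."""
    message = json.loads(raw_bytes)
    for field, expected_type in schema.items():
        if not isinstance(message.get(field), expected_type):
            raise ValueError(f"contract violation: {field!r} must be {expected_type.__name__}")
    # An autonomous service now maps this onto its *own* internal types;
    # the caller never sees (or shares) those classes.
    return message

order = validate_message(b'{"order_id": 7, "sku": "A-100", "quantity": 3}')
print(order)
```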

  19. Application Designer (screenshot: service-oriented architecture model, with the Port Details editor)

  20. Infrastructure Architecture
  • Points of failure
  • Points of observation
  • Points of attack
  • Manageability

  21. Logical Infrastructure Designer (screenshot: services assigned to logical infrastructure; architecture validated against operational settings and constraints)

  22. 13 Symptoms of Unhappiness (recurring agenda slide; full list on slide 5)

  23. Build Automation
  • Nightly build: the project heartbeat
  • Pre-check-in tests: validation of code against the current base prior to check-in (continuous integration is a variant; a minimal gate is sketched below)
  • Build verification tests: functional tests (from unit tests) and component integration tests
  • Build reporting: against the backlog, by check-in/changeset
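A pre-check-in gate in its simplest form: sync to the current base, build, run the verification tests, and refuse the check-in on any failure. A hedged sketch; the commands are placeholders (the deck's context is Team Foundation source control, and real systems wire this into the server):

```python
import subprocess
import sys

# Placeholder command lines; substitute your source control, build, and test runners.
STEPS = [
    ["git", "pull", "--ff-only"],                 # validate against the *current* base
    ["python", "-m", "compileall", "-q", "src"],  # stand-in for the build step
    ["python", "-m", "unittest", "discover"],     # build verification tests
]

def gate():
    for cmd in STEPS:
        print("running:", " ".join(cmd))
        if subprocess.run(cmd).returncode != 0:
            sys.exit("check-in rejected: step failed -> " + " ".join(cmd))
    print("all gates passed: check-in allowed")

if __name__ == "__main__":
    gate()
```

Run the same script on a timer instead of per check-in and you have the nightly-build heartbeat; run it on every check-in and you have the continuous-integration variant.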

  24. Build Reporting

  25. 13 Symptoms of Unhappiness (recurring agenda slide; full list on slide 5)

  26. Versions
  Track versions for each of:
  • Source
  • Tests
  • Executables and other runtimes you create
  • XML, HTML, images, docs & databases
  • Environmental/deployment components
  • Bugs
  Report them together & relate them (a minimal record relating them is sketched below).
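Relating versions means every build can answer "which sources, tests, environment, and bugs does this correspond to?". A minimal sketch of such a record; all field names and identifiers are invented for illustration:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class BuildRecord:
    """One build, pinned to the exact versions of everything that produced it."""
    build_id: str
    source_changeset: str          # version of the code
    test_changeset: str            # version of the tests that ran against it
    artifact_versions: List[str]   # executables, docs, schemas, images
    environment_image: str         # lab/deployment environment snapshot
    bug_ids: List[int] = field(default_factory=list)  # bugs found or fixed here

nightly = BuildRecord(
    build_id="2005.04.18.1",
    source_changeset="CS-1042",
    test_changeset="CS-1042",
    artifact_versions=["app.exe@1.0.1042", "schema.xml@12"],
    environment_image="lab-img-2005-04",
    bug_ids=[311, 340],
)
print(nightly.build_id, "->", nightly.source_changeset, nightly.environment_image)
```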

  27. 13 Symptoms of Unhappiness (recurring agenda slide; full list on slide 5)

  28. Environment
  • Production environment
  • Test environment
  • Capturing the environment
  • Tools: Microsoft Virtual PC, Microsoft Virtual Server
  • Maintain lab images

  29. 13 Symptoms of Unhappiness (recurring agenda slide; full list on slide 5)

  30. Security
  • The core problem
  • Threat modeling
  • Code analysis
  • Security testing
  Michael Howard & David LeBlanc, Writing Secure Code, 2003; J.D. Meier et al., Improving Web Application Security, 2003

  31. Security: The Core Problem
  • The odds of securing any single level are effectively 1/∞: the bad guy has to find only one vulnerability, and has infinite time
  • Microsoft as an example: hundreds of different IT environments, 2,500 unique attacks per day, 125,000 incoming virus-infected e-mails per month
  • Need to secure at every level: design, default, deployment
  • Multiple layers of defense are needed (see the arithmetic sketched below)
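The arithmetic behind "multiple layers": if layers fail independently, an attacker must penetrate all of them, so the breach probabilities multiply. A worked sketch; the per-layer numbers are made up purely for illustration:

```python
from math import prod  # Python 3.8+

# Hypothetical odds that an attacker defeats each layer on its own.
layer_breach_odds = {
    "network perimeter":  0.10,
    "host hardening":     0.20,
    "application checks": 0.05,
}

# Independent layers multiply: 0.10 * 0.20 * 0.05 = 0.001.
# (Real layers are rarely fully independent; treat this as an upper-bound intuition.)
total = prod(layer_breach_odds.values())
print(f"odds of full penetration: {total:.4%}")  # 0.1000%
```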

  32. Threat Modeling
  Analyze the design for vulnerabilities: model the data flows, then walk each flow through STRIDE:
  • S - Spoofing identity
  • T - Tampering with data
  • R - Repudiation
  • I - Information disclosure
  • D - Denial of service
  • E - Elevation of privilege
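In practice that walk produces a worksheet: every (data flow, STRIDE category) pair gets reviewed and either mitigated or accepted. A minimal enumeration sketch; the data flows and the mitigation note are hypothetical:

```python
STRIDE = {
    "S": "Spoofing identity",
    "T": "Tampering with data",
    "R": "Repudiation",
    "I": "Information disclosure",
    "D": "Denial of service",
    "E": "Elevation of privilege",
}

# Hypothetical data flows taken from a model of the system.
data_flows = ["browser -> web tier", "web tier -> database", "admin -> config store"]

# Seed the worksheet: every (flow, threat) pair starts unreviewed.
worksheet = {(flow, code): "UNREVIEWED" for flow in data_flows for code in STRIDE}

worksheet[("web tier -> database", "T")] = "mitigated: parameterized queries"
for (flow, code), status in sorted(worksheet.items()):
    print(f"{flow:24} {code} {STRIDE[code]:24} {status}")
```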

  33. 13 Symptoms of Unhappiness (recurring agenda slide; full list on slide 5)

  34. Performance
  • Deployment configuration
  • Model performance as part of product definition
  • Replicate the environment in the lab
  • Test it as part of development
  • Fix it where it hurts
  A three-tiered problem: system, components, code (a code-tier profiling sketch follows).
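"Fix it where it hurts" at the code tier means profiling before optimizing. A minimal sketch using Python's stdlib cProfile; the deliberately slow function is invented for the demo:

```python
import cProfile
import pstats

def slow_concat(n):
    """Deliberately wasteful: builds a big string by repeated concatenation."""
    s = ""
    for i in range(n):
        s += str(i)
    return s

profiler = cProfile.Profile()
profiler.enable()
slow_concat(50_000)
profiler.disable()

# The 'suspect functions' view: most expensive calls first, drillable to code.
pstats.Stats(profiler).sort_stats("cumulative").print_stats(5)
```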

  35. System and Component (screenshot: alerts and warnings on the systems under test; performance measures of the test and of the systems under test)

  36. Code Performance (screenshot: timeline of memory consumption; suspect functions, drillable to code)

  37. 13 Symptoms of Unhappiness (recurring agenda slide; full list on slide 5)

  38. Manageability
  • Operations documented and current for every service or application
  • Service level agreement in place
  • Security scanning in place
  • Proactively monitor and fix
  • Reactive and proactive problem management

  39. 13 Symptoms of Unhappiness (recurring agenda slide; full list on slide 5)

  40. Testing Mission & Approach
  Marick’s framework: different missions and approaches apply in each quadrant.
  http://www.testing.com/cgi-bin/blog/2003/08/21#agile-testing-project-1

  41. “Let the punishment fit the crime!” (Gilbert & Sullivan, The Mikado)
  A good test approach is:
  • Diversified
  • Risk-focused
  • Product-specific
  • Practical
  • Defensible
  Fit the technique and its data to its purpose in the quadrant.
  Kaner, Bach & Pettichord, Lessons Learned in Software Testing, 2002

  42. Testing Mission & Approach (table: representative techniques per quadrant)

  43. Test Coverage
  • Identify the scenario, QoS, or code that the test exercises
  • If they’re newly discovered, capture them
  • If you can’t name them, question the value of the test
  • Measure coverage against these dimensions (a tagging sketch follows)
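One lightweight way to make each test name what it exercises is to tag tests with scenario/QoS identifiers and report coverage per dimension. A hedged sketch; the decorator, class, and requirement IDs are all invented for illustration:

```python
import unittest

TRACE = {}  # test name -> the scenario / QoS identifiers it claims to cover

def covers(*requirement_ids):
    """Tag a test with the scenarios/QoS it exercises."""
    def mark(test_func):
        TRACE[test_func.__name__] = list(requirement_ids)
        return test_func
    return mark

class CheckoutTests(unittest.TestCase):
    @covers("SCEN-12 guest checkout", "QOS-3 responsiveness")
    def test_guest_checkout_completes(self):
        self.assertTrue(True)          # placeholder body

    def test_mystery(self):            # untagged: question the value of this test
        self.assertTrue(True)

covered = sorted({rid for ids in TRACE.values() for rid in ids})
untagged = [n for n in dir(CheckoutTests) if n.startswith("test_") and n not in TRACE]
print("dimensions covered:", covered)
print("untagged tests to review:", untagged)
```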

  44. Test Automation and Its Discontents
  ROI = [Σt(Value of Information) - Σt(Cost to Maintain)] / Σt(Cost to Implement)
  (adjusted for net present value and risk)

  45. Test Automation and Its Discontents
  ROI = [Σt(Value of Information) - Σt(Cost to Maintain)] / Σt(Cost to Implement)
  • Value depends on context
  • Automation is a programming exercise
  • Opportunity cost is high due to resource constraints
  • An options-theory problem: very sensitive to volatility, often incalculable
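A worked version of the slide's formula, with a simple discount rate standing in for "adjusted for net present value" (all cash-flow numbers and the rate are invented; risk adjustment is left out):

```python
def npv(cash_flows, rate):
    """Discount a list of per-period amounts back to present value."""
    return sum(c / (1 + rate) ** t for t, c in enumerate(cash_flows))

rate = 0.10                              # per-period discount rate (assumption)
value_of_information = [0, 40, 40, 40]   # benefit of the test results, per period
cost_to_maintain     = [0, 10, 15, 20]   # upkeep grows as the product churns
cost_to_implement    = [60, 0, 0, 0]     # the up-front programming exercise

roi = (npv(value_of_information, rate) - npv(cost_to_maintain, rate)) \
      / npv(cost_to_implement, rate)
print(f"risk-unadjusted ROI: {roi:.2f}")  # growing maintenance can push this below 1
```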

  46. Testing Web Applications (screenshot: data substitution, performance breakdown, content validation, the HTTP request & response, and a view of the content as rendered)
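The essentials of such a web test, issue an HTTP request, validate the content (not just the status), and drive it with substituted data, can be sketched with the stdlib. The URL, parameter, and expected marker are placeholders, not the VS web test API:

```python
import urllib.request

# Data substitution: drive the same request shape with several parameter values.
TEST_DATA = ["alpha", "beta"]

def run_web_test(url, expected_marker):
    """Issue the HTTP request and validate the response *content*, not just the status."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        body = resp.read().decode("utf-8", errors="replace")
        assert resp.status == 200, f"{url}: HTTP {resp.status}"
        assert expected_marker in body, f"{url}: marker {expected_marker!r} not in content"
    print("pass:", url)

for value in TEST_DATA:
    # Placeholder site and marker; substitute the application under test.
    run_web_test(f"http://example.com/?q={value}", "<html")
```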

  47. 13 Symptoms of Unhappiness (recurring agenda slide; full list on slide 5)

  48. Metrics
  • Consider many dimensions at once; single metrics easily mislead
  • Dimensions: test results, bug rates, code churn, code coverage, requirements coverage
  • Never use metrics for reward or punishment
  • Track flow of value, not completion of tasks
  • Track planned and unplanned work
  Robert Austin, Measuring and Managing Performance in Organizations, 1996

  49. Which Component is Healthiest?
  Contrast two views of the project data (chart callouts: fewest bugs; highest test pass rate).

  50. Which Component is Healthiest?
  The same component also has the lowest code coverage and the highest code churn (chart callouts). Conclusions: the tests are stale, and the highest risk is here.
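The point of slides 49-50 in miniature: a component can look healthiest on one metric (fewest bugs, best pass rate) and worst on others (stale coverage, high churn). A sketch combining the dimensions; every number and component name is invented:

```python
# Invented per-component metrics illustrating the slides' contrast.
components = {
    #            bugs  pass%  coverage%  churn (lines changed)
    "Billing":  (  4,   99,      38,      2200),
    "Catalog":  ( 19,   91,      85,       310),
    "Search":   ( 11,   95,      72,       640),
}

for name, (bugs, pass_rate, coverage, churn) in components.items():
    flags = []
    if coverage < 50 and churn > 1000:
        flags.append("high churn + low coverage: tests likely stale, highest risk")
    if bugs < 5 and pass_rate > 98 and coverage < 50:
        flags.append("'healthy' numbers may just mean little is being exercised")
    print(f"{name:8} bugs={bugs:3} pass={pass_rate}% cov={coverage}% churn={churn}",
          "|", "; ".join(flags) or "no single-metric red flags")
```

"Billing" wins both single-metric views yet trips both combined-view flags, which is exactly why the slide warns against managing by any one metric.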
