
Presentation Transcript


  1. SESSION CODE: ARC201 • A Modular Approach to Development Process • Juval Lowy, IDesign • www.idesign.net • ©2010 IDesign Inc. All rights reserved

  2. About Juval Löwy • Software architect • Consults and trains on .NET architecture • Microsoft's Regional Director for Silicon Valley • Recent book • Programming WCF Services (2010, O'Reilly) • Participates in the .NET/WCF design reviews • Publishes in MSDN and other magazines • Speaker at major international development conferences • Recognized as a Software Legend by Microsoft • Contact at www.idesign.net

  3. Objectives • Describe only the way key process areas are affected by componentizing your system • Today we call these things "services" • Yesterday it was "objects" • Each area has much more to it • Suitable for small teams (<7) • Scalable? • Everything described is practiced in real life • Metrics and charts are normalized project data

  4. What you want to build:

  5. What is a Module? • Ideally a single WCF service class • Can treat a few interacting services as a single logical service • Each module is perceived as a world in its own right

  6. Project Planning • Staffing • Product life cycle • Service life cycle • Services integration plan

  7. Staffing • Is this a good design?

  8. Staffing • Is this a good design?

  9. Staffing • Is this a good design?

  10. Staffing • Is this a good design?

  11. Staffing • Balance number of services with development effort • [Chart: Cost or Effort vs. Number of services; Cost / Service falls, Cost to Integrate rises, and their sum bottoms out at the Minimum Cost]
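The trade-off on this slide can be sketched numerically. A minimal model, assuming an invented development cost that shrinks as the system is split into more services and an integration cost that grows with the number of services (neither cost function nor any figure here comes from the talk):

```python
# Illustrative only: both cost functions are assumptions chosen to make
# the slide's shape visible, not IDesign data.

def total_cost(n, system_effort=1000.0, integration_factor=2.0):
    dev_cost = system_effort / n ** 0.5         # falls as services get smaller
    integration_cost = integration_factor * n ** 2  # pairwise interactions grow fast
    return dev_cost + integration_cost

# The "Minimum Cost" point: the service count where the sum is lowest
best_n = min(range(1, 50), key=total_cost)
print(best_n)
```

With these made-up parameters the combined curve bottoms out at a single-digit service count, which is the shape the slide argues for: too few services and development cost dominates, too many and integration cost dominates.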

  12. Staffing • Not having a skilled architect is the #1 risk • Rather than the technology itself! • Requirements analysis and architecture are contemplative, time-consuming activities • More firepower does not expedite them • A single architect usually suffices • In large projects, have a senior architect and a junior/apprentice • Assign each service to an individual developer (1:1) • Assembly boundary is the team boundary

  13. Staffing • Interaction between team members is isomorphic to interaction between services

  14. Staffing • A good design (minimized interactions, loose coupling, encapsulation) minimizes communication overhead

  15. Staffing Distribution • Get an architect • Architect breaks product into services • [Chart: staff level (1 to 5) vs. calendar months (Jan to Nov) for Marketing/Product management, Architecture, Management, CM and Sys. Testing, and Construction]

  16. Product Life Cycle - Staged Delivery • Preparation: Requirements & architecture • Stage 1: Interface-compatible emulator • Stage 2: Basic capabilities • Stage 3: Setup, logging, configuration editor • Stage 4: UI elements • Release

  17. Product Life Cycle - Staged Delivery • Preparation: Requirements & architecture • Stage 0: Software infrastructure • Stage 1: TA FileNet viewing • Stage 2: eFiling I (Release 1.0) • Stage 3: eFiling II (Release 1.5) • Stage 4: Tax calculations • Stage 5: eFiling III (Release 2.0) • Stage 6: Batch headers • Stage 7: Surrogate data entry (Release 2.5) • Release

  18. Services Integration Plan • Derived from services dependency graph • Start bottom-up • Avoids “big-bang” syndrome • Risk reduction oriented • Incompatibility discovered early • Daily builds and smoke tests to test evolving system • Regression if needed • Incrementally build a system • Provide a tested working system at each increment
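The bottom-up ordering derived from the services dependency graph is a topological sort: a service is integrated only after everything it depends on is already in the build. A sketch with an invented dependency graph (the service names are stand-ins, not from the talk):

```python
from graphlib import TopologicalSorter

# Each service maps to the set of services it depends on (hypothetical).
deps = {
    "ClientApp":  {"Manager"},
    "Manager":    {"DataAccess", "Security"},
    "Security":   {"DataAccess"},
    "DataAccess": set(),
}

# static_order() yields dependencies before dependents: the bottom-up
# integration order the slide calls for.
order = list(TopologicalSorter(deps).static_order())
print(order)  # DataAccess first, ClientApp last
```

Integrating in this order avoids the "big-bang" syndrome: each increment adds one service on top of an already-tested base, so an incompatibility is discovered at the step that introduced it.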

  19. Services Integration Plan • [Diagram: dated integration schedule; services such as HW, DA, Config, Command, Queue, Setup Handler, Emulator, System, and Client App integrated through stages S1 to S5 with milestone dates from 01/12 to 05/05]

  20. Services Integration Plan • [Network diagram: critical path, alternate path, and milestones over the numbered activities] • Activities: 2 Requirements, 3 Infrastructure, 4 Interface Data Access, 5 Interface Manager, 6 Security Data Access, 7 Security, 8 Admin Client, 9 Utilities, 10 TA FileNet, 11 eFiling Client, 12 eFiling Web Service, 13 Form Manager, 14 Form Data Access, 15 Form Engine, 16 eFiling Integration, 17 System Testing, 18 Release 1.0

  21. Services Integration Plan • [Network diagram: critical path, alternate path, and milestones over the numbered activities] • Activities: 3 Infrastructure, 20 Tax Authority Data Access, 21 Tax Authority Data Conversion, 22 Tax Authority Manager, 23 Tax Authority Integration, 24 Accounts Data Access, 25 Accounts Data Conversion, 26 Accounts Manager, 27 Accounts Integration, 28 System Testing, 29 Release 1.5

  22. Service Life Cycle • [Diagram: per-service life cycle; SRS, SRS review, detailed design (standard documentation), STP, design review, construction, test client construction, code review, integration testing]

  23. Service Testing • EVERY service has its own testing environment • Visible signs of progress to management • Spice up "boring" testing • Test all method calls, callbacks, and errors (white box) • Fall back on them to isolate problems • Assumption: no need to test the test SW • System-level test SW is provided to the customer as well
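A per-service test client in the spirit of this slide might look as follows. This is a sketch only: the `OrderService` class, its callback, and the error rule are all invented stand-ins for a real WCF service, but the white-box pattern of exercising every method call, callback, and error path is the slide's point:

```python
# Invented stand-in for a service with one method, one callback, one error path.
class OrderService:
    def __init__(self, on_done):
        self.on_done = on_done            # callback into the test client

    def place(self, qty):
        if qty <= 0:
            raise ValueError("quantity must be positive")
        self.on_done(f"order x{qty} placed")

def run_test_client():
    events = []
    svc = OrderService(events.append)
    svc.place(3)                          # normal method call
    assert events == ["order x3 placed"]  # callback fired with the right payload
    try:
        svc.place(0)                      # error path, exercised deliberately
    except ValueError:
        events.append("error handled")
    return events

print(run_test_client())
```

Because the test client drives the service directly, it doubles as the fallback tool the slide mentions: when integration testing fails, rerunning the per-service client isolates whether the fault is in the service or in the interaction.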

  24. Service Testing

  25. Estimation and Tracking • Service-based effort estimation • Service-based earned value tracking

  26. There is No Silver Bullet • SOA/WCF projects do not take less time than plain .NET projects • Marginal overall improvement in time to market • Applications are more complex

  27. Service-Based Effort Estimation • Use estimation tools • Team members participate in estimation • Itemize lifecycle of all services • Do not omit: • Learning curves • Test clients • Installation • Integration points • Peer reviews • Documentation
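The itemization above can be kept as a simple table per service. All activity names and day counts below are invented for illustration; the point is that the easily omitted items (learning curve, test clients, installation, integration points, reviews, documentation) each get an explicit line:

```python
# Work days per activity, per service (all figures illustrative).
ACTIVITIES = ("learning curve", "detailed design", "construction",
              "test client", "code review", "documentation",
              "installation", "integration")

ESTIMATES = {
    "Security":   (3, 5, 15, 4, 2, 3, 1, 3),
    "DataAccess": (2, 4, 10, 3, 2, 2, 1, 2),
}

# Roll up per service, then across the project.
per_service = {name: sum(days) for name, days in ESTIMATES.items()}
project_total = sum(per_service.values())
print(per_service, project_total)  # {'Security': 36, 'DataAccess': 26} 62
```

Note how the non-construction items make up more than half of each service's total here: omitting them is exactly how the underestimation the next slide warns about creeps in.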

  28. Service-Based Effort Estimation • Both underestimation and overestimation are deadly • Padding allows for increased complexity • Gold plating • Inverted pyramid • Too aggressive a schedule guarantees failure • Cutting corners and abandoning best practices • Nominal estimation maximizes the probability of success

  29. Service-Based Effort Estimation

  30. Earned Value Planning • Assign a value to each work item toward the completion of its service • Compare earned value (the sum of all accomplished activities across services) against effort spent • Can predict completion date and costs
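The bookkeeping this slide describes can be sketched as follows. Every (service, activity) work item carries a planned value; completed items contribute that value, summed across all services, and the total is compared against effort actually spent. The item names, values, and effort figure are invented:

```python
# Planned value per (service, activity) work item (illustrative figures).
PLANNED = {
    ("Security",   "design"):       5,
    ("Security",   "construction"): 15,
    ("DataAccess", "design"):       4,
    ("DataAccess", "construction"): 10,
}

# Work items accomplished so far, across services.
COMPLETED = {("Security", "design"),
             ("DataAccess", "design"),
             ("DataAccess", "construction")}

earned = sum(v for item, v in PLANNED.items() if item in COMPLETED)
total = sum(PLANNED.values())
effort_spent = 21  # actual work days charged so far (invented)
print(f"earned {earned}/{total} planned units, {effort_spent} days spent")
```

Comparing `earned` against `effort_spent` over time is what lets the later slides extrapolate completion date and cost.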

  31. Earned Value - Example

  32. Earned Value - Planning • Build a Gantt chart to derive actual dates • And convert work days to calendar days • Given each activity's scheduled date and earned value, plot planned progress • [Chart: planned progress (%) vs. date, reaching 100% at the planned completion date]

  33. Earned Value - Planning • Can detect unrealistically optimistic plans • [Chart: planned progress (%) vs. date]

  34. Earned Value - Planning

  35. Earned Value - Planning • Can detect unrealistically pessimistic plans • [Chart: planned progress (%) vs. date]

  36. Earned Value - Planning • Slope of the plan curve is the planned team throughput • Fixed team and resources should yield a straight line • [Chart: planned progress (%) vs. date]

  37. Earned Value - Planning • Properly staffed and planned projects yield a shallow S curve • [Chart: progress (%) vs. date, annotated: core team only, planning done, dev staffing, construction done, end of system testing]

  38. Earned Value - Planning • Plan is both for progress and effort • [Chart: planned progress and effort (%) vs. date]

  39. Earned Value - Example • When the requirements, detailed design, and test plan are completed, the service is 45% done
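This milestone rule amounts to a weight table over the service life cycle. The exact percentages below are assumptions, chosen only so that requirements + detailed design + test plan come to the 45% the example states:

```python
# Assumed share of a service's earned value per life-cycle activity.
WEIGHTS = {"requirements": 15, "detailed design": 20, "test plan": 10,
           "construction": 30, "code review": 10, "integration testing": 15}

def percent_done(completed_activities):
    """Earned value of one service, given its completed activities."""
    return sum(WEIGHTS[a] for a in completed_activities)

print(percent_done(["requirements", "detailed design", "test plan"]))  # 45
```

Because the weights are fixed up front, "percent done" is an objective sum over finished activities rather than a developer's gut estimate, which is what makes it usable for tracking.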

  40. Earned Value - Example • Finding accumulated earned value:

  41. Earned Value - Example • Can also track effort spent on activities • Time spent is unrelated to progress

  42. Earned Value - Example

  43. Earned Value - Example

  44. Earned Value - Example

  45. Earned Value Projections • Can extrapolate the actual progress line • Project the actual completion date early on • Can extrapolate the actual effort line • Project the actual cost early on • Take corrective actions while they can still have an effect
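A minimal sketch of the projection: take the observed earned-value line, compute its slope (the team's actual throughput), and extrapolate to 100%. The weekly figures are invented:

```python
# Observed cumulative earned value per week (illustrative data).
weeks    = [1, 2, 3, 4, 5, 6]
progress = [4, 9, 13, 18, 22, 27]   # percent complete

# Actual throughput over the observed window, in percent per week.
slope = (progress[-1] - progress[0]) / (weeks[-1] - weeks[0])

# Extrapolate the line to 100% to get a projected completion week.
projected_week = weeks[-1] + (100 - progress[-1]) / slope
print(f"throughput {slope:.1f}%/week, projected finish around week {projected_week:.0f}")
```

Running the same extrapolation on the effort line yields the projected cost, and the gap between projected and planned dates is the schedule overrun shown on the following slides.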

  46. [Chart: plan; progress and effort (%) vs. date, reaching 100% at the planned completion date]

  47. [Chart: plan; progress and effort (%) vs. date, reaching 100% at the planned completion date]

  48. [Chart: plan plus projection; projected completion date and projected schedule overrun beyond the planned completion date]

  49. [Chart: plan plus projection; projected completion date and projected schedule overrun]

  50. [Chart: plan plus projections; projected cost and projected cost overrun, alongside the projected schedule overrun]
