Software Cost Estimation Basis of Estimate <Project Name>

Presentation Transcript


  1. Software Cost Estimation Basis of Estimate <Project Name> <Date> <Author>

  2. How to Use This Template • This is the template for documenting a Software Basis Of Estimate (BOE) for the Flight Software Systems Branch (LSS) at NASA GRC. • In the menu, choose View -> Slide Master. Select the first slide and change the name and date in the footer to appropriate values. Click on the red X above Close Master View to return to presentation. • Guidance on the type of information that belongs on each slide can be found in the NOTES section on each page. • For very early estimates, you likely will not know the information for a number of the slides. You may want to just make a note on the slide that the information is not known at this point instead of deleting the slide. These slides can serve as a reminder of what information you should include when you update the estimate later on. • Any italicized text on a slide is either an example or further notes and should be replaced with project-specific information or removed. • REMOVE THIS SLIDE FROM YOUR COMPLETED BOE. Contact: Lisa VanderAar lvanderaar@nasa.gov 216-433-6514

  3. OUTLINE • Basis of Software Estimate • Scope • Assumptions and Constraints • Software Block Diagram • CSCIs • Software Work Breakdown Structure • Software Size Estimate • Results of Software Estimate • Schedule and Effort Estimate • Other Direct Costs • Software Estimate Drivers • Risks • Cost Drivers • Customer Negotiated Changes

  4. BASIS OF SOFTWARE ESTIMATE

  5. Scope • This estimate includes the software for: • <ex. flight experiment control> • <ex. navigation> • <ex. ground command and control> • <ex. DAC simulator> • This estimate does NOT include: • <ex. software assurance> • <ex. communications software> • <ex. CAT server software> • <ex. other software or functionality outside the scope of this estimate> • The following lifecycle phases are included in the estimate: • <ex. support of system requirements development, software planning, software requirements, design, coding, software testing, support of system testing> • <ex. formulation through delivery, maintenance, operations support, environmental testing support…>

  6. Assumptions and Constraints • High Level Assumptions: • <assumption 1> • <assumption 2> • <assumption 3> • <assumption 4> • Constraints: • <constraint 1> • <constraint 2> • <constraint 3> • <constraint 4>

  7. Software Block Diagram [EXAMPLE DIAGRAM: block diagram of the example system. Labels include Flight Computer, Simulator Computer, LabView Simulator, Data Handling, Guidance and Navigation, sensor inputs, effector inputs, sensor data, control data, H&S data, guidance, navigation, and control data, and communications data to/from the radio. KEY: CSCI, Computer HW, Data Flow.]

  8. Computer Software Configuration Items <CSCI x> • Classification: NASA Software Class <x>, <Not> Safety Critical • Description: <Description of functionality> • Safety Critical Functionality: <If applicable, description of safety critical functionality> • Assumptions/Constraints: <CSCI-specific assumptions/constraints> • Interfaces: <Interfaces> <Example: XYZ Ground Software CSCI> • Classification: NASA Software Class C, Not Safety Critical • Functionality: Ground command and display software for the XYZ experiment. • Safety Critical Functionality: N/A • Assumptions/Constraints: • Operators will be located in the GRC Telescience Center. • Windows 7 PCs will be used. • Displays will not be reconfigurable. • Interfaces: MSFC HOSC, Data Storage CSCI

  9. Software Work Breakdown Structure (1 of 5)
  1.0 Software Management
    1.1 Management / Admin
      1.1.1 Sys Risk and Security Assmt
      1.1.2 Software Mgmt/Dev Plan
      1.1.3 Schedule Est and Mgmt
      1.1.4 Cost Est and Management
      1.1.5 Effort Est and Management
      1.1.6 Procurements
      1.1.7 Waivers
      1.1.8 Status Reporting
      1.1.9 Metrics
    1.2 SW Configuration Management
      1.2.1 Software CM Plan
      1.2.2 Configuration Items
      1.2.3 SW CM
        1.2.3.1 Code Management
        1.2.3.2 Build Management
      1.2.4 Change Request Process
      1.2.5 Problem Reports
    1.3 SW Quality Assurance
      1.3.1 SW QA Plan
      1.3.2 SW Safety Plan
      1.3.3 Functional Configuration Audit
      1.3.4 Physical Configuration Audit
      1.3.5 Process Audits
      1.3.6 Code Inspections
      1.3.7 Doc Reviews
      1.3.8 Verification Testing Witness
    1.4 Software COTR
      1.4.1 Write RFP
      1.4.2 Negotiate with Vendors
      1.4.3 Assess Proposals
      1.4.4 Select Vendor
      1.4.5 Inspection of Delivered Products
      1.4.6 Reviews
      1.4.7 Status Meetings
      1.4.8 Metrics
      1.4.9 Waivers
      1.4.10 Task Evaluations
    1.5 Technical Lead
      1.5.1 Write Task Agreement
      1.5.2 Peer Reviews / Inspections
        1.5.2.1 Code Inspections
        1.5.2.2 Document Reviews
      1.5.3 Metrics Collection and Analysis
      1.5.4 Status Meetings
      1.5.5 Process Waivers

  10. Software Work Breakdown Structure (2 of 5)
  2.0 SW Systems Engineering
    2.1 Software Requirements
      2.1.1 Allocate system requirements to CSCIs
      2.1.2 Requirements traceability
      2.1.3 Review and update Operations Concept
    2.2 Interface Requirements
      2.2.1 Determine and doc external interfaces
      2.2.2 Determine and doc internal interfaces
      2.2.3 Document software interface requirements
    2.3 Lessons Learned
      2.3.1 Collect as project progresses
      2.3.2 Search for LL during each phase
      2.3.3 Recommend best practices
    2.4 Software Integration
      2.4.1 Build process
      2.4.2 Perform builds
      2.4.3 Create build documents
      2.4.4 Load builds onto hardware
      2.4.5 Version Description Document

  11. Software Work Breakdown Structure (3 of 5)
  3.0 Software Development
    3.1 Architecture
      3.1.1 CSCI 1 Architecture
      3.1.2 CSCI 2 Architecture
      3.1.3 PDR Presentation
    3.2 Detailed Design
      3.2.1 CSCI 1 Design and Document
      3.2.2 CSCI 2 Design and Document
      3.2.3 CDR Presentation
    3.3 Code
      3.3.1 CSCI 1 Implementation
      3.3.2 CSCI 2 Implementation
      3.3.3 Simulator / Emulator Code
      3.3.4 Test Code
      3.3.5 Code Inspections
      3.3.6 Coding Standards
    3.4 Test
      3.4.1 CSCI 1 Unit Testing
      3.4.2 CSCI 2 Unit Testing
      3.4.3 Combined CSCI Testing
      3.4.4 Simulator Acceptance Testing

  12. Software Work Breakdown Structure (4 of 5)
  4.0 System Testing
    4.1 Planning
      4.1.1 Software V&V Plan
      4.1.2 Software Verification Matrix
      4.1.3 Problem Reports and Redlining Process
      4.1.4 Review Process
    4.2 Procedures and Testing
      4.2.1 Write Test Procedures
      4.2.2 Plan Testing Effort
      4.2.3 Run Tests
        4.2.3.1 Document Test Results
        4.2.3.2 Document Redlines
        4.2.3.3 Document Problems
      4.2.4 Update Procedures
      4.2.5 Regression Testing
        4.2.5.1 Regression Testing Matrix
        4.2.5.2 Perform Regression Testing
    4.3 Reports
      4.3.1 Write Test Report(s)
      4.3.2 Verification Closure Notices
      4.3.3 Update Verification Matrix w Results
      4.3.4 Status Project and Line Management
    4.4 Mission Simulations (Validation)
      4.4.1 Schedule Simulation
      4.4.2 Plan Personnel
      4.4.3 Write Procedures
      4.4.4 Run Simulation
      4.4.5 Write Report and Product Acceptance Document

  13. Software Work Breakdown Structure (5 of 5)
  5.0 Training
    5.1 Product Training
      5.1.1 Determine Training Needs
      5.1.2 Find or Create Training
      5.1.3 Keep Training Records
    5.2 Process Training
      5.2.1 Determine Processes and Standards to Follow
      5.2.2 Find or Create Training
      5.2.3 Keep Training Records
    5.3 System Training
      5.3.1 Develop SW User’s Manual
      5.3.2 Develop Training of System
      5.3.3 Train the Operations Team
    5.4 Training for Delivered Products
      5.4.1 Training for Delivered Product 1
      5.4.2 Training for Delivered Product 2

  14. Software Size Estimate <CSCI x> • Language: <programming language used> • <Explanation of sizing method(s) used> <Example: File Manager CSCI> • Language: C • Reuse CFS File Manager application with ~5% code modification. • Reusing code that was written to be reused. No re-engineering required. • The code will need to be re-tested. Automated unit test script exists but will need to be updated for project-specific changes. <Example: XYZ Ground Software CSCI> • Language: Java • Similar in size and complexity to ABC ground software. • ABC displays were reconfigurable, but XYZ displays are not required to be. • Re-configurability estimated to be ~10% of the total SLOC.

  15. Software Size Estimate *Effective SLOC = New SLOC + (X%) Modified SLOC + (Y%) Reused SLOC
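
To illustrate how the effective-SLOC formula above is applied, here is a minimal sketch in Python. The CSCI names, SLOC counts, and the X/Y weighting factors are hypothetical placeholders, not values from this template; substitute the weights used by your cost model.

```python
# Illustrative only: the SLOC counts and the X/Y weighting factors below are
# placeholders, not project data.

MOD_FACTOR = 0.50    # X% weight applied to modified SLOC (assumed)
REUSE_FACTOR = 0.20  # Y% weight applied to reused SLOC (assumed)

def effective_sloc(new, modified, reused,
                   mod_factor=MOD_FACTOR, reuse_factor=REUSE_FACTOR):
    """Effective SLOC = New + (X%) Modified + (Y%) Reused."""
    return new + mod_factor * modified + reuse_factor * reused

# Hypothetical per-CSCI size inputs: (new, modified, reused) SLOC
cscis = {
    "File Manager CSCI":        (0,      500, 9_500),
    "XYZ Ground Software CSCI": (18_000,   0,     0),
}

total = 0.0
for name, (new, mod, reused) in cscis.items():
    esloc = effective_sloc(new, mod, reused)
    total += esloc
    print(f"{name:26s} effective SLOC: {esloc:9,.0f}")
print(f"{'TOTAL':26s} effective SLOC: {total:9,.0f}")
```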

  16. SOFTWARE ESTIMATE

  17. Software Schedule and Effort Estimate Example chart from SEER-SEM output:
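
The SEER-SEM chart itself is not reproduced here. As a stand-in, the sketch below uses a generic COCOMO-II-style parametric relationship (explicitly not SEER-SEM's proprietary model) to show how an effective-size number feeds an effort and schedule estimate; the coefficients and the 28.1 KSLOC input are illustrative assumptions, not calibrated values.

```python
# NOT SEER-SEM output: a generic COCOMO-II-style parametric sketch used only to
# illustrate how size drives effort and schedule. Coefficients are nominal,
# uncalibrated placeholders.

def effort_person_months(ksloc, a=2.94, b=1.10, eaf=1.0):
    """Effort (person-months) ~= a * KSLOC^b * effort adjustment factor."""
    return a * (ksloc ** b) * eaf

def schedule_months(effort_pm, c=3.67, d=0.32):
    """Schedule (calendar months) ~= c * Effort^d."""
    return c * (effort_pm ** d)

ksloc = 28.1                          # hypothetical total effective KSLOC
effort = effort_person_months(ksloc)
duration = schedule_months(effort)
print(f"Effort:        {effort:6.1f} person-months")
print(f"Schedule:      {duration:6.1f} months")
print(f"Average staff: {effort / duration:6.1f} people")
```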

  18. Software Schedule and Effort Estimate Example chart showing estimates for two different confidence levels:
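
One common way to produce estimates at two confidence levels is from a three-point estimate. The sketch below uses a PERT mean and standard deviation with a normal approximation; the optimistic/most-likely/pessimistic inputs and the 50%/80% levels are hypothetical, and your estimating tool may derive its confidence levels differently.

```python
# Placeholder sketch: derive two confidence-level effort numbers from a
# three-point (optimistic / most likely / pessimistic) estimate using a PERT
# mean and sigma with a normal approximation. All inputs are hypothetical.
from statistics import NormalDist

opt, likely, pess = 60.0, 80.0, 120.0     # person-months (hypothetical)
mean = (opt + 4 * likely + pess) / 6      # PERT mean
sigma = (pess - opt) / 6                  # PERT standard deviation

dist = NormalDist(mean, sigma)
for confidence in (0.50, 0.80):
    print(f"{confidence:.0%} confidence effort: "
          f"{dist.inv_cdf(confidence):5.1f} person-months")
```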

  19. Software Schedule and Effort Estimate Example spreadsheet of a very early estimate...
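
For a very early, spreadsheet-style estimate, a simple top-down rollup is often enough: divide effective SLOC by an assumed productivity, then spread the resulting effort across lifecycle phases. Everything in this sketch, including the productivity figure and the phase split, is a placeholder assumption.

```python
# A rough spreadsheet-style early estimate: total effort from effective SLOC and
# an assumed productivity, spread across lifecycle phases by percentage. The
# productivity figure and phase split are placeholder assumptions.

EFFECTIVE_SLOC = 28_100               # from the size estimate (hypothetical)
SLOC_PER_PERSON_MONTH = 250           # assumed productivity

PHASE_SPLIT = {                       # assumed effort distribution (sums to 1.0)
    "Requirements":         0.15,
    "Design":               0.20,
    "Code and Unit Test":   0.35,
    "Integration and Test": 0.20,
    "Management/Support":   0.10,
}

total_pm = EFFECTIVE_SLOC / SLOC_PER_PERSON_MONTH
print(f"Total effort: {total_pm:.1f} person-months")
for phase, frac in PHASE_SPLIT.items():
    print(f"  {phase:22s} {frac:4.0%}  {total_pm * frac:6.1f} PM")
```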

  20. Other Direct Costs This slide may be as simple as a list of items that might need to be purchased or as detailed as a pasted spreadsheet containing a prioritized list of cost details broken out by fiscal year. It depends on what makes sense for the particular estimate you are doing.

  21. SOFTWARE ESTIMATE DRIVERS

  22. Risks RISK: <Example: Given that the project requires extensive experience in XML, there is a high risk that a person with the requisite skills will not be available for this project.> • LIKELIHOOD: <x> CONSEQUENCE: <x> • MITIGATION: <Example: Extra training costs and a lower productivity rate were assumed.> • COST OF MITIGATION: <Example: XML Training for 2 people $3000, “Personnel Experience” in SEER-SEM was changed to “Low +”.> RISK: <Example: Given that the interfaces with the carrier are still not clearly defined, there is a high risk that this will extend the software schedule.> • LIKELIHOOD: <x> CONSEQUENCE: <x> • MITIGATION: <Example: Extra slack time and an additional person were added during testing to account for the likely rework. Additionally, an ICD is planned to be generated.> • COST OF MITIGATION: <Example: One additional person added during testing phase. Testing schedule extended 6 mos. Additional ½ person effort added during design phase to account for ICD.>

  23. Cost Drivers • <EXAMPLE: Cost drivers such as complexity, requirements volatility, personnel capabilities and experience, and number of interfaces.> • Safety-critical requirements and/or requirements for high security.

  24. CUSTOMER NEGOTIATED CHANGES TO THE SOFTWARE ESTIMATE

  25. Customer Negotiated Changes EXAMPLE: Discussed estimate with <Customer Name> on 4/15/2015: • Customer is only able to afford 4 software people during years 2 and 3, while the original estimate showed that we needed 5. • We agreed to de-scope the optics interface, which was going to be a significant driver during years 2 and 3 of development. • We also agreed that we will be able to keep all 4 software people in year 4 instead of downsizing to 3 people as shown in the original estimate. The additional person will be able to help finish writing test procedures that may have lagged behind due to having fewer people available during the development phase.
