OSP LL Final Review Agenda Thursday May 27, 2004



  1. OSP LL Final Review Agenda Thursday May 27, 2004 • 8:00 – 8:15 Welcome & Introduction Bob Hughes • 8:15 – 8:45 Lessons Learned Topics Bob Hughes • 8:45 – 9:15 Describe Process Part One Chris Crumbly • 9:15 – 9:30 Encyclopedia Demonstration Jose Roman • 9:30 – 9:45 Break • 9:45 – 11:00 Describe Process Part Two Roger Mellot • 11:00 – 12:00 Topics Described Chris Crumbly • 12:00 – 1:00 Lunch • 1:00 – 2:00 180° Feedback Session Roger Mellot • 2:00 – 2:15 Program Acceptance Review Plan Crumbly • 2:15 – 2:30 Closing Remarks Chris Crumbly

  2. OSP LL ALTERNATE Final Review Agenda • 8:00 – 8:15 Welcome & Introduction Bob Hughes • 8:15 – 8:45 Describe Process Part One Chris Crumbly • 8:45 – 9:00 Encyclopedia Demonstration Jose Roman • 9:00 – 10:00 Topics Described Chris Crumbly • 10:00 – 10:15 Break • 10:15 – 12:15 Describe Process Part Two Roger Mellot • 12:15 – 12:30 Closing Remarks ALL

  3. DESCRIBE THE PROCESS PART ONE CHRIS CRUMBLY

  4. Purpose • The purpose of the Final Review is to • Bring closure to the OSP Lessons Learned Data Gathering Process • Provide metrics to the OSP team • Present the plan for the final products • Demonstrate the Encyclopedia

  5. Important Dates • Midterm – April 15 • LL Cutoff – April 23 • Themes (Listing) and Bibliography Submitted – May 10 COB • Final Package/Volumes Submitted – May 17 COB • Final at KSC – May 27 • Program Acceptance at MSFC – June 15

  6. Final Government Review (process flow; Figure 3, Lessons Learned Process / Volume LL Review Process): Define Review Period → Conduct Kickoff Meeting → Conduct Volume Reviews → Closeout Program Lessons Learned Process → To Code T

  7. OSP LL Categories

  8. OSP LL Final Products • All LL are to be condensed into one or more targeted summary presentations • OSP is bringing Roger Mellot on board to serve as a facilitator • As the process matures, new directions will be supplied via the LL Forum • Volume owners are requested to create THEMES to further categorize their lessons learned • The purpose of the May Final is to evaluate the THEMES in preparation for the summary presentations • Due to the number of THEMES, we developed TOPICS; OSP will create only one summary document, not presentations • The purpose of the June Program Acceptance Review is to review/accept the summary document

  9. LESSONS LEARNED SELECTION CRITERIA Mid-Term Selection Criteria for Top 15 Lessons Learned • Lessons Learned that are applicable to the OSP Team in general • Lessons Learned specific to Exploration & CEV • Lessons Learned applicable to NASA Program/Project Teams • For similar Lessons Learned, the most generic will be selected. Final Selection Criteria for Lessons Learned Topics • Lessons Learned Themes specific to Exploration & CEV • Lessons Learned Themes applicable to NASA Program/Project Teams

  10. OSP LL Tiger Team • Bill Arceneaux/Vehicle Engineering • Phil Weber/Launch Site Integration • Paul Gilbert/Integrated Operations • Bob Hughes/Chief Engineer • Jose Roman/Chief Engineer’s Office • Bill Jacobs/Requirements Mgt Office • D.K. Hall/Acquisition Mgt Office • Chris Crumbly/Projects and Engineering Office • Dale Thomas/Systems Management Office • Jim Beveridge/QTEC-SEIO • Roger Mellot/Facilitator • Shirley Brock/Administrative Assistant

  11. OSP Lessons Learned Metrics (hierarchy figure) • Tiger Team – Topics (16): TOPIC 1 … TOPIC 16 • Volume Leads – Themes (154): V01.T01 … V044.T02 • Initiators – Lessons (832): OSP.01.0135 … OSP.44.0025
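The metrics above reflect a three-tier roll-up: initiators submitted 832 individual lessons, volume leads grouped them into 154 themes, and the Tiger Team condensed those into 16 topics. As an illustrative sketch only (the record type and field names below are hypothetical, not part of the OSP encyclopedia), a tally of that kind could be computed from tagged lesson records as follows, using the ID formats quoted on the slide:

```python
from dataclasses import dataclass

@dataclass
class Lesson:
    # ID formats follow the slide: lessons "OSP.<volume>.<serial>", themes "V<volume>.T<nn>"
    lesson_id: str   # e.g. "OSP.01.0135"
    theme_id: str    # e.g. "V01.T01"
    topic: int       # 1..16, assigned by the Tiger Team

def roll_up(lessons):
    """Count lessons, distinct themes, and distinct topics (illustrative only)."""
    themes = {lesson.theme_id for lesson in lessons}
    topics = {lesson.topic for lesson in lessons}
    return {"lessons": len(lessons), "themes": len(themes), "topics": len(topics)}

# Two of the IDs quoted on the slide; the full data set would yield 832 / 154 / 16.
sample = [
    Lesson("OSP.01.0135", "V01.T01", topic=1),
    Lesson("OSP.44.0025", "V44.T02", topic=16),
]
print(roll_up(sample))   # {'lessons': 2, 'themes': 2, 'topics': 2}
```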

  12. Program Acceptance Review Plan • Purpose • Finalize the OSP Lessons Learned (LL) Task • Present the final products to the OSP Management Team (Big 3) • Demonstrate the Encyclopedia CD-ROM • Agenda • OSP LL Summary Presentation • OSP LL Encyclopedia Demonstration • OSP LL Encyclopedia Delivery • The Review will be held June 15 at MSFC, Bldg 4200, conference room P110, 1:30 – 3:30 PM • Attendance • OSP Government/Support Contractor Team Only

  13. ENCYCLOPEDIA DEMONSTRATION JOSE ROMAN

  14. TOPICS DESCRIBED CHRIS CRUMBLY

  15. The 16 Topics

  16. Theme Classification Definitions •  Trap – Challenges we saw, encountered, but did not conquer • Jewel – Traps we successfully avoided or conquered • Surprise – Challenges we did not foresee • Dropped Ball – Things we were responsible for but performed inadequately • Incomplete – Open issues, risks, challenges, or future work that should be addressed • ☺ Heads up – Near-term, time critical issue or action
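The remaining slides tag each theme with one of these six markers, so they amount to a small controlled vocabulary. As a minimal sketch (the enum below is illustrative, not an artifact of the OSP encyclopedia), the vocabulary could be captured like this:

```python
from enum import Enum

class Classification(Enum):
    """Theme classification markers, with the definitions given on slide 16."""
    TRAP = "Challenges we saw, encountered, but did not conquer"
    JEWEL = "Traps we successfully avoided or conquered"
    SURPRISE = "Challenges we did not foresee"
    DROPPED_BALL = "Things we were responsible for but performed inadequately"
    INCOMPLETE = "Open issues, risks, challenges, or future work that should be addressed"
    HEADS_UP = "Near-term, time critical issue or action"

# Example: the Human-Rating Plan theme (slide 29) is marked as a Jewel.
print(Classification.JEWEL.value)
```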

  17. 1. Fire, Aim, Ready? (Planning) Ready, Aim, Fire! Plan to succeed – OSP planning documents were not sufficiently mature when the decision was made to embark on an accelerated program. Parallel development during program execution deadened the sense of urgency to complete them. Message: It is a mistake to proceed with a program without at least a fully developed, documented, and communicated Program Plan and Systems Engineering Plan.  Trap • Program Plan and SEMP The OSP Program Plan, Systems Engineering Management Plan (SEMP) and related foundational plans (Risk Management, Configuration and Data Management, Records Plans, etc.) were not sufficiently mature when the decision was made to embark on an accelerated program. • Schedule Planning and Implementation An OSP schedule guidelines document was not prepared early enough in the program execution. • Planning for Launch Vehicles OSP initially charged the prime contractors with designing an entire space transportation system, including the launch VEHICLE. Soon thereafter, the Launch Services Program and OSP agreed that LSP would provide the launch SERVICES.

  18. 1. Fire, Aim, Ready? (Planning) (cont.)  Trap • Planning for ISS OSP became a major ISS component by virtue of the NASA Level 1 requirements, yet the ISS program was not fully engaged with OSP. • Program Shutdown NASA should establish procedures to perform an orderly shutdown of programs which are cancelled/terminated/transitioned. • Tailored Processes OSP generally implemented standard processes ad hoc. In many instances the program would have benefited from tailored processes. • Milestone Review Deliverables Contractor requirements traceability matrices did not fully address the issues of requirement rationale and allocation. • Rules of Engagement-Risk Government technical experts’ roles in the risk management process were not clearly defined or communicated. • Rules of Engagement-Other Uncertainty about the allowable interactions between NASA and contractor engineers hampered early and open participation. • LSP and OSP Cultures Two different cultures and paradigms (manned vs. unmanned) were brought together, but never merged successfully. • Government Concept The government concept’s role in the requirements analysis cycle (RAC) was clear, but its role in the design analysis cycle (DAC) was unclear.

  19. 1. Fire, Aim, Ready? (Planning) (cont.)  Trap • Flight Test Planning Flight test planning has a major impact on technical, schedule, and budget planning and should be considered in up-front planning. • Facilities Planning Facilities development was part of prime contractor responsibility yet insufficient time existed between OSP contract award and facility first use.  Dropped Ball • Level 1 Requirements Program planning includes planning from above. The process by which the Level 1 Requirements were developed was not clear to the program implementers. • Level 2 Requirements Development of the Level 2 requirements did not follow established systems engineering guidelines for allocation, inclusion of performance and functional requirements, validation, and feasibility assessments. • Systems Engineering and Integration The Program Systems Engineering Office was not viable due to inadequate staffing, unclear roles/responsibilities, inadequate planning, and multiple organizations performing systems engineering functions. • Trade Studies The trade studies were initially un-integrated.

  20. 2. Inter-Program Relationships and Dependencies Integrate the Integrators. OSP did not have totally effective relationships with the Launch Services Program (LSP) or International Space Station Program (ISSP) at all working levels. The program partners did not create mutually beneficial alliances, and the agency did not delegate sufficient authority and allocate sufficient resources to the program partners to resolve inter-program issues or establish clear means to reconcile disputes. Message: Memoranda of Understanding (MOUs) between programs should require commitment from above and describe implementation below and across the affected parties. • “Top Dog” The OSP Program was given requirements to provide crew transfer and rescue service to ISS utilizing Atlas and Delta launch services. No sole source of authority formally directed all three programs (OSP, ISS, LSP) to successfully achieve these requirements, and no one held them jointly accountable for creating and operating the caliber of relationships that would have been required for success. • International Space Station ISS and OSP were not positioned by the agency as “customer” and “supplier”. c. Launch Services Program OSP was a very different customer for LSP than they were used to. OSP wanted a development partner instead of a service provider.  Trap

  21. Jewel 3. Organizational Design Organize to execute. OSP was staffed with very capable people, yet was organized more around the people and their home organizations than around the work to be performed. The organizational structure must reflect the capability and the required input and output of each entity. Organizational capabilities and cultural differences must be addressed when establishing working relationships with outside organizations. Message: Design and staff the organization for the work, not to satisfy politics or make the masses happy. Force resolution of issues by product definition/exchange, schedule, and interoperability commitments prior to signing final agreements. • Inter-Program Integration Integration with the LSP and ISS was unclear. b. Center Divisions The OSP program was a diverse collection of experts from across the agency. However, when faced with infrastructure challenges, historically distributed specialties, and, most importantly, resource allocations, the organization split along center organizational boundaries.  Trap • Integrated Operations There was recognition that the Ground System must be worked in an integrated fashion with the flight system.

  22. 3. Organizational design (continued)  Dropped Ball Systems Engineering Office The Program Systems Engineering Office was not viable due to inadequate staffing, unclear roles/responsibilities, inadequate planning, and multiple organizations performing systems engineering functions.

  23. 4. Centralized Technical Management Who’s on First? Failure to quickly/clearly charter and mobilize a strong centralized technical authority with responsibility to make binding technical decisions across all elements within OSP and across program boundaries weakened the program’s Phase A technical implementation. Message: Establish a Systems Engineering and Integration role with the authority to integrate all program technical elements, organizations, and other affected parties. Establish a Program Chief Engineer with the overall technical authority to negotiate technical agreements among programs and resolve impasses within the program. • Systems Engineering and Integration Authority Systems Engineering and Integration (SEI) must be granted authority commensurate with responsibility. • Government Trades NASA should not rely on the contractors to trade Government assets that are beyond their control. c. Central Collection of Risks Development of a system that involves multiple programs requires a central collection and control of individual program risks that affect the overall system.  Trap

  24. 4. Centralized Technical Management (continued)  Dropped Ball • SEMP Under Pressure The greater the schedule pressure, the more important it is to establish, follow, and enforce a Systems Engineering Management Plan early. ☺Heads up • SEMP A good Systems Engineering Management Plan (SEMP) must be established early to synchronize program planning, requirements development, and program execution.

  25. 5. Risk driven management Risk Management is not a mantra. Risk Driven Management was intended to be at the center of the Space Launch Initiative (SLI)/OSP modes of operation. Many changes in risk systems, a shortage of risk experts, and ineffective application demoted risk management to a box to be checked. Message: Define the risk management system to be used by the program at program/project onset, integrate it with partner programs’, contractors’, and stake-holders’ risk programs and manage by it. • Integrate Risk Management OSP did not develop an integrated process for assessing and scoring system risks. Nor was OSP risk management integrated with the LSP or ISS program risks. b. Risk Management Tools OSP Risk management tools were not vetted, trained for, nor fully implemented.  Trap

  26. 6. The requirements dilemma Stay true to the process! The lack of a disciplined, well-communicated Level 1 requirements development process, open or architecture-free Level 2 requirements, and contractor-developed Level 3 requirements caused conflict within the OSP program. Rigorous requirements development processes consistent with accepted system engineering practices are critical to establishing a good requirements foundation. The Requirements Development Team (RDT) attempted to insert rigor during the Level 2 requirements development in a short time and with a laser-like focus. But for team members not in the laser’s light, the RDT was a blackout. Message: Rigorous, well-communicated requirements development processes and rationale are paramount to stakeholders’, technical experts’, and contractors’ interpretation of the requirements. • Communication of Requirements The OSP Program encountered numerous problems with communication in the development of requirements. • OCD in Parallel The Operations Concept Document development was performed in parallel with the system requirements, which resulted in disconnects. • Multiple Level 2 Documents The Level 2 requirements, spread across multiple documents, made for confusion.  Trap

  27. Jewel Surprise 6. The requirements dilemma (continued)  Trap • Too much Trade Space The notion that “saying less gives the contractor more freedom” isn’t entirely true. • Validation Issues Validation became a difficult task when it came to dealing with multiple contractors. • Requirements Philosophy A requirements philosophy paper was written to help communicate the message. • Level 1 Requirements The Level 1 requirements development process was lacking in rigorous system engineering.

  28. 7. Crew Survival and Human Rating It’s for the crew! The NASA human rating requirements are not yet "field tested" by application to development through design or flight certification. OSP learned enough just incorporating them into its requirement baseline to warrant an update to the NASA requirement documents. Message: DO NOT underestimate the impact of human rating and crew survival requirements on ALL aspects of a human spaceflight program. • Water Survival Failure to incorporate a complete and validated set of crew survival requirements into the program early led to substantial architectural redesign. b. Human Rating of ELV If ELVs are planned to be used to launch crewed spacecraft, the human rating adequacy of the ELVs should be assessed and documented prior to committing the program to their use.  Trap

  29. 7. Crew Survival and human rating (continued)  Trap c. Feasibility Assessment The process for demonstrating requirements feasibility was unclear. Jewel d. Human Rating Plan The program’s decision to address the NPG 8705.2 requirements early by developing the Human-Rating Plan (HRP vol I) in parallel with the rest of the level II requirements development had a very positive effect.

  30. Incomplete 7. Crew Survival and human rating (continued) e. North Atlantic Aborts It was identified early in the program that aborts into the North Atlantic created many design and operational challenges. f. Immature Human Rating Requirements The Human Rating Requirements and Guidelines did not clearly identify the phase of the Program where they apply, nor was it clear how the requirements could be tailored without imparting subjectivity into how the tailoring was approved.

  31. 8. Clashes and Walls (Communication) Can we talk? OSP systems design activities were impacted as communications became an issue, program-to-program cultures clashed, and internal program integration encountered obstacles and walls. OSP achieved a level of collaboration among various centers that may be unprecedented, yet conflicts still remained. Message: Programs cannot expect to modify other programs’ and organizations’ processes without, first, agreement, and, second, proper implementation. MOUs should require commitment from above and describe implementation below and across the affected parties.  Trap a. Effective 2-way Communication in any large program is problematic. Program forums did not allow for discussion/questions from the implementing team members so that the basis of the marching orders could be readily understood. b. Cultural and priority differences (crewed vs uncrewed) between the Launch Services Program (LSP) and the OSP organization resulted in a suboptimized system design.

  32. 8. Clashes and Walls (Communication) (cont.)  Trap c. ISS program was not fully engaged/integrated with OSP program Surprise • Constant clash between technical experts and the OSP Program philosophy (DoD-like) for design participation

  33. 9. Rigor & Integrity Don’t give in! The OSP Program technical content and review standards were diluted as the program accelerated and budgets adjusted. Technical products were compromised by abandoning established processes, tools, and standards of performance. The intent was to catch up later, but the bow wave was growing. Message: Altered plans produce altered results. Address technical excellence and the increase in program risk when modifying the program to meet revisions to schedules and budgets.  Trap • Unfair Trades The OSP contractors were asked to perform trades with major political implications such as the location of the OSP Mission Control Center as well as the usage of other major facilities. • b. Feasibility Assessment The process for demonstrating requirements feasibility was unclear. Programs should define a rigorous process for technical feasibility (in the SEMP) and especially feasibility of Human Spaceflight on existing ELVs. • Life Cycle Cost (LCC) Estimate Requirements OSP requirements for LCC estimates were not clearly defined.

  34. 9. Rigor & Integrity (continued)  Dropped Ball • c. Synchronize Development Requirement development, analyses, and system design activities were not synchronized. • d. Review Integrity The System Requirements Review (SRR) was successfully held according to the schedule, yet a large part of the contractor-submitted documentation was ‘de-emphasized’ by the Program, and there was little or no feedback on the comments and Review Item Discrepancies (RIDs) submitted on the documents that were reviewed. Only Program (government) developed documentation was reviewed for the success criteria.

  35. 10. Design Participation & Rules Of Engagement Team of Teams. Because of our zeal to protect the competitive environment, OSP had problems defining the proper technical engagement with the contractor teams. In the end, we were getting real traction in the right direction with the participation of our experts in the prime contractors’ design efforts. But we started staffing and empowering the engineering expert teams later than we should have, and the early constraints on their participation were too severe. Message: In competitive environments, decide, define, and communicate up front how NASA participation is to be conducted with the contractors, and staff to support that participation.  Trap • Systems Experts OSP did not form a standing team of human spaceflight system experts until after the contract data deliverables were defined and many of the government and prime contractor system-level trade studies and requirement development analyses were completed. • Design Participation During Competition The single largest start-up obstacle we encountered when trying to get the engineering teams engaged on OSP was the limits on their participation. • Deliverables—required versus needed The newly established engineering expert teams found themselves inundated at major program milestone reviews with contractual deliverables that were heavy on paper and light on data.

  36. 10. Design Participation & Rules Of Engagement (cont.)  Dropped Ball • Participation in Risk Management The role of the design participant must include responsibilities for control of the government’s risk. • Release of Government Data OSP’s initial policy was not to release any government trade or requirement development analysis data. Ultimately, OSP recorded very useful guidance shortly before the SDR that allowed the teams to share data unless it potentially implied a government-preferred solution, or was proprietary, or was pre-decisional or acquisition sensitive.

  37. Jewel 10. Design Participation & Rules Of Engagement (cont.) • Rules of Engagement OSP developed and tested a set of Rules of Engagement by which we participated with two competing contractors equally but separately. • Data Release Policies OSP developed, but did not test, policies for government trade study data release to the two competing contractors. Surprise • Current Competencies New human spaceflight systems are not developed frequently, and less often still via competitive acquisition. • Contractors’ view of design participation We found during lessons-learned discussions with our prime contractors that they were not fully aware of the design participation rules of engagement we had established for ourselves.

  38. 11. Technical Standards & Documents What do you REALLY want? OSP implementation was impacted by a lack of shelf-ready standards and specifications for human-rated space flight. Additionally, OSP initially chose not to apply NASA-preferred industry and Government standards common to high-reliability space systems. Message: Development of a minimum set of design standards specifically applied to support all future launch vehicle and spacecraft development will aid design efforts and prevent the requirements creep commonly experienced with gross use of applicable documents. ☺Heads up • Lack of Shelf-Ready Specifications Human-rated space flight is lacking shelf-ready standards and specifications in a number of areas. • Limit Applicable Documents Limiting applicable documents to a controlled set will save future development contractors from tailoring applicable documents to what they think the government wants. • c. Standardized Access to Standards Once these sets of standards are established for future development programs, they need to be organized in a central location, accessible to government and industry alike.

  39. 12. Long-lead time bombs Tick, Tick, Tick… Some of OSP’s unmitigated risks were long-lead issues such as the design and development of autonomous rendezvous and docking systems, construction and/or modification of facilities, and purchase of launch services for early test flights. These same issues appear to be transferable to the Exploration Initiative. Message: Recognize and resolve some common long lead issues/risks early in the program. ☺Heads up • a. Construction of Facilities Launch site facilities planning/design/construction or modification are typically 5 to 6 years in length for most major facilities, and should be considered immediately if the first flight is within that timeframe. • b. Integrated Health Management Planning Integrated health management (IHM) systems are often considered to be a “magic bullet”. But IHM is dependent on early definition of the system architecture as an integrated system. • c. Long Lead Development Items Specific long-lead items that should be addressed quickly include development and production of a docking mechanism (DM) and Mating Adapter (MA), as well as corresponding autonomous rendezvous system.

  40. 13. Fallout from stress cracks Stress Concentration Factor. (Issues worth repeating) A stress crack can grow only if the total energy applied increases or remains constant. Acceleration and budget reductions created stress cracks in the OSP management processes. Lack of defined plans and policies kept the energy level steady or climbing. Communication flow suffered, serial processes became parallel, rigor was diminished, record keeping got lax, and training went to the back burner. Message: Plans, procedures, and policies must be robust enough to withstand the inevitable change in schedules, budgets, and risks.  Trap • Review Criteria A specific example of stress cracks during the OSP formulation was the lowering of standards/success criteria for major Program reviews, such as SRR and SDR. • b. Tightening of the Inner Circle After each occurrence of increased stress (budget cut, schedule acceleration), the “inner circle” of the Program Management team seemed to shrink and data flow to the mid-level managers and workers was reduced. • c. Focus on Operability The OSP Program’s commitment to ensuring “operability” in the design fell by the wayside as pressure built.

  41. 13. Fallout from stress cracks (continued)  Trap • d. Find the “no” Point Ultimately, during times of “belt-tightening,” things will inevitably be given up… but it would be very advantageous to determine in advance (during the calm periods) the real priorities that cannot be allowed to degrade under increased schedule, budget, or political pressure. • Synchronize Development Requirement development, analyses, and system design activities were not synchronized. • Program Plan and SEMP The OSP Program Plan, Systems Engineering Management Plan (SEMP), and related foundational plans (Risk Management, Configuration and Data Management, etc.) were not sufficiently mature when the decision was made to embark on an accelerated program. • Schedule Planning and Implementation An OSP schedule guidelines document was not prepared early enough in the program execution.

  42. 14. Credibility Tools & Models Show me the data. Credibility in cost and schedule estimating is key in convincing stakeholders that a program can execute within resource constraints. OSP recognized this and placed great emphasis on developing cost tools and models that produce credible results. A significant lack of clarity exists in operations cost models for human space flight vehicles. Models were being developed and vetted, but more work is required. “Smart buyer” technical tools and models require a comparable emphasis. Our government evaluators must be able to independently validate contractors’ analyses, tools, and models. Message: Place emphasis on providing credible cost and schedule estimations to stakeholders. Take the time to create a NASA team of smart buyers.  Trap • a. Adequate Operations Cost Estimating Tools Obtaining adequate operations cost estimates was a struggle throughout the program studies. • b. Contractor Trade Studies With Insufficient Data Trade studies were being evaluated in some cases without enough background information or knowledge. • c. Design Participation And Rules Of Engagement Teams of human spaceflight system experts and integrated system analysts should have been formed early enough to collaborate on trade study and requirement development analysis design, in order to assure the relevance of objectives and the validity of assumptions, initializing conditions, and models.

  43. Incomplete Jewel 14. Credibility Tools & Models (Continued) ☺Heads up • d. Human Space Flight Operations Cost Model The lack of a cost modeling tool for human flight operations significantly increases the time and effort required to generate costing figures. • e. Operations Cost Model The OSP program recognized the need for a valid Human Space Flight Operations Cost Model to support this and future human space flight programs and had begun development. • OSP Cost Estimating Team The lack of cost credibility in the SLI Program required a fresh approach to providing supportable, defendable, and credible life cycle cost estimates. • Independent External Review Of NASA Cost Estimating Processes The lack of cost credibility in the SLI Program required that the government determine, via independent sources, if the best possible cost estimating and analysis practices were being followed on OSP.

  44. Jewel 15. Inspire to Acquire Engineers as Smart Buyers. NASA generally procures major developments years apart rather than months apart. OSP started up the learning curve early by creating the Acquisition Management Office (AMO) specifically for developing the acquisition strategy, Request For Proposal (RFP) development, and Source Evaluation Board (SEB) support. AMO brought our procurement specialists in at the beginning, trained in advanced procurement methods, benchmarked other successes, and included the contractors in the RFP development. Message: Plan how you buy as carefully as what you want to buy. • Transition from Development to Operations During development of the Design and Development RFP, there was much discussion concerning when and how to transition to an operational system to ensure appropriate attention to the transition and to ensure flexibility for the government. • Acquisition Management Office OSP started up the acquisition learning curve early by creating the Acquisition Management Office (AMO) specifically for developing the acquisition strategy, building the RFP, and supporting the SEB. • Acquisition Training A training regimen was instituted very early in a program’s life for acquisition, procurement, and Source Evaluation Boards.

  45. Incomplete Jewel 15. Inspire to Acquire (continued) • d. Competitive Contractor Feedback Government/contractor meetings that involved all bidders were of almost no value, because the contractors would not communicate openly. • e. Coalition for Acquisition RFP development was a successful collaboration of engineers, managers, and procurement specialists from across the country. • f. Ongoing Work Leading to Proposal The OSP program and the SEB worked very hard at creating a linear relationship between the architecture work being conducted and the proposal to be delivered in response to the full-scale development RFP.

  46. Jewel 16. Operational Design What do the users think? “A user will tell you anything you ask about—and nothing more.”[1] So, OSP asked many, asked early, and asked often. An operational focus was applied by aligning operations equivalent to engineering in the organization and keeping flight, ground, and integrated operations engaged throughout the development. Message: Systems operations must be planned for up front and forced into the design and facilitization. [1] Project Management, Kerzner • Integrated Operations There was recognition that the ground system must be worked in an integrated fashion with the flight system. • b. Operators in the Acquisition Operational conscience extends into SEB membership.

  47. Part C. Heads Up and Other Considerations • NASA is not a frequent, nor an efficient, acquisition organization. Examine the cultures of DoD acquisition and NASA acquisition in order to strike the most efficient balance. • Trades, analyses, and studies that were performed to increase our “smart buyer” posture endangered the opportunity to get pure contractor-derived designs. • When considering commercially available secure, web-based IT tools, consider the stress that the large volumes of data required for program implementation will place on the system. • The National Environmental Policy Act approval process should be addressed early. • Sustaining engineering philosophy needs to consider the long-term outlook for part availability on a potentially small fleet of spacecraft. • NPG 7120.5 and SP 6105 are largely targeted to single-contractor programs/projects. Some of their guidance is not achievable with competing contractors. • The industrial base for human space flight vehicles is not as large, as knowledgeable, or as engaged as we had assumed. • Volume 11, Theme 5 is a collection of data that may be very applicable to Exploration activities. Some of the issues researched and discovered are: • GPS backup is required during some flight regimes • Advantages of non-toxic/environmentally friendly propellants • Liquid propellant systems are fast enough to perform aborts from the pad in the event of an accident

  48. Part C. Heads Up and Other Considerations (cont.) • Development of habitation mockups prior to SDR was beneficial for human engineering assessments. Little has been done in the assessment of acceptable capabilities for human volume requirements since Apollo. • Some of the common business reporting Data Requirements Document Templates (MA-010, for example) do not require the specificity of data and analysis needed to provide accurate reporting. • Some of the business management tools used by OSP were very useful: • Configuration Control of Program (POP) Guidelines • RPS Database • “Smartbooks” containing the most relevant resource information, supplied to the management team • WBS/UPN structured database for data collection

  49. DESCRIBE PROCESS PART TWO ROGER MELLOT

  50. CLOSING REMARKS ALL
