
A Stakeholder Benefit Approach to System Architecture for Exploration



  1. A Stakeholder Benefit Approach to System Architecture for Exploration Bruce Cameron January 19, 2007

  2. Questions we’re trying to help answer • How do we decide between operational requirements and value activities? • Ex: “How much mass should you allocate for science equipment?” • What value should you design for, and what value is independent of the design?

  3. Agenda • Part 1: Executable Stakeholder Models • Building stakeholder models • Sanity checks • Part 2: Linking Value to Architecture • Proximate Metrics • Results • Part 3: Recent NASA Studies

  4. Sustainability

  5. Research Question • Can benefit delivery models be used to differentiate between architectures? • Methodology: • Create a benefit model • Define test architecture variables • Link architecture to benefit using proximate metrics • Determine sensitivity of benefit to architectures • Secondary Question: How should the organization be designed to facilitate value delivery?

  6. Step 1: Create a Benefit Model • Define stakeholders • Construct benefit input-output models of stakeholders • Link outputs and inputs in a network model • Define rules for propagation of benefit in the model (how are inputs turned into outputs?) • Calibrate model using network statistics
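The network-building part of Step 1 can be sketched in code. A minimal sketch in Python, where the stakeholder names and flow labels are illustrative stand-ins for the study's full network:

```python
from collections import defaultdict

# Each flow is (source, sink, flow name): an output of `source` that
# addresses an input need of `sink`. These edges are illustrative,
# not the study's complete value network.
flows = [
    ("NASA", "US People", "Pride and inspiration"),
    ("US People", "Executive & Congress", "Political support (votes)"),
    ("Executive & Congress", "NASA", "Funding"),
    ("NASA", "Science", "Space acquired data"),
    ("Science", "NASA", "Science opinions and policy support"),
]

# Build the adjacency view used later to propagate benefit around loops.
adjacency = defaultdict(list)
for src, dst, flow in flows:
    adjacency[src].append((dst, flow))

for src in sorted(adjacency):
    for dst, flow in adjacency[src]:
        print(f"{src} --[{flow}]--> {dst}")
```

Linking outputs to inputs this way makes the model executable: benefit loops are just directed cycles in this graph.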

  7. Step 1a) Define stakeholders • Beneficiaries benefit from your actions • You have an outcome or output which addresses their needs • You are important to them • Stakeholders have a stake in your project • They have an outcome or output which addresses your needs • They are important to you • We chose eight stakeholder groups: US People, Executive and Congress, Educators, Media, Science, Security, International Partners, Economic

  8. Step 1b) Building a benefit model [Diagram: benefit input-output model for the US People stakeholder. Flows include NASA mission and event content, public opinion and policy support, space tourism services, personal taxes, science- and engineering-inspired students, media entertainment and information, stable and rewarding employment, security benefits, political support (votes), awareness of benefits, pride and inspiration, quality of life, knowledge of science and tech, and goods and services, including health, derived from space technology.]

  9. [Diagram: benefit input-output model for the Economic stakeholder. Flows include a skilled and motivated workforce, stable and rewarding employment, NASA market and contract funding, opinions and policy support, plans and progress reports, corporate taxes, space resource knowledge, goods and services (including health) derived from space technology, exploration systems, NASA space technology, commercial launch services, NASA launch and space services, space tourism services, media entertainment and information, security contract funding, industrial base, technology transfer ability, commercial space activity, GDP contribution, human capital, commercial launch ability, scientific knowledge, and space acquired data.]

  10. [Diagram: benefit input-output models for the Science, Exploration, and International Partners stakeholders. Flows include NASA educational material, NASA mission and event content, space acquired data, space resource knowledge, NASA instruments and modules, funding, exploration and science systems, NASA space technology, participation in international and NASA exploration missions, NASA science funding and science data, NASA events, science knowledge, policy direction, plans and progress reports, commercial and international launch services, internationally provided space systems, NASA launch and space services, media entertainment and information, a skilled and motivated workforce, international collaboration, exploration missions, knowledge supporting exploration, human capital, science opinions and policy support, access to high-visibility events, protection against claims of sovereignty, and stable and rewarding employment.]

  11. Step 1c) Value Network [Diagram: the full value network linking NASA with the Executive & Congress, US People, International Partners, Security, Media, Economic, Educators, Science, and Exploration stakeholders via flows such as taxes, funding, policy direction, political support (votes), international agreements and space systems, employment, contracts, goods and services (including health), space technology, science data and science knowledge, exploration systems, a skilled workforce, commercial launch, entertainment and information, and mission content.]

  12. Step 1d): Propagating Benefit • Trade between all types of flows using ‘perceived importance’ to the stakeholder • Assign a value to each link • Loop value is the product of its link values • A loop is only as strong as its weakest link • Example loop through Executive, NASA, Economy, and Security, with flows Funding (must have, extremely important, V = 1), Commercial Launch (one-dimensional, very important, V = 0.26), Contracts (one-dimensional, important, V = 0.19), and Opinions and Policy Support (must have, extremely important, V = 1): total score = 0.26 × 1 × 0.19 × 1 = 0.049
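The propagation rule on this slide (loop value as the product of link values) can be sketched as follows. The numeric values are the ones shown on the slide; the mapping of importance ratings to particular links is an assumption for illustration:

```python
from math import prod

# Map importance ratings to link values (numbers from the slide; the
# full rating-to-value table is assumed for illustration).
importance_value = {
    "must have / extremely important": 1.0,
    "one dimensional / very important": 0.26,
    "one dimensional / important": 0.19,
}

# One benefit loop as (link, flow, rating) triples; which rating goes
# with which link is an assumption, not the study's exact assignment.
loop_links = [
    ("Executive -> NASA", "Funding", "must have / extremely important"),
    ("NASA -> Economy", "Commercial launch", "one dimensional / very important"),
    ("Economy -> Security", "Contracts", "one dimensional / important"),
    ("Security -> Executive", "Opinions and policy support", "must have / extremely important"),
]

def loop_score(links):
    """Loop value is the product of its link values, so a loop is only
    as strong as its weakest link: any low value drags the product down."""
    return prod(importance_value[rating] for _link, _flow, rating in links)

print(round(loop_score(loop_links), 3))  # 1 * 0.26 * 0.19 * 1 -> 0.049
```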

  13. How many loop segments are possible? [Diagram: alternative flow paths between NASA and the Economy, including NASA market funding, exploration systems, industrial base (V = 0.19), plans and progress reports (V = 0.34), space tourism possibility, future market knowledge, and policy support for NASA.]
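One way to answer this counting question is a depth-first enumeration of simple cycles through a chosen stakeholder. The toy graph below is illustrative, not the study's full network:

```python
# Toy value network as a directed graph: stakeholder -> list of
# stakeholders it delivers a flow to (edges are assumptions).
edges = {
    "NASA": ["Economy", "Science"],
    "Economy": ["Executive", "NASA"],
    "Science": ["Executive"],
    "Executive": ["NASA"],
}

def loops_through(start, graph):
    """Enumerate simple cycles that start and end at `start`,
    visiting no other node twice."""
    found = []
    def dfs(node, path):
        for nxt in graph.get(node, []):
            if nxt == start:
                found.append(path + [start])   # cycle closed
            elif nxt not in path:
                dfs(nxt, path + [nxt])          # extend the path
    dfs(start, [start])
    return found

for loop in loops_through("NASA", edges):
    print(" -> ".join(loop))
```

On this toy graph the search finds three loops through NASA; on the full network the same enumeration yields the loop counts used in the ranking slides that follow.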

  14. Step 1e: Benefit Sanity Checks • Based on the information provided, we can compute a number of statistics on the model.

  15. [Table: the top-ranked benefit loops. Examples include NASA → Executive → US People → NASA loops closed by plans and progress reports, policy direction, funding, stable employment, and votes; loops through Science closed by science funding, science opinions and policy support, exploration systems, and science systems; and a NASA → Economy → NASA loop closed by contracts.]

  16. Distribution of values [Chart: distribution of loop values, with break-points marked at the top 8, top 17, and top 62 loops.]

  17. Stakeholders weighted by ranking

  18. NASA Outputs Weighted by Ranking • Why is NASA Science Funding ranked so high? • It appears in the most loops (89, cf. 59 for the next highest) • Science knowledge is linked to many other flows • Science Data and Science Funding are both inputs to Science Knowledge, but Science Funding is also linked to other Science stocks, whereas Science Data isn’t.

  19. NASA Inputs Weighted by Ranking

  20. Step 2: Linking Architecture to Value • Value is subjective in the eyes of the beneficiary and hard to measure • We use proximate measures, i.e. trajectory measures toward value • Metrics must differentiate between architectures! • In our last test, only 67% of flows enabled architecture differentiation

  21. Science Data Metric [Diagram: five 0-to-1 utility curves, one per architecture variable — rover speed, accessible area, total crew science hours, data acquisition rate, and diversity of sites — each representing stakeholder input. The utilities are multiplied together, combining quantity-of-data and quality-of-data products into the Science Data metric.]
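The metric-product construction on this slide can be sketched as follows. The linear utility curves, variable names, and break-points are assumed for illustration; the study elicited its curves from stakeholders:

```python
def linear_utility(x, worst, best):
    """A 0-to-1 utility curve: linear between `worst` and `best`,
    clamped outside that range."""
    u = (x - worst) / (best - worst)
    return max(0.0, min(1.0, u))

# (value, worst, best) per architecture variable; all numbers are
# hypothetical stand-ins for an elicited utility curve.
architecture = {
    "rover_speed_kmh": (2.0, 0.5, 5.0),
    "accessible_area_km2": (100.0, 10.0, 400.0),
    "crew_science_hours": (300.0, 50.0, 500.0),
    "data_rate_mbps": (10.0, 1.0, 50.0),
    "site_diversity": (4, 1, 8),
}

def science_data_metric(arch):
    """Multiply the per-variable utilities into one proximate metric,
    so a near-zero utility on any variable collapses the whole score."""
    metric = 1.0
    for value, worst, best in arch.values():
        metric *= linear_utility(value, worst, best)
    return metric

print(science_data_metric(architecture))
```

Because the utilities are multiplied rather than averaged, the metric differentiates sharply between architectures: an architecture cannot buy back a failing variable with excellence elsewhere.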

  22. Architecture Value Test Cases

  23. Ranges for Science Stocks and Inputs [Charts: value ranges for the science stocks and the science inputs; blank bars represent variable = 1.]

  24. Study 1: NASA Lunar Architecture • Robotic precursors to characterize the environment and test capabilities • Focused campaign for outposts at a polar site • Released December 2006 [Photo courtesy NASA ESAS Report]

  25. Study 1: NASA Lunar Architecture • Difficult to make use of a prioritization • Requires indication of how progress can be measured against these goals • Danger of ‘ticking the box’ for theme support, rather than defining what level of effort is required to accomplish the goal Goal Statement Slide from Deputy Administrator's presentation of Dec. 4, 2006

  26. Study 1: NASA Lunar Architecture • 180 objectives from broad stakeholder process • Objectives are written with widely different architecture scopes, which could be avoided by examining real stakeholder value • Only ~40% of LAT objectives will reasonably differentiate between architectures • Mix of solution-specific and solution-neutral objectives • Develop interactive video games based on lunar exploration to generate revenue and engage the public • Provide surface mobility capabilities to move crew outside the local area of a lunar outpost

  27. Study 1: NASA Lunar Architecture • Does not capture ‘exploration firsts’, either robotic or manned, as a public engagement mechanism • No detailed surface mobility objectives, and no coupling to science or exploration (our process would have caught these!) • Stakeholder analysis enables a sort of ‘completeness check’ • Our process enables a weighting among themes via decomposition to value flows • Required mapping objectives to theme satisfaction

  28. Study 2: Lunar Robotic Architecture Study • 60-day exploratory study • “Do we need robotic [precursor] missions at all? If so, why and under what conditions?” • Baseline includes an orbiter, a fixed lander, a mobile lander, a rover, and a communications relay [Images: NASA Polar Rover concept; Apollo-era robotic probe, Surveyor 3]

  29. Study 2: Lunar Robotic Architecture Study • Goals of the study are generally well written • Metrics include extensibility / flexibility, but rate requirements instead of architectures • But no evidence of evaluation against the metrics! • Requirements process: • Identify the full scope of requirements • Evaluate which ones might be useful in different scenarios • If a requirement appears in any scenario, it becomes part of the LRAS baseline • Time-phasing of requirements against design deadlines • Rough matching of requirements to timing through construction of different excursion options against pre-decided mission options • Bias #1: Because of the lack of ‘constellation’ requirements, the process is biased towards including lots of capabilities in the requirements • Bias #2: Because mission types were fixed beforehand, the emphasis is ‘what can we do on this mission’ rather than ‘what minimum work needs to be done’ • Problem: Eventual missions are not related back to goals and value

  30. Organizational Implications • Areas of the organization impacted by architecture should be positioned to: • Provide input to the architect • Receive and translate the benefit to stakeholders • Be closer to the architecture than areas not affected • Where architectural choices settle conflicts between stakeholders, these decisions should be expressed to stakeholders to build buy-in.

  31. Organizational Implications • Organization should be aligned with its output: • How does NASA coordinate relationships with other agencies? • No arbiter / system architect making trade-offs between the types of outputs • International collaboration with NASA is a decentralized process • Where should this work be useful? • Many stakeholders • Fractionated power base • We differentiate, for better or worse, between activities related to external value, and internal activities.

  32. What Does Stakeholder Analysis Mean? • Define stakeholders & beneficiaries • Elicit needs and their relative importance. Look at projected output - incremental & fractionated, or simple & all at once? • Determine relative breakdown of stakeholder power via inputs • Look for realistic mechanisms for stakeholder input (late in process?) • Determine up front conflicts between needs before they become conflicting requirements. Cost or schedule or performance?

  33. Conclusion • Created network to help prioritize needs in a public engineering project • Demonstrated architecture discrimination by benefit delivery • Applicable to current NASA decisions • Weaknesses • No time-phasing • Metrics must be well calibrated, and are only approximations • Doesn’t trade internal capabilities against external value
