A Survey of Systems Engineering Effectiveness

A Survey of Systems Engineering Effectiveness

by: NDIA Systems Engineering Effectiveness Committee

INCOSE - Orlando Chapter

Geoff Draper

Harris Corporation

[email protected]

February 28, 2008


Agenda

  • Introduction – NDIA Systems Engineering Division (SED)

    • Organization and Committees

  • NDIA Systems Engineering Effectiveness Committee:

  • A Survey of Systems Engineering Effectiveness

  • http://www.sei.cmu.edu/pub/documents/07.reports/07sr014.pdf

NDIA SE Division web page:

http://www.ndia.org/Template.cfm?Section=NDIA_Divisions_Page&Template=/TaggedPage/TaggedPageDisplay.cfm&TPLID=3&ContentID=677



Key NDIA SE Division Initiatives

  • OSD (A&T) Initiatives

  • CMMI Co-Sponsor

  • Conferences

    • NDIA Systems Engineering Conference

    • CMMI Technology Conference

    • Net-Centric Operations Conference

  • Committees

  • Task Groups / Workshops

  • Awards

    • Ferguson Award for SE Excellence

    • Top 5 Government Programs


Effective Systems Engineering: What’s the Payoff for Program Performance?

NDIA Systems Engineering Effectiveness Committee

CMMI Technology Conference

November 15, 2007


Does this sound familiar?

These are the ASSERTIONS, but what are the FACTS?


The Problem

  • It is difficult to justify the costs of SE in terms that program managers and corporate managers can relate to.

    • The costs of SE are evident

      • Time

      • Effort

    • The benefits are less obvious and less tangible

      • Cost avoidance (e.g., reduction of rework from interface mismatches)

      • Risk avoidance (e.g., early risk identification and mitigation)

      • Improved efficiency (e.g., clearer organizational boundaries and interfaces)

      • Better products (e.g., better understanding and satisfaction of stakeholder needs)

How can we quantify the effectiveness and value of SE? How does SE benefit program performance?


Systems Engineering Effectiveness Survey (2004-2007)

  • Hypothesis: Effective performance of SE best practices on a development program yields quantifiable improvements in program execution (e.g., improved cost performance, schedule performance, technical performance).

  • Objectives:

    • Characterize effective SE practices

    • Correlate SE practices with measures of program performance

  • Approach:

    • Distribute survey to NDIA companies

    • SEI analysis and correlation of responses

  • Survey Areas:

    • Process definition
    • Project planning
    • Risk management
    • Requirements development
    • Requirements management
    • Trade studies
    • Interfaces
    • Product structure
    • Product integration
    • Test and verification
    • Project reviews
    • Validation
    • Configuration management
    • Metrics


The Challenge: Previous Studies - Summary

Mink, 2007


The Challenge: Supporting Evidence

Gruhl, Werner (1992), Lessons Learned: Cost/Schedule Assessment, Internal Presentation, NASA Comptroller’s Office

Honour, Eric (2004), Understanding the Value of Systems Engineering, Proceedings of the 14th Annual INCOSE International Symposium


Survey Development

Survey content is based on a recognized standard (CMMI).

CMMI-SE/SW/IPPD v1.1:

  • 25 Process Areas
  • 179 Goals
  • 614 Practices
  • 476 Work Products

Systems Engineering-related filter (retains what is considered significant to Systems Engineering):

  • 14 Process Areas
  • 31 Goals
  • 87 Practices
  • 199 Work Products

Size constraint filter:

  • 13 Process Areas
  • 23 Goals
  • 45 Practices
  • 71 Work Products
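To make the two-stage down-selection concrete, here is a minimal sketch (Python; the record structure, the two sample practices, and the size-budget rule are illustrative assumptions, not the committee's actual selection criteria) of filtering a practice catalog first for SE relevance and then for survey size:

```python
from dataclasses import dataclass

@dataclass
class Practice:
    process_area: str
    name: str
    se_related: bool   # considered significant to Systems Engineering
    survey_cost: int   # rough cost (question count) of covering it in a survey

# Hypothetical fragment of the 614 CMMI-SE/SW/IPPD v1.1 practices
catalog = [
    Practice("Requirements Development", "Elicit stakeholder needs", True, 1),
    Practice("Organizational Training", "Maintain training records", False, 1),
    Practice("Technical Solution", "Perform make/buy/reuse analyses", True, 3),
]

# Stage 1: Systems Engineering-related filter (614 practices -> 87 in the survey)
se_related = [p for p in catalog if p.se_related]

# Stage 2: size-constraint filter to keep the questionnaire short (87 -> 45)
SIZE_BUDGET = 2
survey_practices = [p for p in se_related if p.survey_cost <= SIZE_BUDGET]

print([p.name for p in survey_practices])  # ['Elicit stakeholder needs']
```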


Survey Methodology (Conducted: 2004-2007)


Analysis

  • Perf = f (PC, PE, SEC, AC)

  • where:

    • Perf = Project Performance
    • PC = Project Challenge
    • PE = Project Environment
    • SEC = Systems Engineering Capability
    • AC = Acquirer Capability

  • SEC can be further decomposed as:

    • Project Planning

    • Project Monitoring and Control

    • Risk Management

    • Requirements Development and Management

    • Technical Solution

      • Trade Studies

      • Product Architecture

    • Product Integration

    • Verification

    • Validation

    • Configuration Management

    • IPT-Based Capability

SE capabilities and analyses are fully defined by mappings of associated survey question responses
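Since SEC and its sub-capabilities are defined by mappings of survey question responses, the roll-up can be sketched as follows (a minimal illustration only: the 1-4 response scale, the question-to-capability mapping, and the equal-weight averaging are assumptions, not the report's actual scoring):

```python
# Map each sub-capability to its survey questions' responses (1-4 scale).
# The capability subset and response values here are hypothetical.
responses = {
    "Risk Management": [4, 3, 4],
    "Requirements Development and Management": [3, 3, 2, 4],
    "Trade Studies": [2, 3],
}

def capability_score(values: list) -> float:
    """Score one sub-capability as the mean of its question responses."""
    return sum(values) / len(values)

def composite_sec(resp: dict) -> float:
    """Equal-weight average of sub-capability scores (an assumption)."""
    scores = [capability_score(v) for v in resp.values()]
    return sum(scores) / len(scores)

print(round(composite_sec(responses), 2))  # 3.06
```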


Analysis: Validation of Survey Responses

Analyzed distributions, variability, and relationships for Project Challenge (PC), Overall SE Capability (SEC), Acquirer Capability (AC), and Project Performance (Perf) to ensure statistical rigor and relevance.


Total SE Capability (SEC) vs. Project Performance (Perf)

Projects with better Systems Engineering Capabilities deliver better Project Performance (cost, schedule, functionality)


Relating Project Performance to Project Challenge and SE Capability

Project challenge factors:

  • Life cycle phases

  • Project characteristics (e.g., size, effort, duration, volatility)

  • Technical complexity

  • Teaming relationships

Projects with better Systems Engineering Capabilities are better able to overcome challenging environments


Results 1. Product Architecture and Performance

Projects with better Product Architecture show a "Moderately Strong / Strong" positive relationship with Performance.


Results 2. Trade Studies and Project Performance

Projects with better Trade Studies show a "Moderately Strong / Strong" positive relationship with Performance.


Results 3. Technical Solution and Project Performance

Projects with better Technical Solution show a "Moderately Strong" positive relationship with Performance.


Results 4. IPT-Related Capability and Performance

Projects with better IPTs show a "Moderately Strong" positive relationship with Performance.


Results 5. Requirements and Performance

Projects with better Requirements Development and Management show a "Moderately Strong" positive relationship with Performance.


Results: Summary of Process Relationships

(Chart: relationship strength per process area; legend: Strong Relationship, Moderately Strong to Strong Relationship, Moderately Strong Relationship, Weak Relationship.)


Results: Summary of Relationships - Composite

(Chart: relationship strength for composite measures; legend: Strong Relationship, Moderately Strong to Strong Relationship, Moderately Strong Relationship, Weak Relationship.)


Results - Reqts + Tech Solution controlled by Project Challenge

Project challenge factors:

  • Life cycle phases

  • Project characteristics (e.g., size, effort, duration, volatility)

  • Technical complexity

  • Teaming relationships

Projects with higher Requirements and Technical Solution capability are better able to achieve higher performance even in challenging programs


Summary

SE Effectiveness

  • Provides credible, measured evidence about the value of disciplined Systems Engineering
  • Affects the success of systems-development projects

Specific Systems Engineering Best Practices

  • Highest relationships are to activities on the "left side of the SE Vee"

  • The environment (Project Challenge) affects performance too:

    • Some projects are more challenging than others ... and higher challenge affects performance negatively in spite of better SE
    • Yet good SE practices remain crucial for both high- and low-challenge projects


Potential Next Steps

  • Provide recommendations for action upon survey findings

  • Conduct additional follow-on surveys and analysis of collected data

    • IV&V

    • Broadened sample space

    • Trending

    • Improvements to survey instrument

  • Survey system acquirers


DoD Systemic Root Cause Analysis: Why Do Projects Fail?

  • Root causes from DoD analysis of program performance issues appear consistent with NDIA SE survey findings.

  • Reference: Systemic Root Cause Analysis, Dave Castellano, Deputy Director, Assessments & Support, OUSD(A&T); NDIA Systems Engineering Conference, 2007, and NDIA SE Division Annual Planning Meeting


Acknowledgements


SE Effectiveness Points of Contact

Al Brown [email protected]

Geoff Draper [email protected]

Joe Elm [email protected]

Dennis Goldenson [email protected]

Al Mink [email protected]

Ken Ptack [email protected]

Mike [email protected]


Backup

NDIA SE Effectiveness Survey

Analysis Slides


Conclusions & Caveats: Consistent with "Top 10 Reasons Projects Fail*"

  • Lack of user involvement

  • Changing requirements

  • Inadequate Specifications

  • Unrealistic project estimates

  • Poor project management

  • Management change control

  • Inexperienced personnel

  • Expectations not properly set

  • Subcontractor failure

  • Poor architectural design

Above Items Can Cause Overall Program Cost and Schedule to Overrun

* Project Management Institute. Matching items noted in RED.


Conclusions & Caveats: Consistent with "Top 5 SE Issues*" (2006)

  • Key systems engineering practices known to be effective are not consistently applied across all phases of the program life cycle.

  • Insufficient systems engineering is applied early in the program life cycle, compromising the foundation for initial requirements and architecture development.

  • Requirements are not always well-managed, including the effective translation from capabilities statements into executable requirements to achieve successful acquisition programs.

  • The quantity and quality of systems engineering expertise is insufficient to meet the demands of the government and the defense industry.

  • Collaborative environments, including SE tools, are inadequate to effectively execute SE at the joint capability, system of systems, and system levels.

* OUSD AT&L Summit. Matching items noted in RED.


Summary: SE Relationships to Project Performance


Summary: SE Relationships to Project Performance (continued)

Highest scoring SE capability areas in Higher Performing Projects*:

Risk Management; Requirements Development and Management; IPTs

*Based on small partitioned sample size

Lowest scoring SE capability areas in Lower Performing Projects*:

Validation; Architecture; Requirements Development and Management


Terminology and Notation: Distribution Graph

(Annotated example chart: a histogram of response frequencies, marking the median, interquartile range, outliers, data range, and sample size, i.e., the number of responses to the corresponding survey questions.)
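As a sketch of how such an annotated distribution graph could be reproduced (assuming numpy and matplotlib are available; the response data below are synthetic, not survey data):

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic ordinal responses on a 1-4 scale (not actual survey data)
rng = np.random.default_rng(seed=0)
responses = rng.integers(1, 5, size=64)

q1, median, q3 = np.percentile(responses, [25, 50, 75])

fig, ax = plt.subplots()
ax.hist(responses, bins=np.arange(0.75, 4.76, 0.5), edgecolor="black")
ax.axvspan(q1, q3, alpha=0.2, label="Interquartile range")   # IQR band
ax.axvline(median, color="red", label=f"Median = {median}")  # median marker
ax.set_xlabel("Response value (data range 1-4)")
ax.set_ylabel("Frequency")
ax.set_title(f"Response distribution (n = {len(responses)})")
ax.legend()
plt.show()
```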


Terminology and Notation: Mosaic Chart

(Annotated example chart: projects exhibiting a given level of relative capability (Lowest, Intermediate, Highest) form the columns; column width represents the proportion of projects with that level of capability, and each column shows the relative performance distribution of the sample. Annotations give the sample size and distribution for the associated survey responses (capability + performance), plus the measures of association and statistical test:)

  • Gamma: measures the strength of the relationship between two ordinal variables
  • p: the probability that an associative relationship would be observed by chance alone
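Goodman-Kruskal gamma is computed from concordant and discordant pairs of observations. A minimal sketch in plain Python (the two example vectors are hypothetical, and the report's actual computation may differ, e.g., in its significance test):

```python
from itertools import combinations

def goodman_kruskal_gamma(x, y):
    """Gamma = (C - D) / (C + D) over concordant (C) and discordant (D)
    pairs of two paired ordinal variables, ignoring ties."""
    concordant = discordant = 0
    for (x1, y1), (x2, y2) in combinations(zip(x, y), 2):
        s = (x1 - x2) * (y1 - y2)
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    if concordant + discordant == 0:
        return 0.0
    return (concordant - discordant) / (concordant + discordant)

# Hypothetical ordinal data: capability level vs. performance level (1-3)
capability  = [1, 1, 2, 2, 3, 3, 3]
performance = [1, 2, 1, 2, 2, 3, 3]
print(round(goodman_kruskal_gamma(capability, performance), 2))  # positive
```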


SE Capability: Product Architecture (ARCH)


SE Capability: Product Architecture (ARCH)

Survey Questions


SE Capability: Configuration Management (CM)


SE Capability: Configuration Management (CM)

Survey Questions


SE Capability: IPT-Related Capability (IPT)


SE Capability: IPT-Related Capability (IPT)

Survey Questions


SE Capability: Product Integration (PI)


SE Capability: Product Integration (PI)

Survey Question


SE Capability: Project Monitoring and Control (PMC)


SE Capability: Project Monitoring and Control (PMC)

Survey Questions (Part 1)


SE Capability: Project Monitoring and Control (PMC)

Survey Questions (Part 2)


SE Capability: Project Planning (PP)


SE Capability: Project Planning (PP)

Survey Questions (Part 1)


SE Capability: Project Planning (PP)

Survey Questions (Part 2)


SE Capability: Project Planning (PP)

Survey Questions (Part 3)


SE Capability: Requirements Development & Mgmt (REQ)


SE Capability: Requirements Development & Mgmt (REQ)

Survey Questions (Part 1)


SE Capability: Requirements Development & Mgmt (REQ)

Survey Questions (Part 2)


SE Capability: Risk Management (RSKM)


SE Capability: Risk Management (RSKM)

Survey Questions


SE Capability: Trade Studies (TRADE)


SE Capability: Trade Studies (TRADE)

Survey Questions


SE Capability: Technical Solution (TS)

Note: TS is a composite measure equivalent to ARCH + TRADE.


SE Capability: Technical Solution (TS)

Survey Questions (Part 1)


SE Capability: Technical Solution (TS)

Survey Questions (Part 2)


SE Capability: Validation (VAL)


SE Capability: Validation (VAL)

Survey Questions


SE Capability: Verification (VER)


SE Capability: Verification (VER)

Survey Questions (Part 1)


SE Capability: Verification (VER)

Survey Questions (Part 2)


SE Capability: Combined Reqts + Tech Solution (REQ+TS)

(This is a higher order measure; see base measures for distribution)


SE Capability: Total Systems Engineering Capability


Project Challenge (PC)

Project challenge factors:

  • Life cycle phases
  • Project characteristics (e.g., size, effort, duration, volatility)
  • Technical complexity
  • Teaming relationships


SE Capability: Reqts + Tech Solution with Project Challenge

Project challenge factors:

  • Life cycle phases

  • Project characteristics (e.g., size, effort, duration, volatility)

  • Technical complexity

  • Teaming relationships


Relating Project Performance to Project Challenge and SE Capability


Reqts + Tech Solution + Project Challenge and Performance

Project challenge factors:

  • Life cycle phases

  • Project characteristics (e.g., size, effort, duration, volatility)

  • Technical complexity

  • Teaming relationships


SE Effectiveness Methodology (In Detail)

SEEC Activities:

  • Identify industry members' focals (inputs: NDIA SED active roster, NDIA management input)
  • Contact focals, brief the survey process, solicit support
  • Provide Web access data to focals
  • Focal contact #1 to expedite response
  • Focal contact #2 to expedite response
  • Report* findings to NDIA and OSD

Company Focal Activities:

  • Solicit respondents and provide Web site access info
  • Identify respondents and report number to SEI
  • Respondent contact #1 to expedite response
  • Respondent contact #2 to expedite response
  • Report number of responses provided to SEI

Respondent Activities:

  • Complete questionnaire and submit to SEI
  • Report completion to focal

SEI Activities:

  • Collect responses and response rate data
  • Analyze data and report to SEEC


