
Software Engineering in the DMO: Where to from here?




Presentation Transcript


  1. Software Engineering in the DMO: Where to from here? Matt Ashford A/Director of Software Engineering Electronic Systems Division (ESD)

  2. Opening Assertions • Defence systems are becoming more connected and increasingly software intensive, and software is the critical element • Engineering is fundamental to DMO’s core business, and is essential to successfully acquire, develop and sustain complex Defence systems In my opinion… • The DMO can support its software engineers and improve its software engineering capability through: • Policy, Procedure & Guidance • Training • Knowledge management & collaboration • Peer networks and communication • Career structures • Functional experts (Grey Beards), mentoring, coaching • Systems & Software Engineering related research & development Across SPOs, Branches, Divisions and Groups.

  3. Directorate of Software Engineering (DSwE) • Mission: • To sustain and improve ESD Software Engineering (SwE) capability and performance • Objectives: • Improve ESD SwE capability via: • ESD Policy, Procedure and Guidance • Knowledge sharing and collaboration • SwE related research & development • Improve ESD SwE performance via: • Direct SwE support to ESD projects and SPOs • Training • Networks and communication • Represent ESD SwE: • ESD Continuous Improvement Group • Across DMO Divisions • DMO Materiel Engineering Council (MEC) • ADF Technical Regulatory Agencies • External groups

  4. Proposed Work Plan (08/09) • Policy, Procedure & Guidance • State-of-the-Practice Survey • Software Configuration Management • Training • DAU Software-Intensive Systems Acquisition Management (SiSAM) Training • Peer Networks & Knowledge Sharing • Systems & Software Assurance • Collaboration & Research • Systems and Software Engineering Research Capability

  5. State-of-the-Practice Survey • Need to understand current state-of-the-practice of software engineering in ESD, e.g.: • How much and what type of software does ESD acquire & sustain? • Who is doing what software related activities (SPOs, People)? • How well are the current software Policies, Procedures and Guidance being implemented? • Software lists, PSM etc • What are the biggest issues for our software practitioners? • Survey will: • Collect data • Identify improvement opportunities • Drive future DSwE initiatives • Inform future research agenda

  6. Software Configuration Management • SCM identified as an immediate, systemic issue • Especially for software-intensive support environments • Broad, high-level CM Policy available • Lack of detailed implementation guidance and tool support • Multiple software applications • Different versions across multiple deployment environments • Mixture of COTS and bespoke • SCM task to work with the most affected SPOs to: • Review current Defence & DMO Policy, Procedure & Guidance • Identify SPO SCM needs • Develop implementation-level plans & procedures • Develop & roll out ESD SCM Policy, Procedure &/or Guidance • SCM stakeholder group / special interest group? • Contact Matt Ashford if you want to be involved
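The drift problem described on this slide (different versions of the same applications across multiple deployment environments) can be sketched as a simple baseline comparison. This is an illustrative sketch only: the environment, application names and version strings are hypothetical, not drawn from any DMO system.

```python
# Minimal sketch: compare a recorded software baseline against what is
# actually deployed in one environment, reporting any drift.
# All names and versions below are hypothetical examples.

def baseline_drift(reference, deployed):
    """Return {app: (expected, actual)} for every mismatched or missing app."""
    drift = {}
    for app, expected in reference.items():
        actual = deployed.get(app)  # None if the app is absent entirely
        if actual != expected:
            drift[app] = (expected, actual)
    return drift

# Hypothetical approved baseline vs. one deployed environment
reference = {"mission-planner": "2.4.1", "comms-stack": "1.9.0", "map-data": "2008.2"}
deployed = {"mission-planner": "2.4.1", "comms-stack": "1.8.3"}

print(baseline_drift(reference, deployed))
# comms-stack is behind the baseline, and map-data is missing altogether
```

Even a record this simple only helps if each SPO actually maintains the reference baseline per environment, which is the kind of implementation-level procedure the SCM task would need to define.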

  7. Software Training • Most DSWAR initiated training now discontinued • Australian Software Professional Development Program (ASPDP) • ISO 12207 • Practical System and Software Measurement • SiSAM • DMO Institute • Plethora of Project Management and Logistics training • Very little software related training • DAU Software-Intensive Systems Acquisition Management (SiSAM) training • Planned for Dec 08 • Training Needs Analysis required to develop comprehensive curriculum

  8. Peer Networks & Knowledge Sharing • Few (if any?) technically focused DMO peer networks • Are you aware of the “DMO Forum”? • DMO Intranet Home → Tools and Resources → DMO Forum • Very few collaborative knowledge sharing environments (e.g. Wiki) …and even fewer opportunities to interact • Rely on external Centres of Excellence • E.g. DGTA SCI • Sometimes peer-level support would suffice • We have a lot of smart people • A lot of problems have probably already been solved! • Pockets of excellence already exist in DMO • Just need to find them (when you need them)

  9. System and Software Assurance • A definition • The level of confidence or trust that a system is free from vulnerabilities, whether intentionally designed into the system or accidentally inserted at any time during its lifecycle, and that it functions in the intended manner (nothing more, nothing less) even in the face of malicious activity. • Exploitable vulnerabilities are an increasing concern: • Outsourcing and Globalisation • Prevalence of COTS, Open Source Software (OSS) and other Software of Unknown Pedigree (SOUP) • 2 types of problems • Unintentional defects or vulnerabilities • Majority of the safety space • Intentional, malicious attack or insertion (small minority) • Includes the malicious use of pre-existing (innocent) vulnerabilities

  10. System and Software Assurance • The Problem: • Current systems and software engineering and security accreditation processes and techniques may not be adequate to assure the security of modern software-intensive Defence systems • (Perceived) lack of policy, procedure and guidance to address SA over the full system lifecycle • (Perceived) immaturity of SA related tools, technology and methodologies • (Perceived) lack of awareness of potential threats and mitigation strategies (especially with respect to systems operating within the deployed environment) • The multiple stakeholders (including authorities) with poorly defined boundaries (especially with respect to systems operating within the deployed environment) • Defence is not able to confidently state or defend the level of System Assurance (with limited exception) in products, systems or capability currently in acquisition, sustainment or in-service within the deployed environment. • Defence is not able to have a sufficient level of confidence in procured equipment, systems and capabilities with respect to hosting undocumented features.

  11. System and Software Assurance • Technical Regulatory Authorities concerned with item’s fitness for service, safety and compliance with regulations for environmental protection. • Lots of focus on safety: relatively mature discipline • System Assurance less mature • Awareness of potential threats, vulnerabilities and mitigation strategies is poor • Lack of coherent policy and regulatory environment • Fixed Vs Deployed • White Vs Black • Immature methodologies • How often does DMO require Vulnerability Assessments or Secure Coding Practices? • Collaborating with US and UK

  12. Is this different to System Safety? • Similar methodologies / processes • Different perspectives • Safety and System Assurance Cases both contain • Claims about a product or service (e.g. no exploitable buffer overflows) • Arguments (e.g. use static tool analysis) • Evidence (e.g. test reports) • Both reduce uncertainty, but not to zero • Can be expensive, but best done from the beginning, retrospective fixes are very expensive • Everyone happy to share best practice with safety • Do not always want to publicly share best practice for system and software assurance • Opportunity to leverage (or extend) extant safety practices
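The shared claim/argument/evidence structure described above can be sketched as a tiny data model. This is an illustrative sketch only, not a DMO tool or template; the claim, argument and evidence strings are hypothetical examples of the kind the slide mentions.

```python
# Minimal sketch of the claim/argument/evidence structure common to
# safety cases and system assurance cases. Example strings are hypothetical.
from dataclasses import dataclass, field

@dataclass
class AssuranceClaim:
    claim: str      # e.g. a statement such as "no exploitable buffer overflows"
    argument: str   # e.g. "static analysis applied to all C sources"
    evidence: list = field(default_factory=list)  # e.g. tool or test reports

    def is_supported(self) -> bool:
        # A claim is only as strong as the evidence behind its argument.
        # Note this reduces uncertainty; it does not eliminate it.
        return bool(self.evidence)

case = AssuranceClaim(
    claim="No exploitable buffer overflows in the comms stack",
    argument="Static analysis applied to all C source files",
)
print(case.is_supported())   # False: argument stated, but no evidence yet
case.evidence.append("static-analysis-report-aug-08.pdf")
print(case.is_supported())   # True once evidence is attached
```

Structuring both safety and assurance cases the same way is what makes the "leverage extant safety practices" opportunity on this slide plausible: only the content of the claims differs, not the shape of the case.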

  13. Rapid Prototyping, Development & Evaluation (RPDE) • RPDE is: • a collaborative venture between Defence and industry • An RPDE Task is: • A method to formally engage industry to investigate a Defence problem • Around 12–18 months duration • RPDE Steering Gate Process • RPDE Operating Model • www.rpde.org.au

  14. RPDE Task 24 – System Assurance • DMO has sponsored an RPDE Task to investigate the SA problem • CIOG now co-sponsor • Timeline: 08 May 08 – 12 Aug 08 • Question: How can Defence ensure that deployable systems and assets have an appropriate level of understood and quantifiable confidence that the systems are assured and remain so, and thus will continue to perform in the manner expected even in the face of malice? • Current status: • Contact Matt Ashford for more info or if you want to get involved

  15. Systems & Software Collaboration / Research • [Diagram: collaboration links between Government bodies – MoD DE&S D-SET, DMO HENG, DoD AT&L DSSE – connected via SISAIG]

  16. US/UK/AUS Software-Intensive Systems Acquisition Improvement Group (SISAIG) • Established circa 2003 • Facilitates information sharing and collaboration across US/UK/AUS Governments • Leverage each Nation’s initiatives, R&D, etc • Aus Sponsor: Ms Shireane McKinnie • Aus Lead: Mr David Marshall • Current Work Streams: • Systems Assurance • Systems of Systems (SoS) / Network-Centric Warfare (NCW) • Software Estimation & Earned Value Management (EVM)

  17. Systems & Software Collaboration / Research • [Diagram: as slide 15 – MoD DE&S D-SET, DMO HENG, DoD AT&L DSSE, SISAIG – with Software Research links added]

  18. DMO Systems and Software Engineering Independent Advisory Panel • Established by DMO Head Engineering (Shireane McKinnie) to: • Provide strategic guidance on future trends and challenges for the DMO • Independent expert advice on emerging Systems & Software Engineering issues • Identification of key developmental / emerging technologies • Facilitate technology transfer to the DMO • E.g. NICTA, ACCS, CSSE, SEI, UK SSEI etc • Linkages to academic and research programs • 5 “Core” Members + 2 “Consultative” members

  19. DMO SESW Independent Advisory Panel – Core Members • Prof. Geoff Dromey – Foundation Professor of Software Engineering, Griffith University; Director, Software Quality Institute; ARC Centre for Complex Systems • Dr Clive Boughton – Senior Lecturer, Australian National University • Prof. Ross Jeffery – Professor of Software Engineering, UNSW; Program Leader, Empirical Software Engineering Research Program, National ICT Australia (NICTA) • Prof. Peter Lindsay – Boeing Professor of Systems Engineering, Uni of Queensland; ARC Centre for Complex Systems • Prof. Stephen Cook – Director, Defence & Systems Institute; Director of Centre of Expertise in Systems Integration, University of South Australia

  20. DMO SE and SW Independent Advisory Panel – Consultative Members • Prof. Barry Boehm – TRW Professor of Software Engineering, University of Southern California (USC); Director, USC Center for Systems and Software Engineering • Prof. John McDermid – Professor of Software Engineering, University of York; Leader of the High Integrity Systems Engineering Group (HISE); Member of the UK MoD Defence Scientific Advisory Council (DSAC)

  21. SEI Australia Initiative • Initiative to establish an SEI research facility in Australia • Co-funded by SA Govt and CoA (through SADI) • Outcome of failed negotiations was a recommendation to explore alternate strategies to establish indigenous capability

  22. SSE Research Proposal • Premise: • There is a need for greater sponsorship and coordination of Systems & Software Engineering research to improve the acquisition, development and sustainment of Defence systems. • The SEI Australia initiative & SESW Panel show there is a willingness to support such a concept • Consultative workshop conducted 14/15 Aug 08 in Canberra • To elicit information to help develop a proposal to establish a cost-effective, indigenous Defence-related systems and software engineering research capability. • Key questions: • What are Defence’s research interests? • What research capability is already available (Domestic / International)? • What models / contract mechanisms can we use? • What are their strengths & weaknesses? • What commercial aspects do we need to address?

  23. Workshop Attendees

  24. Outcomes of Workshop • Summarise key outputs of workshop and way ahead.

  25. Questions? Matt Ashford A/Director Software Engineering Electronic Systems Division (ESD) R3-3-121, Russell Offices, Canberra ACT 2600 Ph: (02) 6266 7054 matt.ashford@defence.gov.au Don’t forget the survey!
