
SIMULATION AS A SERVICE


Presentation Transcript


  1. UNCLASSIFIED. SIMULATION AS A SERVICE. Australian Defence Simulation And Training Centre. Mr Max Britton, Deputy Director System Development, Australian Defence Simulation And Training Centre.

  2. Think about the following • Persistent • Repeatable • Agile • And, • Defendable

  3. The Journey • In 1964 the RAN had a distributed multi-ship training system called ARGIS • This was an electronic analogue system • Trained Command Teams and Warfare Officers in the multi-ship and single-ship environment • Based on DDG, DE and Aircraft Carrier Ops rooms (HMAS Melbourne)

  4. The Journey Continued • This was the start of the simulation training path within the distributed world • Provided a common and consistent training outcome that was measurable and repeatable • Allowed Command Teams to train as they fought • Enabled tactics and doctrine to be explored

  5. The Dark Ages • Ship platforms changed and the training system did not replicate the new capabilities • The FFG class ship was introduced in the early 1980s, so a new training system was required • Ongoing need to train DDG class Command Teams

  6. The Next Generation • The IOTTF (Integrated Operation Team Trainer) was developed to support FFG and DDG CTT in the early 1990s • Again distributed, but within its own proprietary environment • This is about the time I started to take an interest in the simulation world • Adding to the complication, the ANZAC class FFH

  7. Next Generation Continued • ANZAC class CTT training system deployed at HMAS Watson in 1994 • No distributed environment or gateway • No ability to train in a three-ship collective environment • No common data sets or models • Training outcomes challenging

  8. Distributed Training seen as a priority again • Project SEA 1412 stood up to provide a multi-ship CTT training environment • Used DIS as the enabler for interaction between the IOTTF and ANZAC CSTT simulators • Included a digital voice system and simulated TDL • Back to the capabilities of the 1960s and 70s
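The DIS reference on slide 8 is worth unpacking: DIS (Distributed Interactive Simulation, IEEE 1278) lets dissimilar simulators interoperate by broadcasting fixed-format Protocol Data Units (PDUs), typically over UDP, with no central server. The Python sketch below is only a minimal illustration of that pattern, packing a simplified entity-state record and broadcasting it; the field layout, port number and values are assumptions for the example, not the full IEEE 1278 PDU definition.

```python
# Minimal sketch of a DIS-style entity state broadcast.
# The record layout here is deliberately simplified and illustrative,
# not the full IEEE 1278 Entity State PDU format.
import socket
import struct
import time

DIS_PORT = 3000                    # port commonly used in DIS exercises (assumption for this sketch)
BROADCAST_ADDR = "255.255.255.255"

def pack_entity_state(site, application, entity, x, y, z):
    """Pack a simplified entity-state record: a small header, an entity
    identifier (site/application/entity) and a geocentric position in metres."""
    protocol_version = 6                         # illustrative value
    exercise_id = 1
    pdu_type = 1                                 # 1 = Entity State in DIS
    timestamp = int(time.time()) & 0xFFFFFFFF    # simplified timestamp
    return struct.pack(
        ">BBBI HHH ddd",                         # big-endian ("network") byte order
        protocol_version, exercise_id, pdu_type, timestamp,
        site, application, entity,
        x, y, z,
    )

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    pdu = pack_entity_state(site=1, application=42, entity=7,
                            x=-4646000.0, y=2553000.0, z=-3534000.0)
    sock.sendto(pdu, (BROADCAST_ADDR, DIS_PORT))
```

Because every participant agrees on the byte layout ahead of time, any simulator on the network can decode the broadcast without knowing who sent it; that is the property that let SEA 1412 link the IOTTF and ANZAC CSTT simulators.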

  9. Coalition Interoperability • CREAMS program stood up between the USN and RAN in early 2000 • Coalition fleet training established using distributed systems • Used ISDN lines, with data covered (encrypted) by early bulk encryption units • Ad hoc development • Designed as event-based training

  10. The Problem Space • Data sets different between simulators • Geospatial data inconsistent • Models not consistent • Interoperability compromised • Training value diminished without human intervention • Fair Fight principles less certain

  11. Other ADF Capabilities • Army has significant simulation assets • Tools such as JSAF and ONESAF • User of VBS, Steel Beasts and vehicle trainers • Mature distributed LVC environment • BSCs situated at major bases • Exercises designed as event-based activities

  12. Other ADF Capabilities • Air Force has significant flight simulation trainers • Full mission and part-task training systems • Air control simulations • Less use of distributed simulation • Generally used to train individuals and not teams

  13. Other ADF Capabilities • DSTO is a significant user of distributed simulation for experimentation • LVC tools used as part of normal business • Numerous battle labs • JDSC in Canberra • Heavy support to the ADF in the training world • Highly skilled workforce

  14. The joint world • ADSTC formed, taking the ADSO, JCTC and J7 groups to form one organisation • Heavy user of distributed simulation • Major role is to support joint training outcomes • Coalition training • Supports single-service exercises • Highly skilled workforce

  15. The problem space • Data not reused • Data sets not common and consistent • Geospatial terrain not always available • Often not in the correct format • Expensive assets not fully exploited • Ad hoc development of simulation • Full of stovepipe solutions • Inconsistent use of standards

  16. The problem space continued • Limited distributed access to data • Buying or rebuilding the same data time and time again • Not using or exploiting the available resources • Reduced training outcomes • Taxpayer not getting best value for money

  17. What do we do to fix the problem? • Architecturally design the simulation environment • Defence Simulation Architecture being developed as part of the Defence Enterprise Architecture • Standards-based solution • Enterprise-level coordination of resources • Reuse of data

  18. Fixing the problem continued • Engagement of the stakeholder • Knowledge management • Data catalogues and repositories • Common and consistent objects and entities • Seamless distributed networks • Enterprise-level licensed applications available in the distributed environment
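One way to read "data catalogues and repositories" together with "common and consistent objects and entities" is a shared catalogue keyed by a standard entity enumeration, so every simulator resolves the same entity type to the same authoritative model and geospatial data instead of rebuilding it. The sketch below is a hypothetical illustration only: the EntityType fields follow the DIS/SISO kind-domain-country-category pattern, but the catalogue structure, entry names, numeric values and URIs are assumptions, not an existing ADSTC system.

```python
# Hypothetical sketch of an enterprise entity/data catalogue lookup.
# Keys follow a DIS/SISO-style enumeration; all entries, numeric values,
# names and URIs are illustrative assumptions only.
from dataclasses import dataclass

@dataclass(frozen=True)
class EntityType:
    kind: int       # enumeration fields (values below are illustrative)
    domain: int
    country: int
    category: int

@dataclass
class CatalogueEntry:
    name: str
    model_uri: str      # shared model, reused by all simulators
    terrain_datum: str  # agreed geospatial reference

CATALOGUE = {
    EntityType(1, 3, 13, 5): CatalogueEntry(
        name="ANZAC class FFH",
        model_uri="repo://models/surface/anzac_ffh.v2",   # illustrative URI
        terrain_datum="WGS-84",
    ),
}

def resolve(entity_type: EntityType) -> CatalogueEntry:
    """Return the single authoritative entry for this enumeration, so every
    federate loads the same data instead of rebuilding its own copy."""
    try:
        return CATALOGUE[entity_type]
    except KeyError:
        raise LookupError(f"No catalogue entry for {entity_type}") from None

if __name__ == "__main__":
    print(resolve(EntityType(1, 3, 13, 5)))
```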

  19. Fixing the problem continued • Upskilling our people through distributed e-learning • Coordinated reuse of data • Translation services
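"Translation services" in this context usually means gateways that convert data on the fly between simulators that disagree on formats, units or datums, so neither side has to change. As a hedged illustration, the sketch below converts a hypothetical legacy record holding geodetic latitude/longitude/altitude into WGS-84 geocentric (ECEF) metres, the convention distributed federations commonly exchange; the legacy record fields and the translate() wrapper are assumptions for the example, while the geodetic-to-ECEF formula itself is standard.

```python
# Hypothetical translation-service sketch: convert a legacy simulator's
# geodetic position (degrees, metres) to WGS-84 geocentric ECEF metres.
import math

WGS84_A = 6378137.0             # WGS-84 semi-major axis (m)
WGS84_E2 = 6.69437999014e-3     # WGS-84 first eccentricity squared

def geodetic_to_ecef(lat_deg: float, lon_deg: float, alt_m: float):
    """Standard geodetic-to-ECEF conversion on the WGS-84 ellipsoid."""
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    n = WGS84_A / math.sqrt(1.0 - WGS84_E2 * math.sin(lat) ** 2)
    x = (n + alt_m) * math.cos(lat) * math.cos(lon)
    y = (n + alt_m) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - WGS84_E2) + alt_m) * math.sin(lat)
    return x, y, z

def translate(legacy_record: dict) -> dict:
    """Map a hypothetical legacy record onto the shared convention."""
    x, y, z = geodetic_to_ecef(legacy_record["lat"],
                               legacy_record["lon"],
                               legacy_record["alt"])
    return {"entity_id": legacy_record["id"], "ecef_m": (x, y, z)}

if __name__ == "__main__":
    # Coordinates near HMAS Watson, Sydney (illustrative values)
    print(translate({"id": 7, "lat": -33.84, "lon": 151.28, "alt": 30.0}))
```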

  20. Getting to where we need to be • Simulation should be a seamless service • JP 3035 will provide the glue • ADSTC will coordinate the resources • DSA will continue to evolve over time to refine and gap analyse the capability

  21. Simulation as a service should be • Persistent • Repeatable • Agile • Seamless • And, • Defendable

  22. Questions?
