
Software Aspects of Strategic Defense Systems



  1. Software Aspects of Strategic Defense Systems Team Turkey Joe Kim Senthil Smitha

  2. President Reagan’s SDI In March 1983, President Reagan called for a “Strategic Defense Initiative” (SDI). “I call upon the scientific community … to give us the means of rendering these nuclear weapons impotent and obsolete.” The SDI program came to be popularly called “Star Wars.”

  3. Parnas’ Background • Doesn’t object to weapons development in general • 8 years of experience working on military aircraft • 20 years in the software engineering field • June 28, 1985 – Parnas resigns from the $1,000/day panel

  4. Reasons for resignation • Software is unreliable • Unattainable goal due to SDI properties • Software techniques inadequate • SE improvements will be insufficient • AI and Automatic Programming won’t help • Problems with Proofs • Research is inefficient and ineffective

  5. Software is Unreliable • Software is often delivered with “bugs” • Problems persist for several versions and sometimes worsen with upgrades • Digital hardware also has a huge # of states, but it is built from many identical subsystems that can be tested exhaustively; the complete system cannot be • The # of possible states in software is far too high for exhaustive testing (see the sketch below) • The functions describing software behavior are not continuous, so the mathematics of continuous functions cannot be used to verify it • The logical expressions that describe a program are often harder to understand than the program itself • Most programmers don’t know the tools of the trade
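A back-of-the-envelope illustration of the state-explosion point (all figures below are assumed for illustration, not taken from Parnas's paper):

```python
# Even a program with only 64 bits of relevant state cannot be tested
# exhaustively; hardware built from many identical cells can at least be
# tested cell by cell, while software has no such regular structure.

bits_of_state = 64                 # tiny by battle-management standards
states = 2 ** bits_of_state        # distinct states to cover
tests_per_second = 1_000_000       # optimistic: one test per microsecond

seconds = states / tests_per_second
years = seconds / (3600 * 24 * 365)
print(f"{states:.3e} states -> roughly {years:.1e} years of exhaustive testing")
```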

  6. Can’t Trust that SDI Character • Targets and decoys have unknown characteristics (the system must identify them, track them, and direct weapons toward them) • Fatal errors will occur if the software is developed without knowledge of those characteristics, or if an attacker can change them on the day of battle • An attacker’s countermeasures make the network of sensors and weapons unreliable • Fail-soft operation succeeds only when failures can be predicted from past history, component failures are unlikely and statistically independent, the system has excess capacity, and real-time deadlines can be missed • None of these hold for the SDI system • Impossible to test under real conditions • No confidence without extensive tests

  7. Most massive, costly software ever attempted • Service period too short for humans to debug and modify programs • A debugger’s notes found in an army truck in Vietnam show how software gets patched in the field – not possible in a 30–90 minute war • Real-time computation deadlines – the worst-case amount of resources needed can’t be predicted • Efficiency and predictability require some pre-runtime scheduling, which needs a worst-case real-time schedule (see the sketch below) • A large variety of sensors and weapons, each requiring complex software; the suite will grow during development and after deployment (and is subject to independent modification) • Difficulties increase with the size of the system, the # of independent subsystems, and the # of interfaces
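A minimal sketch of a pre-runtime schedulability check, here the classic rate-monotonic utilization bound (Liu and Layland, 1973). The task set and its worst-case execution times are hypothetical, and the analysis only means anything if those worst cases can actually be bounded, which is exactly what the slide doubts for SDI:

```python
# Pre-runtime schedulability sketch: compare total processor utilization
# against the rate-monotonic bound. All numbers are assumed.

tasks = [  # (worst-case execution time ms, period ms)
    (2.0, 10.0),    # e.g. track update
    (5.0, 40.0),    # e.g. discrimination
    (10.0, 100.0),  # e.g. weapon assignment
]

n = len(tasks)
utilization = sum(c / t for c, t in tasks)
bound = n * (2 ** (1 / n) - 1)    # sufficient (not necessary) condition

print(f"U = {utilization:.3f}, RM bound = {bound:.3f}")
print("schedulable" if utilization <= bound else "needs exact response-time analysis")
```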

  8. One Shot at the Title • Flow-chart approach – “think like a computer,” improved by taking larger steps • Leads to confusion because the same data has different meanings under different circumstances • Concurrency – the program appears to be doing more than one thing at a time • Multiprocessing – the program DOES more than one thing at a time (contrast sketched below) • Yes, professional programmers really do use this conventional approach • Trial and error – software is released when the rate of finding new errors slows down
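A minimal sketch of the concurrency/multiprocessing distinction drawn on this slide (the names and workload are invented). Every extra interleaving of concurrent steps is another behavior a tester would have to cover:

```python
import multiprocessing
import threading

def busy_work(label: str) -> None:
    total = sum(i * i for i in range(100_000))  # stand-in for real computation
    print(label, total)

if __name__ == "__main__":
    # Concurrency: threads interleave their steps; they only appear simultaneous.
    threads = [threading.Thread(target=busy_work, args=(f"thread-{i}",))
               for i in range(2)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    # Multiprocessing: separate processes really can run at the same time.
    procs = [multiprocessing.Process(target=busy_work, args=(f"process-{i}",))
             for i in range(2)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```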

  9. New SE Techniques • Research aimed at reducing the amount of information needed to test and maintain a program • Structured programming and formal program semantics • Use of formally specified abstract interfaces (information hiding; see the sketch below) • Use of cooperating sequential processes • A gap remains between theory and practice • Good software engineering can be done; it’s just far from easy • It reduces, NOT ELIMINATES, errors, so there is still a need for testing
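A minimal sketch of information hiding behind a formally specified abstract interface. The TrackStore interface and its operations are hypothetical; the point is only that clients depend on the interface, so the hidden data representation can change without touching client code:

```python
from abc import ABC, abstractmethod

Position = tuple[float, float, float]

class TrackStore(ABC):
    """Abstract interface: the only assumptions clients may make."""

    @abstractmethod
    def add(self, track_id: int, position: Position) -> None: ...

    @abstractmethod
    def position(self, track_id: int) -> Position: ...

class DictTrackStore(TrackStore):
    """One hidden implementation; its 'secret' is the dict representation."""

    def __init__(self) -> None:
        self._tracks: dict[int, Position] = {}

    def add(self, track_id: int, position: Position) -> None:
        self._tracks[track_id] = position

    def position(self, track_id: int) -> Position:
        return self._tracks[track_id]

def client(store: TrackStore) -> None:
    # Written against the interface only; unaffected by representation changes.
    store.add(42, (1.0, 2.0, 3.0))
    print(store.position(42))

client(DictTrackStore())
```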

  10. Improvements in SE • New languages and environments will help, but their absence is not a major impediment to our work • AI makes big claims but can offer no real help • Automatic programming is just a euphemism for programming in a higher-level language • You still need to specify an algorithm • No breakthroughs are in sight • The fault lies not in our tools but in ourselves and in the nature of our product

  11. Artificial Flowers and Intelligence • AI-1 – solving problems that previously could be solved only by human intelligence • This definition changes over time • The best work in this area makes no attempt to mimic human problem-solving techniques • It is mostly problem-specific; transferring it requires abstraction and creativity • AI-2 – heuristic or rule-based programming / expert systems (a toy sketch follows) • This approach is dangerous and misleading • The rules obtained are inconsistent, incomplete, and inaccurate • The evolutionary approach results in poorly understood behavior that is hard to predict • Spectacular behavior on a small # of obvious cases
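A toy rule-based classifier in the spirit of AI-2; every rule, threshold, and attribute here is invented. It behaves well on the obvious cases it was written for and falls straight through on anything its author never anticipated:

```python
def classify(obj: dict) -> str:
    # Hand-written, incomplete rules; their behavior off the obvious cases
    # is hard to predict, which is the slide's point.
    if obj.get("radar_cross_section", 0.0) > 1.0:
        return "warhead"     # a big radar return "must" be a warhead
    if obj.get("deceleration", 0.0) > 0.5:
        return "decoy"       # light decoys slow down quickly on reentry
    return "unknown"         # everything the rule author never considered

print(classify({"radar_cross_section": 2.5}))                          # obvious: warhead
print(classify({"radar_cross_section": 0.3, "deceleration": 0.9}))     # obvious: decoy
print(classify({"radar_cross_section": 0.9, "deceleration": 0.2}))     # anti-simulation case: unknown
```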

  12. Prove it • Can’t use exhaustive case analysis • No prolonged, realistic testing is possible • So use mathematical analysis? • We don’t have exact specifications against which to apply a proof (see the sketch below) • The proofs themselves may contain errors • Concurrency adds difficulty to proofs • There are no techniques for proving programs robust enough to operate despite unknown hardware failures or input errors
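A sketch of what a proof does and does not give you; the function and its specification are hypothetical. Even a flawless proof of the stated pre/postcondition says nothing about a garbled sensor reading that violates the precondition or a processor that fails mid-computation:

```python
# Hoare-style claim being "proved":
#   { t_start <= t_end }  midpoint_time(t_start, t_end)  { t_start <= result <= t_end }

def midpoint_time(t_start: float, t_end: float) -> float:
    assert t_start <= t_end              # precondition (checked here, proved on paper)
    result = t_start + (t_end - t_start) / 2.0
    assert t_start <= result <= t_end    # postcondition
    return result

print(midpoint_time(0.0, 90.0 * 60.0))   # midpoint of a 90-minute engagement window
```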

  13. Getting what you’ve paid for • Those who make purchasing decisions don’t know what they’re buying • The most difficult and crucial step in research is to identify and define the problem • Practical considerations restrict work on the important theoretical problems • Research should be judged by teams of successful researchers and experienced system engineers • But such people are considered too valuable to spend time reviewing proposals

  14. Some other perspectives on SDI A debate on the feasibility of SDI was sponsored by CPSR and MIT in 1985. The debate was moderated by Michael L. Dertouzos PhD '64 of MIT. Parnas and Joseph Weizenbaum of MIT argued against SDI; Charles L. Seitz '65 of Caltech and Danny Cohen of the University of Southern California (USC) spoke for SDI. Parnas presented his argument based on the papers he had submitted to the SDIO at the time of his resignation from the panel.

  15. Parnas’ Argument Since: • Specifications not known in advance, • Realistic testing is not possible, • No chance to fix software during use, • No foreseeable technology changes this, Therefore – It is not possible to construct SDI software that you could trust to work.

  16. Seitz’ Argument Since • A hierarchical architecture seems best (it is more natural, is used in nature, is understood by the military, and allows abstraction up the levels …) • Physical organization should follow logical organization (the simplest, most natural choice) • The tradeoffs that make the software problem tractable lie in the choice of system architecture, not in new or radical software methods

  17. Seitz’ Argument • Loose coordination lets us infer system performance from component performance (assuming statistical independence, …), which allows system reliability estimates (see the sketch below). Therefore it is possible to create reliable SDI battle management software.
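A rough sketch of the reliability arithmetic this argument relies on; all numbers are assumed. Under statistical independence the component reliabilities simply multiply, but Parnas's objection is that an attacker's countermeasures act as a common cause that breaks the independence assumption:

```python
component_reliability = 0.999
n_components = 100

# Independence lets the per-component figures multiply into a system estimate.
independent_estimate = component_reliability ** n_components
print(f"estimate assuming independence: {independent_estimate:.3f}")   # ~0.905

# A single common-cause event (say, a countermeasure blinding every sensor)
# caps the real reliability no matter how good each component is.
p_common_cause = 0.05
with_common_cause = (1 - p_common_cause) * independent_estimate
print(f"with a 5% common-cause failure: {with_common_cause:.3f}")      # ~0.860
```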

  18. From the debate…… • Parnas says “We can’t test it” • Seitz then replies “We can build it.” • Cohen mentions the space shuttle as an example of a system requiring large and complex software. • Parnas’ response is that whereas NASA can delay a launch up until the last second, the president cannot call up the USSR to delay a nuclear war.

  19. From the debate…… • Seitz argues that SDI will be much better than the existing ABM systems. In essence he says something useful could be built, but he doesn’t really address the issue of testing it. • Parnas argues that it doesn’t make any difference what is built or how it is built, because there won’t be any means of testing that it meets its requirements. • While people for SDI keep coming up with arguments to support it, they fail to provide answers to the specific issues raised by Parnas.

  20. Patriot Missile performance in the Gulf war • The Patriot system uses a 7.4-ft-long missile powered by a single-stage solid-propellant rocket motor that flies at Mach 3 • The missile weighs 2,200 lbs and its range is 43 miles (a rough timeline calculation follows) • The Patriot is armed with a 200-lb high-explosive warhead detonated by a proximity fuse, spraying shrapnel to destroy the intended target • The system is built around radar and fast computers
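Rough engagement-timeline arithmetic from the figures on this slide. Converting Mach 3 at the sea-level speed of sound (about 343 m/s) is an assumption, since the speed of sound varies with altitude:

```python
speed_of_sound_m_s = 343.0
missile_speed_m_s = 3 * speed_of_sound_m_s     # ~1,029 m/s
max_range_m = 43 * 1609.34                     # 43 miles in metres

flight_time_s = max_range_m / missile_speed_m_s
print(f"~{flight_time_s:.0f} s to fly the full {max_range_m / 1000:.0f} km range")
# Roughly a minute: the radar and software get one brief chance per engagement.
```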

  21. Operation • The missile is launched and guided to the target in three phases (see the sketch below): • First, as the incoming missile flies into the Patriot’s radar beam, the guidance system turns the Patriot toward it • Then the Patriot’s computer guides the missile toward the incoming Scud • Finally, the Patriot missile’s internal radar receiver guides it to the interception of the incoming missile • During the Gulf war the Patriot was assigned to shoot down incoming Iraqi Scud (Al-Hussein) missiles launched at Israel and Saudi Arabia
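The three guidance phases listed above, written out as a simple sequence; the names and structure are illustrative, not the actual Patriot software:

```python
from enum import Enum, auto

class GuidancePhase(Enum):
    TURN_INTO_BEAM = auto()     # guidance system turns the Patriot toward the target
    COMMAND_GUIDANCE = auto()   # the Patriot's computer steers it toward the Scud
    TERMINAL_HOMING = auto()    # the missile's own radar receiver takes over

def engagement() -> None:
    for phase in GuidancePhase:   # Enum members iterate in definition order
        print(f"entering {phase.name}")
    print("proximity fuse detonates near the target")

engagement()
```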

  22. Statistical analysis of the Patriot’s performance during the Gulf war • The U.S. Army, which was in charge of the Patriot, claimed an initial success rate of 80% in Saudi Arabia and 50% in Israel • Those claims were later scaled back to 70% and 40% respectively • Part of the reason the success rate was 30% higher in Saudi Arabia than in Israel is that in Saudi Arabia the Patriots merely had to push the incoming Scud away from military targets in the desert or disable its warhead • In Israel the Scuds were aimed directly at cities and civilian populations (larger targets)

  23. …Analysis continued • The Patriot’s success rate in Israel was examined by the Israel Defense Forces (IDF) • The IDF counted any Scud that exploded on the ground (regardless of whether or not it had been diverted) as a failure of the Patriot • A 10-month investigation by the House Government Operations Subcommittee on Legislation and National Security concluded that there was little evidence to prove that the Patriot hit more than a few Scuds

  24. Patriot missile software problem • As reported by the U.S. General Accounting Office, on 02/25/1991 a Patriot battery failed to track and intercept a Scud missile because of a software problem in the system’s weapons control computer; the Scud subsequently hit an Army barracks, killing 28 Americans • The problem led to an inaccurate tracking calculation that became worse the longer the system operated (a rough reconstruction follows) • The Patriot had never before been used to defend against Scud missiles, nor was it expected to operate continuously for long periods of time
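A rough reconstruction of the arithmetic behind that failure, using the figures commonly cited from the GAO report (GAO/IMTEC-92-26): system time was counted in tenths of a second, 1/10 has no exact binary representation, and the fixed-point value used lost about 0.000000095 s per tick. Treat the numbers as the published estimates rather than output of the actual Patriot code:

```python
error_per_tick_s = 9.5e-8        # chopping error in the stored value of 0.1 s
hours_up = 100                   # the battery had been running ~100 hours straight
ticks = hours_up * 3600 * 10     # one clock tick every tenth of a second

clock_error_s = error_per_tick_s * ticks            # ~0.34 s of accumulated drift
scud_speed_m_s = 1676.0                             # approximate Scud closing speed
tracking_error_m = clock_error_s * scud_speed_m_s   # ~570 m: outside the range gate

print(f"clock drift after {hours_up} h: {clock_error_s:.2f} s")
print(f"tracking error: ~{tracking_error_m:.0f} m")
```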

  25. A look at current missile defense scenario • Some dreams never die, do they? • SDI, as envisioned by President Reagan, continues to live • The concept of missile defense remains the same, but the bounds of the dream keep changing • This can be attributed to changes in the sophistication and the geographical location of the hypothesized enemy

  26. A look at current missile defense scenario • In the early 1990s SDI was reincarnated, this time under a new name, “BMD” • BMDO, unlike SDIO, has a string of projects with relatively smaller goals • The projects under BMDO can be classified broadly into these categories: Terminal Defense Segment, Midcourse Defense Segment, and Boost Defense Segment

  27. A look at current missile defense scenario

  28. A look at current missile defense scenario • Further classification of these categories: • Terminal Defense Segment – THAAD, NTMD, Patriot PAC-3, etc. • Midcourse Defense Segment – NMD/GMD, SMD, etc. • Boost Defense Segment – Airborne Laser, Space-Based Laser, etc.

  29. A look at current missile defense scenario • Well, how is “BMD” doing? • An estimated $100 billion has been spent on missile defense • The goals of each subsystem are small compared to SDI, given the current scenarios • “This is a sharp change from the Reagan years, perhaps because the technology used is closer at hand and the threats are smaller.” (Mosher, page 39, IEEE Spectrum, 1997)

  30. A look at current missile defense scenario • Smaller anticipated mission: “protect the U.S. … against an attack by a rogue state using a handful of warheads outfitted with … simple countermeasures.” “also provide protection against an accidental launch of a few warheads by Russia or China.” “… no more than 100 hit-to-kill interceptors based at old ABM site near Grand Forks, ND.” (Mosher, page 37, IEEE Spectrum, 1997)

  31. A look at current missile defense scenario • How do these smaller anticipated missions affect Parnas’s argument that SDI won’t be able to produce trustworthy missile defense software? • Fundamentally they do not, as you can see from the test facts below: • “In the last 15 years, the U.S. has conducted 20 hit-to-kill intercepts, …. Six intercepts were successful; 13 of those tests were done in the last five years, and among them three succeeded.”

  32. Test Facts …… • “No real attempts have been made to intercept uncooperative targets – those that make use of clutter, decoys, maneuver, anti-simulation, and other countermeasures.” (Mosher, page 39, IEEE Spectrum, 1997) • In 1996, ex-TRW engineer Nira Schwartz filed a False Claims Act suit, alleging that TRW falsified the results of tests to distinguish warheads from decoys (featured on “60 Minutes II” in January 2001)

  33. Test Facts …… • Lt. General Kadish – “Right now, from what I see, there is no reason to believe that we can’t make this work. But there’s a lot more testing to be done.” • Secretary of Defense Donald Rumsfeld said, “We are going to deploy a minimal Missile Defense System, in the near future even if the system has not been tested completely.”

  34. Conclusions • Trustworthy SDI software appears unattainable • The arguments of those supporting SDI don’t answer the issues raised by Parnas • The newer missile defense scenarios do not change Parnas’s argument fundamentally • Systems built for a limited mission do seem more tractable than SDI

  35. References: • http://www.cse.nd.edu/~kwb/nsf-ufe/star-wars/ • Broad, W.J., "Scientist at work: Philip E. Coyle III; words of caution on missile defense", New York Times, January 16, 2001. • DOD Ballistic Missile Defense Organization (BMDO). Web site http://www.acq.osd.mil/bmdo/ • http://www.clw.org/nmd/bmdfuzzylogic.html
