Implications of Complexity Research for Command and Control

M. I. Bell, FACT, 29 July 2009

Presentation Transcript

Disclaimers

  • Most of these ideas are not original; I will not acknowledge my sources

  • I am responsible for any errors; feel free to point them out

  • Complexity can be complicated, even complex

  • I get nothing from the admission charge; no refunds will be given


Beware of Humpty Dumpty

  • Care is required when using everyday words for specialized purposes

  • The community of interest needs clear, common definitions

  • The general public needs warnings to avoid confusion

“When I use a word,” Humpty Dumpty said, in rather a scornful tone, “it means just what I choose it to mean – neither more nor less.”

“The question is,” said Alice, “whether you can make words mean so many different things.”

“The question is,” said Humpty Dumpty, “which is to be master – that's all.”


Outline

  • Motivation

  • Some trivial questions (not answers)

  • Intuitive complexity

  • Quantifying complexity

  • Formal complexity: dynamic and architectural

  • Design and control of complex systems

  • Complexity and C2


Motivation

  • Complexity as a buzzword

    • “Six degrees of separation,” “butterfly effect,” etc. have entered popular culture

    • Dozens of university groups, programs, seminars, and projects

    • Pioneers (e.g., Santa Fe Institute) considering moving on

  • Complexity as a metaphor

    • 98 of 144 papers in the 14th ICCRTS contain the word “complexity”

  • Complexity as a mindset

    • Awareness of chaos, “fat tails,” “tipping points,” self-organization

  • Complexity as a toolbox

    • Fractal geometry, nonlinear dynamics, agent-based simulation

  • Complexity as a paradigm

    “accepted examples of actual scientific practice… [that] provide models from which spring particular coherent traditions of scientific research”

    – T. S. Kuhn, The Structure of Scientific Revolutions, 1962


What is Complexity?

  • Many entities, many interactions, collective behavior

  • Quality or quantity?

  • Definition or characteristics?

    • Emergence, self-organization, self-similarity, chaos, etc.

  • Computational complexity (of a problem)

    • Resources (typically time) required to obtain a solution

  • Algorithmic information content (of a string)

    • Length of the shortest program that will output the string

  • Structural complexity

    • Self-similarity, fractal geometry

  • Dynamic complexity

    • Chaos, sensitivity to initial conditions, phase transformations


Why are Things Complex?

  • By selection or by design

  • Selection

    • Natural or artificial (often not “survival of the fittest” but “the survivors are the fittest”)

    • Preferential growth (“the rich get richer”)

  • Design

    • Nonlinearity

    • Feedback control

    • Optimization


Why Do We Care?

  • Emergent behavior (self-organization)

  • Requisite variety (control)

  • Causality (prediction)

  • Stability/instability (cascading failure)

  • Unintended consequences


Intuitive Complexity

  • Disorganized complexity

    “a problem in which the number of variables is very large, and one in which each of the many variables has a behavior which is individually erratic, or perhaps totally unknown. However, …the system as a whole possesses certain orderly and analyzable average properties”

  • Organized complexity

    “problems which involve dealing simultaneously with a sizable number of factors which are interrelated into an organic whole”

    – W. Weaver, American Scientist (1948)


Complexity vs. Order

[Diagram: systems of simple entities lend themselves to statistical analysis; systems of organized/differentiated entities require systems analysis. Examples: physics (pressure, temperature, phase); economics (GDP, growth rate)]


Butterfly Effect

“Long range detailed weather prediction is therefore impossible, …the accuracy of this prediction is subject to the condition that the flight of a grasshopper in Montana may turn a storm aside from Philadelphia to New York!”

– W. S. Franklin (1898)
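The sensitivity Franklin describes can be demonstrated numerically with the logistic map, a standard chaotic system (the parameter r = 4.0 and the 1e-10 perturbation are textbook illustrative choices, not taken from this talk):

```python
# Two trajectories of the chaotic logistic map x -> r*x*(1 - x), r = 4.0,
# started a distance 1e-10 apart: the separation grows roughly
# exponentially until it saturates at the size of the attractor.
def logistic_trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-10)
for n in (0, 10, 30, 50):
    print(n, abs(a[n] - b[n]))   # separation at step n
```

After a few dozen steps the two trajectories are completely decorrelated, which is exactly the prediction-horizon effect discussed later in the talk.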


Argument for Quantification

“When you can measure what you are speaking about, and express it in numbers, you know something about it; but when you cannot measure it, when you cannot express it in numbers, your knowledge is of a meager and unsatisfactory kind…”

– William Thomson (Lord Kelvin), 1824–1907

If we can quantify complexity, we can

  • Determine whether one system is more or less complex than another

  • Determine whether a control (or C2) system is of the appropriate complexity for a given situation

  • Take appropriate steps to control complexity; e.g.,

    • Reduce the complexity of our environment

    • Increase the complexity of an adversary’s environment


Algorithmic Information Content

  • Length of the shortest possible description of a system (made formal using Turing machine concept)

  • Pros:

    • Consistent with the idea that a good theory simplifies the description of phenomena

  • Cons:

    • Complexity may seem to be a property of our understanding of a system, not of the system itself

    • The length of description may depend on the vocabulary available

    • Relative complexity of two systems depends on the details of the Turing machine used

    • It is impossible to show that a description is the shortest possible

    • Random systems are maximally complex (counter-intuitive)
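Since true algorithmic information content is uncomputable, compressed length is often used as a rough, computable proxy. A sketch (the string contents are arbitrary illustrations):

```python
# Compressed length as a heuristic stand-in for algorithmic information
# content (it is NOT AIC itself, which is uncomputable): a periodic
# string has a short description and compresses well; a pseudo-random
# string of the same length is nearly incompressible.
import random
import zlib

regular = b"ab" * 500                                       # 1000 ordered bytes
random.seed(0)
noisy = bytes(random.randrange(256) for _ in range(1000))   # 1000 random bytes

print(len(zlib.compress(regular)))   # far shorter than the input
print(len(zlib.compress(noisy)))     # about as long as the input
```

Note how this also exhibits the counter-intuitive "con" above: the random string scores as maximally complex.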


Computational Complexity

  • The number of operations (typically multiplications) needed to solve a problem

  • Pros:

    • A complex problem takes longer (or more resources) to solve than a simple one

    • The difficulty of a complex problem grows rapidly with its size n:

      • Problems that can be solved in time proportional to n^k are “polynomial time” problems

      • Problems that can be solved in time proportional to e^n or n! are “exponential time” problems

  • Cons:

    • There is no algorithm for determining how hard a problem is!
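The polynomial-vs-exponential distinction can be made concrete with a little arithmetic (n^3 and 2^n stand in for typical polynomial- and exponential-time costs):

```python
# Cost growth of a polynomial-time algorithm (n^3, e.g. naive matrix
# multiplication) vs an exponential-time one (2^n, e.g. exhaustive
# search over all subsets of n items).
for n in (10, 20, 30, 40):
    poly = n ** 3
    expo = 2 ** n
    print(n, poly, expo)   # the gap widens catastrophically with n
```

Doubling n multiplies the polynomial cost by 8 but squares the exponential one, which is why problem size, not just problem type, matters.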


Formal Complexity

  • Dynamic (process)

    • Corresponds roughly to computational complexity

    • Originated in non-linear dynamics

  • Architectural (structural)

    • Corresponds roughly to algorithmic information content

    • Originated in communication theory


Mandelbrot Set

B. Mandelbrot, ca. 1978

A complex number c is a member of the set if, starting with z_0 = 0, the sequence z_{n+1} = z_n^2 + c remains bounded
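The membership test can be sketched as an escape-time computation (the iteration cap of 100 is an arbitrary cutoff; true membership would require infinitely many steps):

```python
# Escape-time test for Mandelbrot membership: iterate z -> z^2 + c and
# check whether |z| stays bounded. Once |z| > 2 the orbit is guaranteed
# to escape to infinity, so 2 serves as the escape radius.
def in_mandelbrot(c, max_iter=100):
    z = 0
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return False   # c is not a member: z_{n+1} escapes
    return True            # bounded so far: treat c as a member

print(in_mandelbrot(0))    # True: the orbit stays at 0
print(in_mandelbrot(-1))   # True: period-2 orbit 0, -1, 0, -1, ...
print(in_mandelbrot(1))    # False: 0, 1, 2, 5, 26, ... escapes
```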


Escape Problems

  • Mandelbrot set

    • A complex number c is a member of the set if, starting with z_0 = 0, the sequence z_{n+1} = z_n^2 + c remains bounded

    • In other words, c is not a member if z_{n+1} escapes

  • Sinai billiard

    • Y. Sinai, ca. 1963; made into an escape problem by Bleher et al. (1988)


Sinai Billiard

[Figure: Sinai billiard trajectories from initial condition x_0, shown after ~10^5 and ~10^6 reflections]


Prediction Horizon

  • Discontinuity in boundary conditions (as well as non-linearity) can cause divergent trajectories

  • Similar initial conditions produce similar trajectories for a limited time


Differential Games

  • Modeling conflict in a dynamical system (e.g., pursuit-evasion)

    • Each player (two or more) has a state-dependent utility function that he seeks to maximize

    • Each player has a set of control variables that influence the state of the system

    • What are the best strategies?

    • What are the possible outcomes?

  • Example: homicidal chauffeur problem (R. Isaacs, 1951)

    • The “pedestrian” is slow but highly maneuverable

    • The “vehicle” is much faster but far less maneuverable

    • Under what initial conditions (if any) can the pedestrian avoid being run over indefinitely?

  • Some games (complex ones?) generate state-space structures with fractal geometry
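A toy discrete pursuit-evasion game illustrates the basic setup. Unlike Isaacs' homicidal chauffeur problem, both players here are fully maneuverable, so the faster pursuer always closes the gap; the speeds, capture condition, and starting positions are illustrative assumptions:

```python
# Toy pursuit-evasion: the pursuer (speed vp) heads straight at the
# evader; the evader (speed ve < vp) flees directly away. Capture
# occurs when the pursuer can cover the remaining gap in one move.
import math

def pursue(px, py, ex, ey, vp=2.0, ve=1.0, steps=100):
    for t in range(steps):
        dx, dy = ex - px, ey - py
        dist = math.hypot(dx, dy)
        if dist <= vp:
            return t                 # captured at step t
        px += vp * dx / dist         # pursuer's control: head at evader
        py += vp * dy / dist
        ex += ve * dx / dist         # evader's control: run directly away
        ey += ve * dy / dist
    return None                      # evader survived the horizon

print(pursue(0, 0, 50, 0))   # gap closes by vp - ve = 1 per step
```

In Isaacs' version the vehicle's limited turn radius is what makes the state-space geometry, and the optimal strategies, genuinely complex.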


Control Systems

[Diagram: open loop — a goal and a system model feed the controller, which drives the system; closed loop — a sensor measures the system output and feeds it back (+/−) for comparison with the goal at the controller input]
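A closed loop can be sketched as a proportional feedback controller driving a simple first-order system (the gain, goal, and system dynamics are all made-up illustrative values):

```python
# Minimal closed-loop control: a sensor reads the state, the error
# (goal - state) drives a proportional controller, and the system
# integrates the control input. No model of the system is needed.
def simulate(goal=10.0, gain=0.5, steps=40):
    state = 0.0
    for _ in range(steps):
        error = goal - state      # feedback: compare sensor with goal
        control = gain * error    # controller: proportional response
        state += control          # system: integrates the input
    return state

print(simulate())   # converges toward the goal of 10.0
```

An open-loop controller would instead have to compute its commands in advance from a model, and any model error would go uncorrected.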


Control Theory

  • Degrees of freedom (six for an aircraft)

    • (x,y,z) = coordinates of center of mass

    • (,,) = yaw, pitch, roll

  • Holonomicity

    • N degrees of freedom

    • Nc controllable degrees of freedom

    • System is

      • Holonomic if Nc = N

      • Non-holonomic if Nc < N

      • Redundant if Nc > N

  • Aircraft (N=6, Nc=3,4) and automobiles (N=3, Nc=2) are non-holonomic

  • No stable control settings are possible; not every path can be followed

  • Every possible path can be approximated


Requisite Variety and Stability

  • Requisite variety (Ashby, 1958)

    • To control a system with Nc controllable degrees of freedom the control system itself must have at least Nc degrees of freedom

  • Given requisite variety in the control system for a holonomic system, stability is possible

    • Lyapunov stability: paths that start near an equilibrium point xe stay near xe forever

    • Asymptotic stability: paths that start near xe converge to xe

    • Exponential stability: paths converge at least exponentially fast (rate set by the Lyapunov exponent)
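The three stability notions can be illustrated with the simplest possible discrete linear system, x_{n+1} = a·x_n, whose equilibrium is x_e = 0 (the coefficients below are arbitrary illustrative choices):

```python
# x_{n+1} = a * x_n: for |a| < 1 the equilibrium 0 is exponentially
# (hence asymptotically) stable; for |a| > 1 it is unstable and
# nearby paths diverge.
def evolve(x0, a, steps=50):
    x = x0
    for _ in range(steps):
        x = a * x
    return x

print(abs(evolve(1.0, 0.9)))   # decays toward the equilibrium
print(abs(evolve(1.0, 1.1)))   # diverges from it
```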


Internet 2001

[Figure: map of the Internet, 2001 — a scale-free network]


Scale-Free Network

  • k = degree (number of connections)

  • Power-law degree distribution (γ = −1.94)

  • Produced by preferential growth and attachment

[Figure: World-Wide Web degree distribution, random vs. scale-free]


Failure and Attack Tolerance

  • Diameter (max. distance between nodes) vs. fraction of nodes deleted

  • Failure = random node deleted; attack = high-degree node deleted

  • E = random network, SF = scale-free network

– A. Barabási, et al. (2000)
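The preferential-attachment mechanism ("the rich get richer") can be sketched as follows; this is a one-edge-per-new-node variant in the spirit of the Barabási–Albert model, not a reproduction of the cited study:

```python
# Preferential attachment: each new node links to an existing node
# chosen with probability proportional to its degree. The `targets`
# list holds one entry per edge endpoint, so a uniform choice from it
# is automatically degree-weighted. Hubs emerge; most nodes stay leaves.
import random

random.seed(1)
targets = [0, 1]          # start from a single edge between nodes 0 and 1
degree = {0: 1, 1: 1}
for new in range(2, 2000):
    old = random.choice(targets)   # degree-proportional attachment
    degree[new] = 1
    degree[old] += 1
    targets += [new, old]

print(max(degree.values()))        # the largest hub
print(sorted(degree.values())[-5:])
```

The resulting degree distribution is heavy-tailed: a handful of hubs, which is precisely why such networks are robust to random failure but fragile to targeted attack.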


Fat Tails


Cellular Automata

Game of Life

– J. Conway (1970)
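The Game of Life rules are compact enough to state in a few lines of Python; the "blinker" used below is a standard period-2 oscillator:

```python
# One Game of Life step on a set of live cells: a live cell survives
# with 2 or 3 live neighbors; a dead cell is born with exactly 3.
from collections import Counter

def step(live):
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

blinker = {(0, -1), (0, 0), (0, 1)}       # vertical bar of three cells
print(step(blinker))                      # becomes a horizontal bar
print(step(step(blinker)) == blinker)     # period 2: True
```

Everything discussed on the next slides, gliders, guns, and the rest, emerges from nothing more than this update rule.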


Emergence

  • Emergent objects belong to a higher level of representation than individual cells or their behavior rules

  • Levels (Game of Life):

    • Cells and rules

    • Objects (blinkers, gliders, blocks, beehives, etc.)

    • Interactions of objects (attraction/repulsion, annihilation, etc.)

    • Architectures of objects (guns, puffers, rakes, etc.)

  • Multiscale Representation (Y. Bar-Yam): each level of representation has its own:

    • Scale: number of entities or components

    • Variety: number of possible actions or states

  • Fundamental questions

    • How is behavior at each level determined?

    • Can constraints or behaviors at higher levels influence lower ones?

    • Is there “downward causation”?

    • Can we design for desired behaviors?


Gosper’s “Glider Gun”


Design and Control

  • Systems can become complex either because or in spite of design rules

  • Simplicity is generally a goal, but it competes with other goals: efficiency, robustness, versatility, etc.

  • Systems generally evolve toward greater complexity, not less


Functional Decomposition

  • Traditional engineering practice

  • Hierarchical structure

  • Independent modules

  • System/subsystem or system (family) of systems


Commonality


Reuse


Big Ball of Mud

“A BIG BALL OF MUD is haphazardly structured, sprawling, sloppy, duct-tape and bailing wire, spaghetti code jungle… These systems show unmistakable signs of unregulated growth, and repeated, expedient repair.”

“…a complex system may be an accurate reflection of our immature understanding of a complex problem. The class of systems that we can build at all may be larger than the class of systems we can build elegantly, at least at first.”

– B. Foote and J. Yoder, in Pattern Languages of Program Design 4 (2000)


Highly Optimized Tolerance (HOT)

“Our focus is on systems which are optimized, either through natural selection or engineering design, to provide robust performance despite uncertain environments. We suggest that power laws in these systems are due to tradeoffs between yield, cost of resources, and tolerance to risks. These tradeoffs lead to highly optimized designs that allow for occasional large events.”

“The characteristic features of HOT systems include: (1) high efficiency, performance, and robustness to designed-for uncertainties; (2) hypersensitivity to design flaws and unanticipated perturbations; (3) nongeneric, specialized, structured configurations; and (4) power laws.”

– J. M. Carlson and J. Doyle, Physical Review (1999)


Complexity and C2

  • Complex systems analysis is not (yet) a revolutionary new paradigm

  • We can use the complexity mindset and toolbox to re-visit and re-assess C2 problems

    • Speed of command and the OODA loop

    • Complex endeavors

    • The DIME/PMESII construct

    • Wicked problems

    • The C2 Approach Space

    • Optimization

    • Rare events

    • Emergence and causality


Speed of Command/Control

[Diagram: aircraft route from A to B, with a diversion to C]

  • Control: “Correct for cross winds”

  • Command: “Fly from A to B”

  • Command: “Divert to C”


OODA Loop vs. Control Loop

  • Traditionally: command is human, control technological

  • Modern control theory describes highly complex behaviors

  • Potential for application to command problems


Complex Endeavors

  • Complex endeavors have one or more of the following characteristics:

    • The number and diversity of participants is such that:

      • There are multiple interdependent “chains of command”

      • The objective functions of the participants conflict with one another or their components have significantly different weights

      • The participants’ perceptions of the situation differ in important ways

    • The effects space spans multiple domains and there is

      • A lack of understanding of networked cause and effect relationships

      • An inability to predict effects that are likely to arise from alternative courses of action

        – D. Alberts and R. Hayes, Planning: Complex Endeavors (2007)

  • Interpretation as differential games

    • Utility functions of coalitions (Uc = utility function of the coalition, Ui = utility function of member i )

    • Tight coalition: Uc is a fixed function of the individual Ui

    • Loose coalition: Uc is a function of the individual Ui that depends on the state of the system, allowing gain/loss of commitment, subversion, defection, etc.


DIME/PMESII Formalism

  • State variables: Political, Military, Economic, Social, Information, Infrastructure

  • Control variables (interventions): Diplomatic, Information, Military, Economic

  • Questions:

    • Does DIME have requisite variety to control PMESII?

    • What happens when the game is two-sided? many-sided?


Competition

  • Recent study (AT&L/N81) indicates that available models do not capture essential features:

    • The process by which the PMESII state generates DIME interventions

    • The adversary response and resulting feedback loops

[Diagram: two opposing sides, each with PMESII state variables and DIME interventions, coupled by feedback loops]


The “Invisible Hand”

  • Adam Smith: market forces provide closed-loop control of the economy

  • Modern economists: are you kidding?

  • No reason to assume:

    • Requisite variety in control variables

    • Stable solutions or attractors in state space

  • Application of game theory:

    • “Rational actor” assumption limits choices of utility functions

    • Limited ability to deal with coalitions

  • Similar issues in other PMESII variables


Wicked Problems

  • There is no definitive formulation of a wicked problem

  • Wicked problems have no stopping rule

  • Solutions to wicked problems are not true-or-false, but good-or-bad

  • There is no immediate and no ultimate test of a solution to a wicked problem

  • Every solution to a wicked problem is a "one-shot operation"; because there is no opportunity to learn by trial-and-error, every attempt counts significantly

  • Wicked problems do not have an enumerable (or an exhaustively describable) set of potential solutions, nor is there a well-described set of permissible operations that may be incorporated into the plan

  • Every wicked problem is essentially unique

  • Every wicked problem can be considered to be a symptom of another problem

  • The existence of a discrepancy representing a wicked problem can be explained in numerous ways. The choice of explanation determines the nature of the problem's resolution

  • The planner has no right to be wrong

    – H. Rittel and M. Webber, Policy Sciences (1973)


No Evolution


No Design


Complexity


Wicked, Complex, or Ill-Posed

“In reality the problems are not so much ‘wicked’ as complex.”

– E. Smith and M. Clemente, 14th ICCRTS (2009)

  • “Wicked” problems are best described as differential games

    • Multiple participants compete to maximize their individual utility functions

    • Most social policy problems (when described as games) probably are complex, but formal analysis is just starting in biology and economics

    • The Rittel-Webber description reflects a misguided attempt by the “planner” to define a single utility function (i.e., create a single, tight coalition)

    • “Wickedness” is not a property of the system but of how we have defined the problem


C2 Approach Space

  • Three dimensions (D. Alberts and R. Hayes, 2007):

    • Patterns of interaction

    • Distribution of information

    • Distribution of decision rights

  • Incident response model (M. Bell, 14th ICCRTS)

    • Assumptions (decentralized C2)

      • Decision rights: widely distributed

      • Information: widely distributed

      • Interaction: highly limited

    • Results (agent-based simulation)

      • Effective “edge” organizations do not have to be near the high end of all three dimensions

      • Self-organization can occur with very simple behavior rules

      • Self-organization can be counter-productive

      • Iterative refinement of the rule set is needed to exclude bad cases


Optimization

  • Optimization of large, non-linear systems is almost always computationally hard (exponential time)

  • Heuristic approaches will sometimes give good approximate solutions

  • Robustness is an issue

    • Demonstrating stability (to small perturbations) may be computationally hard

    • Complex systems often have “brittle” optima

    • The probability of large perturbations may be greatly increased by non-linear dynamics

    • Extreme optimization (HOT) alters the distribution of properties or behaviors (fat tails)


Rare Events

  • Not as rare as we might expect

    • Scale-free (self-similar) structures yield power-law distributions

    • Probabilities can be many orders of magnitude greater than predicted by the normal distribution

  • Distributions may not be stable (linear combinations of independent events do not have the same distribution as the events)

  • Joint probabilities may not be products of individual event probabilities

  • Increased probability of rare event sequences (cascading failures)
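The gap between normal and power-law tail probabilities can be made concrete; the Pareto parameters below are illustrative, not fitted to any data from this talk:

```python
# Tail probability P(X > k) under a standard normal model vs a Pareto
# (power-law) model: the power-law tail is many orders of magnitude
# heavier, so "rare" events are not nearly as rare as Gaussian
# intuition suggests.
import math

def normal_tail(k):
    # P(Z > k) for a standard normal, via the complementary error function
    return 0.5 * math.erfc(k / math.sqrt(2))

def pareto_tail(k, alpha=2.0):
    # P(X > k) for a Pareto distribution with x_min = 1, exponent alpha
    return k ** (-alpha) if k >= 1 else 1.0

for k in (3, 5, 10):
    print(k, normal_tail(k), pareto_tail(k))
```

At k = 10 the normal model puts the probability below 1e-23 while the power law still gives one event in a hundred: the difference between "never" and "routine."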


Causality

  • Complexity research deals with causal (deterministic) systems

  • Opposite of causal is random (not complex)

  • Complexity can:

    • Make it difficult to discover causal relationships

    • Limit prediction


Unintended Consequences

  • When we say that an outcome (or a side-effect) is “unintended,” do we merely mean that it is unanticipated?

  • If we could anticipate (predict) such an outcome or effect, would it necessarily become intended?

  • Does ethical or legal responsibility follow?

  • Can blame be assigned without evidence of predictability?


Conclusions

  • Complexity research has deep roots in several traditional scientific disciplines

  • It has advanced the state of the art in these fields and promoted cross-pollination among them

  • It has been a major enabler in the development of new sub-disciplines (e.g., social network analysis, non-linear dynamics)

  • It has not (yet) yielded a revolutionary new paradigm for scientific research

  • It offers significant potential benefits in C2 research

    • The mindset and toolbox can be exploited to advance OR and C2 research methodology

    • Discoveries in other disciplines can be translated into useful insights or partial solutions to C2 problems

  • It does not invalidate any previous work or challenge the goals of C2 research


Questions or Comments?