
Systems Thinking



  1. Systems Thinking

  2. “Systems thinking is thinking in loops rather than in straight lines.” (O’Connor & McDermott, 1997)

  3. Lines: A causes B causes C. Loops: A causes B causes C, and C feeds back to A.

  4. Systems have emergent properties that cannot be observed by examining the parts of the system in isolation. • Speed, efficiency, fairness, effectiveness, etc.

  5. Feedback Loop • The return of information about the status of a process. • It is the output of a process re-entering the process as an input that influences the next step. • This is a cause-and-effect relationship in which the cause is in the past and the effect is in the present.

  6. Balancing Feedback Loop • As a process changes, the return of information that opposes the original change. • This acts to stabilize a system toward its goal. • These loops provide system equilibrium and limit the change generated by reinforcing loops. • Also known as negative feedback loops.

  7. Balancing Feedback Loop (diagram)

  8. Balancing Feedback Loop Infectious Disease Example • A virus so virulent and fast-acting that it kills off its host population before transmission can occur limits its own spread; this balancing effect reduces the widespread transmission of the most potent viruses.

  9. Balancing Feedback Loop Healthcare Insurance Example • Co-payments and deductibles may be viewed as a balancing feedback loop because they reduce the unrestricted use of free services, which then lowers the rate of increase in service capacity and system costs.
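
The goal-seeking behavior described in slides 6 to 9 can be made concrete with a few lines of arithmetic. The sketch below is not from the original presentation; it assumes a stock, a goal, and an adjustment fraction (all illustrative names and numbers) and shows how feeding the gap back into the process drives the level toward the goal. A reinforcing loop would instead amplify the deviation.

```python
# Minimal sketch of a balancing (negative) feedback loop: each step,
# part of the gap between the goal and the current level is corrected,
# so the level converges toward the goal. All names and numbers here
# are illustrative assumptions, not values from the presentation.

goal = 100.0               # desired level of the stock
level = 20.0               # current level
adjustment_fraction = 0.3  # how strongly the loop reacts to the gap

for step in range(15):
    gap = goal - level                   # output of the process fed back in...
    level += adjustment_fraction * gap   # ...opposes the deviation from the goal
    print(f"step {step:2d}: level = {level:6.2f}, gap = {gap:6.2f}")
```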

  10. Causal Loop Diagram • A diagram that captures how variables in a system are connected by cause-and-effect linkages. • These diagrams show all feedback loops. • Drawn with variables, arrows, and plus or minus signs that mark the polarity of each link.

  11. Causal Loop Diagram Infectious Disease Example • A diagram that shows how behaviors and environmental factors are connected to produce an epidemic.

  12. Causal Loop Diagram Healthcare Insurance Example • A diagram that shows the relationships between individual behavior and institutional actions that drive the cost of health care or the number of uninsured.

  13. Causal Loop Diagram (diagram)
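
One way to make the variables-arrows-signs notation concrete is to store each arrow with its polarity and classify a closed loop by multiplying the signs around it: an even number of minus signs gives a reinforcing loop, an odd number a balancing loop. The sketch below is not from the presentation; the infectious-disease variable names are hypothetical and purely for illustration.

```python
# Sketch of a causal loop diagram as data: each arrow is
# (cause, effect) -> polarity. Multiplying the polarities around a
# closed loop classifies it: +1 => reinforcing, -1 => balancing.
# The variables below are hypothetical, chosen only for illustration.

arrows = {
    ("infected", "new_infections"): +1,    # more infected -> more transmission
    ("new_infections", "infected"): +1,    # new infections add to the infected pool
    ("infected", "susceptible"): -1,       # infections deplete the susceptible pool
    ("susceptible", "new_infections"): +1, # fewer susceptibles -> fewer new infections
}

def loop_polarity(loop):
    """Multiply the signs along a closed chain of variables."""
    sign = 1
    for cause, effect in zip(loop, loop[1:] + loop[:1]):
        sign *= arrows[(cause, effect)]
    return "reinforcing" if sign > 0 else "balancing"

print(loop_polarity(["infected", "new_infections"]))                 # reinforcing
print(loop_polarity(["infected", "susceptible", "new_infections"]))  # balancing
```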

  14. Complexity • Having many different connecting parts (detail complexity): variables • Having a great number of possible connections between parts (dynamic complexity): arrows • Having multiple, simultaneous feedback loops, where small variations may make a large difference (inherent complexity): loops

  15. Complexity Infectious Disease Example • An epidemic with large numbers of infected individuals may exhibit detail complexity. Interactions of the pathogen within the host will exhibit dynamic complexity.

  16. Complexity Healthcare Insurance Example • The number of uninsured individuals exhibits detail complexity, but the interaction of policies, patient incentives, 50 state jurisdictions, federal government programs, many 3rd party payers, and professional providers exhibits dynamic complexity.

  17. Stock • A quantity that accumulates or dwindles over time. • Examples include savings in a bank account, water in a tub, population in a country. • Can be visualized as a pool or reservoir of the quantity.

  18. Stock Infectious Disease Example • The number of infected individuals at any given time. This quantity will change as more people are infected and as people either recover or die. (ex: prevalence)

  19. Stock Healthcare Insurance Example • The number of uninsured at any given time. This quantity will change as more people take jobs that have health insurance and as people lose their insurance or die.

  20. Flow • The amount of change something undergoes over time. • Examples include the amount of interest earned in a savings account over a year, the amount of water flowing into and out of a bathtub in an hour, or the number of births and deaths in a population over a year. • Flows lead to changes in a stock (level). • Also known as rates.

  21. Flow Infectious Disease Example • The rate at which new cases of infection arise in a population (incidence or attack rate)

  22. Flow Healthcare Insurance Example • The rate at which individuals gain or lose insurance coverage.

  23. Stock and Flow (diagram)
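
The bathtub example on the Stock and Flow slides reduces to a single bookkeeping rule: the stock at the next time step equals the stock now plus inflow minus outflow. A minimal sketch with made-up numbers follows; the same arithmetic applies to the number of uninsured, with people gaining and losing coverage as the flows.

```python
# Sketch of the stock-and-flow relationship from the bathtub example:
# a stock changes only through its flows,
#   stock(t + 1) = stock(t) + inflow - outflow.
# The numbers below are illustrative assumptions, not data.

stock = 50.0    # litres of water already in the tub
inflow = 8.0    # litres per minute from the tap
outflow = 5.0   # litres per minute down the drain

for minute in range(1, 11):
    stock += inflow - outflow   # the flows accumulate into the stock
    print(f"minute {minute:2d}: {stock:5.1f} litres in the tub")
```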

  24. Gap • The difference between a desired level (goal) and the actual level. It is used to trigger feedback.

  25. Gap Infectious Disease Example • The gap between a viral reproduction rate (R0) of 1.0 and the actual rate, together with a short generation time, indicates the speed of an epidemic. Closing the gap on R0 is one way to measure success in controlling the spread of an epidemic.
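
A rough way to see why the gap around R0 = 1.0 and the generation time govern epidemic speed: in the early, roughly exponential phase of an outbreak each generation multiplies the case count by about R0, so after n generations there are roughly I0 × R0^n cases. The sketch below uses illustrative numbers only and is not from the presentation.

```python
# Rough sketch of how the gap between R0 and 1.0, combined with the
# generation time, sets the speed of an epidemic during its early,
# roughly exponential phase. All numbers are illustrative assumptions.

initial_cases = 10
generation_time_days = 4   # days between successive generations of cases

for r0 in (0.9, 1.1, 2.0):
    generations = 30 // generation_time_days   # generations in ~30 days
    cases = initial_cases * r0 ** generations
    print(f"R0 = {r0}: roughly {cases:,.0f} cases after 30 days")
```

With R0 below 1.0 the case count shrinks toward zero, which is why closing the gap below 1.0 is the control goal.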

  26. Gap Healthcare Insurance Example • The gap between the deductible and the actual claims to date may trigger avoidance behavior for minor ailments.

  27. Goal • A desired state.

  28. Goal Infectious Disease Example • A state such that a pathogen is no longer being transmitted.

  29. Goal Healthcare Insurance Example • A state such that everyone who wants health insurance is covered.

  30. Leverage Point • An area within a complex system where a small shift in one thing can produce big changes in everything. • “Leverage points are points of power.” (Meadows, 1999)

  31. Leverage Point Infectious Disease Example • A leverage point might be to require exposed populations to wash their hands frequently. A simple behavioral change like this might have a substantial impact on the transmission of a pathogen. 

  32. Leverage Point Healthcare Insurance Example • A leverage point might be changing a deductible from $100 to $200. A minor change like this may have huge impacts on the overall cost to the 3rd party payer.
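
The deductible example can be sketched numerically. The figures below are entirely hypothetical; the point is only that when most claims are small, moving the deductible from $100 to $200 shifts a disproportionate share of claims below the threshold and, as the earlier balancing-loop slide suggests, may deter minor visits altogether, which is where much of the leverage lies.

```python
# Entirely hypothetical sketch of the deductible leverage point.
# The claim amounts are made up. The payer covers only the part of each
# claim above the deductible, and, as a crude behavioural assumption,
# visits cheaper than the deductible are skipped altogether.

claims = [60, 90, 120, 150, 180, 250, 400, 900, 2500]  # hypothetical claims ($)

def payer_cost(deductible):
    """Mechanical effect: the payer pays only the excess over the deductible."""
    return sum(max(0, c - deductible) for c in claims)

def visits_filed(deductible):
    """Behavioural effect: minor visits below the deductible are avoided."""
    return [c for c in claims if c > deductible]

for d in (100, 200):
    print(f"deductible ${d}: payer pays ${payer_cost(d)}, "
          f"{len(visits_filed(d))} of {len(claims)} visits filed")
```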

  33. Mental Models • The ideas and belief systems that we use to guide behavior. They are used to explain cause and effect and to give meaning to experience.

  34. Mental Models Infectious Disease Example • Many people have mental models that are not consistent with a scientific understanding of a disease. • Gay men get HIV. I am a middle-aged heterosexual woman; I won’t get HIV.

  35. Mental Models Healthcare Insurance Example • Economists have mental models about economic behavior that lead to such concepts as moral hazard and adverse selection. • If I’m not paying for a hospital bill out of pocket, it doesn’t really cost me anything.

  36. Model • A schematic description of a system, theory, or phenomenon that accounts for its known or inferred properties and may be used for further study of its properties.

  37. Model Infectious Disease Example • Prevalence models that separate the host population into, for example, susceptible, latent, infectious and immune individuals.
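
The compartmental model described on slide 37 can be sketched as a small discrete-time simulation. The version below splits the population into susceptible, latent, infectious, and immune groups, as the slide suggests; the parameter values and time step are illustrative assumptions, not estimates for any real pathogen.

```python
# Minimal discrete-time sketch of a prevalence model with susceptible,
# latent (exposed), infectious and immune compartments.
# Parameter values are illustrative assumptions, not fitted estimates.

N = 10_000                     # total population
S, E, I, R = N - 10, 0, 10, 0  # start with 10 infectious individuals

beta = 0.30    # transmission rate per day
sigma = 0.20   # rate of leaving the latent stage (1 / incubation period)
gamma = 0.10   # recovery rate (1 / infectious period)

for day in range(1, 151):
    new_exposed = beta * S * I / N   # flow: susceptible -> latent
    new_infectious = sigma * E       # flow: latent -> infectious
    new_immune = gamma * I           # flow: infectious -> immune
    S -= new_exposed
    E += new_exposed - new_infectious
    I += new_infectious - new_immune
    R += new_immune
    if day % 30 == 0:
        print(f"day {day:3d}: S={S:7.0f}  E={E:6.0f}  I={I:6.0f}  R={R:7.0f}")
```

The infectious compartment is the stock whose level is the prevalence; the new_exposed term is the flow corresponding to incidence.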

  38. Model Healthcare Insurance Example • An economic theory of behavior in risky situations that identifies three types of behavior: risk-averse, risk-seeking, and risk-neutral.

  39. 10 Useful Ideas on Systems Thinking Richard Wilkinson, 4/30/2001

  40. 1. Everything is connected to everything else. • Real life is lived in a complex world system where all the subsystems overlap and affect each other. The common mistake is to deal with one subsystem in isolation, as if it didn't connect with anything else. This almost always backfires as other subsystems respond in unanticipated ways.

  41. 2. You can never do just one thing. • This follows from the preceding idea. In addition to the immediate effect of an action, there will always be other consequences of it that ripple through the system. Every action has unintended consequences.

  42. 3. Different people in the same structure will produce similar results. • “Who has the most influence on the performance of an ocean liner when it is out at sea en route to its destination?” • Answer: The designer of the ship. • Don't try to control the players, just change the rules. If the system tries to make choices for people, the people will try to outwit the system. It is much more effective to change the rules of the game so that it is to most people's advantage to make choices that are good for the whole system.

  43. 4. A collection of things is a system if any one element can affect the performance of the whole. • For example, business is part of a larger system and is most constructively understood as such. • Business decisions affect the economy, environment, community, and industry, as well as the mental health and well-being of employees and their families, and the wealth of investors.

  44. 5. From "either/or" to "both/and". • We often err when we think in mutually exclusive opposites. We consider our next steps as being either along the path of solution x or solution y. • Breakthroughs come when we consider the possibilities of blending both x and y. • Considering both the whole and its parts, bridging in some lively way what appear to be opposites, forces us to consider situations from multiple perspectives.

  45. 6. There is no "away" to throw things to. • “People should not have to pay for health care. We should get rid of health care fees.” • “We should forbid the use of all toxic chemicals.” • “Vaccines may cause harm. They should be banned.”

  46. 7. The easiest way out is the fastest way back in. • A common blunder is to grab for a solution prematurely without appreciating the underlying root causes driving a situation. • A systems thinking sequence to reach a deeper understanding is to first consider the event, then to peel back a layer to see if it is part of an underlying pattern. In other words, has this happened before? Peel another layer by asking why this pattern is occurring. • Continue asking "And, why is that?" until the root cause emerges. [This is the practice of asking the "5 whys".]

  47. 8. Profound changes can take place in ways we cannot foretell. • A small force or event can have a disproportionate effect.

  48. 9. The map is not the territory. • Useful as they are, no model, theory, or tool can capture the full complexity of the subject it addresses.

  49. 10. An answer is a question's way of asking a new question. • And, there are no final answers.
