
Some systemic concepts


Presentation Transcript


  1. Some systemic concepts From http://artsci-ccwin.concordia.ca/edtech/ETEC606/systems.html

  2. System Holism Example • A system as a whole works differently than the parts of the system. The parts alone cannot do what the system can. • Examples: A motor without a car; a film with no story; a body without a heart; an individual alone versus the same individual in a mass (forum incident); an individual in a collaborative versus a competitive environment.

  3. System Holism Principle • Principle: Therefore, it is necessary for a system to have functional parts that communicate efficiently. • Related concepts: Differentiation: each part has an expert function. Synergy: the whole is greater than the sum of its parts (2 + 2 = 5).

  4. Complementarity Law • Differing perspectives on the same system are neither 100% independent nor 100% compatible; yet together they reveal more truths about the system than either could alone.

  5. Complementarity Law, example • Example: The three blind men and the elephant: the first felt the trunk, the second felt the legs, and the third felt the ears. Separately, each of them built a different tale based on his own partial experience. Yet when they put the three tales together, they built a better and truer picture.

  6. Ubiquity and unification principle • Certain mechanisms and laws hold for many different kinds of systems studied in biology, psychology, or engineering.

  7. Communication and Information 1 • Communication occurs when patterns in one system influence patterns in another, later in time and usually at some distance. The patterns may be information (= that which reduces a receiver's uncertainty), and the information may be meaningful (= that which contributes, positively or negatively, to the ability and/or desire of a system to function).
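
A minimal Python sketch (not from the original slides) of "information reduces a receiver's uncertainty": uncertainty is measured as Shannon entropy in bits, before and after a message narrows the set of possibilities. The message counts and probabilities are hypothetical.

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# The receiver starts out uncertain about which of 8 equally likely messages was sent.
before = entropy([1/8] * 8)    # 3.0 bits of uncertainty

# A received signal rules out all but 2 of the messages: uncertainty drops.
after = entropy([1/2, 1/2])    # 1.0 bit of uncertainty

print(f"information received = {before - after:.1f} bits")  # 2.0 bits
```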

  8. Communication and Information 2 • The influence may be one-way (e.g. reading an old book), or it may be reciprocal, as in a live conversation. Actually, over geological time a whole series of levels of modulation, communication, information, and meaning (and the entities which exchange them) have emerged through Darwinian, and more recently Lamarckian, evolution.

  9. Entropy (2nd Law of Thermodynamics) • Definition: Entropy is a measure of the unavailable energy relative to the total amount of energy in any closed system over a finite period of time. Any open system is closed at some level in the hierarchy of existence. Entropy always increases at a macro level; on a micro level, entropy can (temporarily) be decreased. It is also a very important concept in Shannon's information theory.

  10. Entropy (Clarification) • The total amount of energy is constant in a closed system, but the amount which is available decreases as various mechanisms make it unavailable. • Examples: A car (or your body), as a closed system, can be kept "alive" longer through "maintenance" (increasing order). However, over time it still tends towards chaos, that is, the loss of available energy: although the car is still there physically, it becomes a useless pile of rust. In a life span, the tremendous growth of a baby continuously slows down.

  11. Entropy (Implications) • Therefore, to counteract the loss of available energy/mass/work (also called exergy, essergy, or hekergy), a system needs to be open to its environment to promote survival, growth, learning, etc. For example, by teaching a student a learning strategy ("guidance"), you increase order locally, and the student can survive better in the school; that is, his/her capability increases.

  12. Redundancy of Information Theorem • The Redundancy of Information Theorem is also referred to as Shannon's Second Theorem. According to the theorem, to maximize the efficiency of transmission over a noisy channel, it is necessary to repeat the message a sufficient number of times to ensure reliable reception, but not so many times that the rate of information transmission becomes unacceptably low.

  13. Redundancy (Clarification) • The theorem refers to the necessity of giving the same information in many different ways to increase the possibility of correct transmission, even though the increase in channel capacity is necessary and more costly. Implication for instruction: In the long run, it is more efficient to repeat instructions, although the initial cost in time and money might seem high.

  14. Redundancy (Example) • For example, to help students learn a new concept, it is more effective to use both graphical and textual explanations in many different ways. Although using the dual mode may seem redundant, it is more effective in getting ideas across and in facilitating students' construction of knowledge. Another example of the good use of redundancy is the well-known strategy for giving effective speeches: tell them what you are going to say, say it, and finish by summarizing what you have said.
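
The trade-off stated in the theorem can be illustrated with a simple repetition code. This is a hypothetical Python sketch rather than anything from the original slides: repeating each bit over a noisy channel and decoding by majority vote lowers the error rate, but also lowers the rate of information transmission.

```python
import random

def send_bit(bit, flip_prob=0.1):
    """A noisy channel that flips the bit with probability flip_prob."""
    return bit ^ 1 if random.random() < flip_prob else bit

def send_redundant(bit, repeats, flip_prob=0.1):
    """Send the same bit several times and decode by majority vote."""
    received = [send_bit(bit, flip_prob) for _ in range(repeats)]
    return 1 if sum(received) > repeats / 2 else 0

random.seed(0)
trials = 10_000
for repeats in (1, 3, 5, 9):
    errors = sum(send_redundant(1, repeats) != 1 for _ in range(trials))
    throughput = 1 / repeats   # useful bits per transmitted bit
    print(f"repeats={repeats}: error rate={errors/trials:.3f}, throughput={throughput:.2f}")
```

More repetition makes reception more reliable while the effective transmission rate falls, which is exactly the balance the theorem asks for.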

  15. Communication system • The signal-to-noise ratio is an important concept in Shannon and Weaver's information theory. According to Shannon and Weaver, a communication system consists of five essential elements: an information source, an encoder or transmitter, a channel, a detector, and a decoder. The transmission of information depends, to a large extent, on the signal-to-noise ratio.

  16. Signal to noise ratio principle • Signal refers to the wanted information; noise is whatever prevents it from being received. If the noise is louder than the wanted signal, there is a bad signal-to-noise ratio; if the signal is much louder than the noise, there is a good signal-to-noise ratio. The signal-to-noise ratio is usually measured in decibels.
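
Since the ratio is expressed in decibels, a small Python sketch (with hypothetical power values, not from the original slides) shows the usual formula, 10·log10 of signal power over noise power:

```python
import math

def snr_db(signal_power, noise_power):
    """Signal-to-noise ratio in decibels: 10 * log10(P_signal / P_noise)."""
    return 10 * math.log10(signal_power / noise_power)

print(snr_db(100.0, 1.0))   # 20.0 dB: signal much louder than noise (good ratio)
print(snr_db(1.0, 10.0))    # -10.0 dB: noise louder than signal (bad ratio)
```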

  17. Emergent Cybersystemic Levels of Communication and Control • Over thousands of millennia, viruses, living organisms, people, and societies have evolved as SELF-PRODUCING systems. To carry out this self-production and reproduction they need protected blocks of information (genes, memes, memeplexes) to tell them what to do and when - that is, to control their repair, mating, reproduction, and survival. These special identity-sustaining and conjugating messages are stored chemically, electrically, or mechanically. They have evolved in a natural system of levels, or hierarchy.

  18. Levels • For our purposes it is most useful to identify about TEN levels. Patterned mass and/or energy is the basis for all the other, higher levels of communicontrol. Next up is Shannon-type "information" and the sorts of automata with storage (memory) which use it. Above that are about eight other important emergent levels of systems, each with its own characteristic kind of information, but in every case depending on the kinds of information at the lower levels and the mass-energy carriers at the bottom. For more details see papers by M. Bunge, John Von Neumann, and G. Boyd (among others).

  19. Level 10 • Control entity: X. LIFE - a (hypothetical) Conscious World Being (Cassirer, Teilhard de Chardin) (Gaia come to responsible consciousness, say, or something like that!), which emerges from all of us. • Information: X. EXISTENTIAL INFORMATION - reduces receivers' uncertainty about the affiliative meaning, unity, and futurity of Life (hope- or despair-engendering messages).

  20. Level 9 • Control entity: IX. SCIENCE-PHILOSOPHY COMMUNITY - the world-wide science community (which has the greatest systemic integrity/logical coherence in the physical sciences, but much less in education, or even in systems science!). • Information: IX. SCIENTISOPHIC, CREATIVE & CRITICAL-REALIST INFORMATION (heuristic, integrating, falsificating) - reduces receivers' uncertainty about the nature of the physical and the artificial (socially constructed) universes.

  21. Level 8 • Control entity: VIII. EDUTHERAPEUTIC TRANSVIDUALS - systems of: selves-actualizing (Maslow, 1971), emancipatively-conversing (Habermas), rational dutifully-free (Kant) ACTORS. • Information: VIII. EMANCIPATIVE INFORMATION - reduces receivers' uncertainty about possibility and duty, and increases freedom with respect to actualizable identity (breaks limiting robotic learning habits and mind-marriages, and enables grieving and forgetting).

  22. Level 7 • Control entity: VII. HUMAN IDENTITY CO- & PRO-CREATORS - culturally-polysexually procreative & propagative ACTORS: parents of psychosocially human children and 'parents' of highly valuable-viable ideas ("meme-plex" producers). • Information: VII. CONJUGATIVE INFORMATION - reduces receivers' uncertainty as to what to incorporate into their identity for procreative purposes (about which mind-marriages to make, and what to incorporate, protect & propagate).

  23. Level 6 • Control entity: VI. COMPETITOR AUTOMATA - game-playing actor-automata (Von Neumann, Rapoport), including necessarily-competitive deprived individuals and econopolitical, sociotechnical systems. • Information: VI. NEGOTIATIVE INFORMATION - reduces receivers' uncertainty as to how to play in any (at least partly) formalized game system (e.g. threat-promise, need/contribution bartering).

  24. Level 5 • Control entity: V. MEME-PROPAGATING AUTOMATA - acquirers & replicators, symbiotic/parasitic A-Life systems, some animals & birds qua meme-hosts (memeplex addicts & pushers). • Information: V. VIRAL INFORMATION (e.g. Dawkins' MEMES, Hans-Cees Speel's (1997) meme-complexes, etc.) - seductive aesthetic forms which reduce receivers' uncertainty as to whether to replicate and broadcast them or not.

  25. Level 4 • Control entity: IV. SUBSISTING AUTOPOIETIC AUTOMATA (foraging and feeding, self-repairing, growing, surviving - in suitable environments - systems). • Information: IV. SUBSISTANTIAL INFORMATION - reduces a subsisting automaton receiver's uncertainty as to what to do next to go on subsisting (obtaining available energy, food, clothing, shelter, etc.) in the given environment (Devlin, 1991).

  26. Level 3 • Control entity: III. LEARNING AUTOMATA - simple Turing machines, i.e. machines with input, memory, and learning-algorithm programs. • Information: III. UNCERTAINTY-REDUCTION INFORMATION - Shannon (objective) and Weltner (subjective) type information which reduces a receiver's uncertainty with respect to an ensemble of possible received messages (Shannon, 1948; Weltner, 1974).

  27. Level 2 • Control entity: II. SIMPLE AUTOMATA (without memory/storage), including single-level feedback control systems as well as "simple machines". • Information: II. EFFECTUAL COUPLING PROCESSES (modulated mass/energy) which change the state of some sub-system in the future and usually at some distance.

  28. Level 1 • I. CARRIERS • MASS: paper, CD, air, DNA molecules, etc. • ENERGY: light, radio waves, electricity.

  29. Feedback loops • Feedback loops occur whenever part of the output of some system is connected back into one of its inputs. Depending on whether the connection adds the output to an input ("positive" feedback) or subtracts the output from some input ("negative" feedback), the whole system will behave entirely differently. If energy is available, adding some of the output to the input causes a system to grow, possibly explosively (tornadoes, rabbits), &/or to spiral down out of existence (bubonic plague). Subtracting some of an output from an input can lead to nearly stable (gently fluctuating) behaviour (common home thermostat).
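
A toy Python simulation of the two cases (not from the original slides; the gain values are hypothetical): feeding back a fraction of the deviation with a positive sign makes it grow, while a negative sign makes the system settle back, thermostat-style.

```python
def simulate(gain, steps=20, x=1.1, target=1.0):
    """Feed a fraction of the system's deviation back into it each step.

    gain > 0 reinforces the deviation (positive feedback): runaway growth.
    gain < 0 counteracts the deviation (negative feedback): settles near the target.
    """
    history = []
    for _ in range(steps):
        deviation = x - target
        x = x + gain * deviation
        history.append(round(x, 3))
    return history

print(simulate(gain=+0.5))          # positive feedback: grows away from the target
print(simulate(gain=-0.5, x=2.0))   # negative feedback: converges toward the target
```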

  30. Homeostasis Principle • For a system to function properly and survive, all the essential variables must be present and maintained within their ranges of variation. Certain deviations are okay, but others are not. For example, body temperature can range between 36 °C and 37.5 °C; otherwise, one would be sick. Example: hypothermia.
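
A minimal sketch of the principle, with hypothetical Python names and ranges: the system counts as homeostatic only while every essential variable stays inside its range of variation.

```python
# Each essential variable must stay within its allowed range of variation.
ESSENTIAL_RANGES = {            # hypothetical values, for illustration only
    "body_temp_c": (36.0, 37.5),
    "blood_ph":    (7.35, 7.45),
}

def homeostatic(state):
    """Return True only if every essential variable is inside its range."""
    return all(lo <= state[name] <= hi
               for name, (lo, hi) in ESSENTIAL_RANGES.items())

print(homeostatic({"body_temp_c": 36.8, "blood_ph": 7.40}))  # True: within range
print(homeostatic({"body_temp_c": 34.9, "blood_ph": 7.40}))  # False: hypothermia
```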

  31. Steady State Principle • For a system to be in equilibrium, all its parts (subsystems) must be in equilibrium; if a system is in equilibrium, then its parts are in equilibrium. (This is not necessarily a positive state of equilibrium; for example, some states are run along totalitarian lines.) Implication: Surveillance and control systems are necessary to maintain a system in equilibrium. The principle prompts one to ask questions about reversible loops causing the system to stand still, not change, and not grow. Example: Battered woman/child syndrome. The woman is too scared to tell, so the husband can keep on beating; the man can beat because the woman is too scared to tell. The Learning Robot memorizes content to pass a test, and then forgets.

  32. Relaxation Time Principle • A system can only be stable if it is allowed enough time between disturbances to recover and go back to its "normal" steady state. The interval between disturbances, relative to the recovery time, determines whether the system can maintain internal stability. Implication: To maintain the smooth functioning of a system, one has to foresee disturbances and shocks to the system. Examples: In HIV-positive people, the body does not have enough time to recover. Burnout is caused by people working and worrying too much. Cutbacks and inflation in a factory lead to unemployment.
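
The principle can be sketched numerically (hypothetical Python, made-up parameters): a system is shocked at regular intervals and partially recovers between shocks; when the interval is long enough relative to the recovery rate the deviation dies away, otherwise it keeps accumulating.

```python
def run(disturbance_interval, recovery_rate=0.3, shock=5.0, steps=60):
    """Disturb a system periodically and let it decay back toward 0 in between.

    If the interval between shocks is long relative to the recovery rate, the
    deviation stays near zero; if shocks come too often, it stays high.
    """
    deviation = 0.0
    for t in range(steps):
        if t % disturbance_interval == 0:
            deviation += shock                # external disturbance
        deviation *= (1 - recovery_rate)      # partial recovery each step
    return round(deviation, 2)

print(run(disturbance_interval=20))  # enough relaxation time: deviation near 0
print(run(disturbance_interval=2))   # shocks too frequent: deviation stays high
```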

  33. Circular Causality Principle I • Also referred to as positive feedback in cybernetics. In the presence of positive feedback, it is possible to reach an end state that is radically different from the initial one. Examples: Glamorous advertisements that sell cigarettes without mentioning the possibility of getting lung cancer; providing a discovery learning lab in chemistry without providing safety rules.

  34. Circular Causality Principle II • Also referred to as negative feedback in cybernetics. Negative feedback helps the system maintain equilibrium and stability. Implication: Guiding and managing are necessary to reach a goal without the system "blowing up". The principle prompts corrective prevention or planning techniques. Examples: Prevention techniques foresee change. Tell the teenager that life is longer than tomorrow and it is not necessary to stay out until 2 a.m., thus a curfew at 11:30. If a student is too talkative in class, tell him or her to talk less; if too shy, prompt him or her to speak up.

  35. Automata Theory • An automaton is a system which obtains, transforms, transmits and uses information to perform its functions without direct human participation; it is self-operational (Lerner, 1972). The most general form of deterministic automaton is the Turing machine. The notions of message, amount of disturbance or 'noise', quantity of information, coding technique, etc. in the study of automata are related to cybernetics.

  36. Deterministic automata • Deterministic automata are a concept of automata theory. The output is uniquely determined by the input sequence; that is, you get a definite output given any input. The behavior of such automata can be accurately predicted if the transfer operator is known and given in the form of a table of a logical function, and if you also know the initial state and the input sequence. To see an example of deterministic automata, visit the Lifegame created by Paul Chomsky and Robert Gordon (http://www.article19.com/shockwave/lifegame.htm).
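
A minimal deterministic automaton in Python (a hypothetical parity checker, not the Lifegame linked above): the transfer operator is an explicit lookup table, so the next state is uniquely determined by the current state and input.

```python
# A deterministic finite automaton: the transfer operator is a lookup table,
# so the next state is uniquely determined by the current state and input.
TRANSITIONS = {                      # hypothetical parity-checking machine
    ("even", 0): "even", ("even", 1): "odd",
    ("odd",  0): "odd",  ("odd",  1): "even",
}

def run_dfa(inputs, state="even"):
    for symbol in inputs:
        state = TRANSITIONS[(state, symbol)]
    return state

print(run_dfa([1, 0, 1, 1]))  # always "odd" for this input: fully predictable
```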

  37. Statistical automata • Statistical automata are a concept of automata theory. Statistical (probabilistic or stochastic) automata generate random output sequences given any fixed input. The amount of randomness can be set by assigning a probability to any output given the system's current state and input sequence.
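
A sketch of a statistical automaton (hypothetical Python, made-up probabilities, not from the original slides): the deterministic rule is followed only with some probability, so the same fixed input can yield different output sequences.

```python
import random

def run_stochastic(inputs, flip_prob=0.2, state=0):
    """At each step the deterministic rule is applied only with probability
    1 - flip_prob; otherwise the next state is chosen at random."""
    for symbol in inputs:
        if random.random() < flip_prob:
            state = random.choice([0, 1])    # random transition
        else:
            state = state ^ symbol           # deterministic rule (XOR)
    return state

random.seed(1)
# The same fixed input can now produce different outputs across runs.
print([run_stochastic([1, 0, 1, 1]) for _ in range(5)])
```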

  38. Memoryless automata • A memoryless automaton, a concept in automata theory, recognizes only one input at a time and produces output based on that input. The output is not influenced by any inputs which arrived before that one. The reaction time of the automaton is constant for all input signals, and the internal state of such an automaton is independent of any external action.
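
A memoryless automaton reduces to a fixed function applied to each input symbol on its own; a tiny hypothetical Python illustration:

```python
# A memoryless automaton: the output depends only on the current input,
# never on earlier inputs, so it is just a fixed function applied per symbol.
def memoryless(symbol):
    return symbol * 2          # hypothetical rule: double the input

print([memoryless(s) for s in (1, 3, 1)])   # [2, 6, 2]: same input, same output
```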

  39. Finite memory automata • Finite memory automata are a concept in automata theory. The term refers to the type of automaton where the group of output signals generated at a given quantized time depends not only on the signals applied at the same moment, but also on those which arrived earlier. These preceding external actions (or fragments of them) are recorded in the automaton by a variation of its internal state. The reaction of such an automaton is uniquely determined by the group of input signals which has arrived and by its internal state at a given time; these factors also determine the state into which the automaton goes.
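
A minimal finite-memory automaton in Python (a hypothetical running-sum machine): earlier inputs are recorded in an internal state, so the output depends on both the current input and that state.

```python
# A finite-memory automaton: the output depends on the current input AND on an
# internal state that records (a fragment of) the earlier inputs.
def running_sum_automaton(inputs, state=0):
    outputs = []
    for symbol in inputs:
        state = state + symbol       # earlier inputs are recorded in the state
        outputs.append(state)        # output determined by input + state
    return outputs

print(running_sum_automaton([1, 2, 3]))   # [1, 3, 6]: history matters
print(running_sum_automaton([3, 2, 1]))   # [3, 5, 6]: same symbols, different order
```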

  40. Infinite memory automata • Infinite memory automata are a concept in automata theory. The term refers to an abstract circuit of a logical automaton which in principle is suitable for realizing any information-processing algorithm. The Turing machine belongs to this type of automata.
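
A tiny Turing-machine-style sketch in Python (not from the original slides; the rule table is a hypothetical unary incrementer): the tape can grow without bound, which is what gives this class of automata its "infinite memory".

```python
# A tiny Turing machine with an unbounded (sparse) tape. Hypothetical rules:
# scan right over a block of 1s and append one more 1, i.e. add one in unary.
RULES = {
    # (state, symbol) -> (new state, symbol to write, head move)
    ("scan", "1"): ("scan", "1", +1),
    ("scan", "_"): ("done", "1",  0),   # blank found: write an extra 1 and halt
}

def run_turing(tape_str):
    tape = dict(enumerate(tape_str))    # sparse tape, grows as needed
    state, head = "scan", 0
    while state != "done":
        symbol = tape.get(head, "_")    # unread cells are blank
        state, write, move = RULES[(state, symbol)]
        tape[head] = write
        head += move
    return "".join(tape[i] for i in sorted(tape))

print(run_turing("111"))   # '1111': three becomes four in unary
```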
