
Advanced Political Economy







  1. Advanced Political Economy New trends in Political Economy “Econophysics”

  2. Why “econophysics”? • MANY reasons; but… • Two stand out: • “Because we’ve solved all the big problems in physics” • Remark by physicist Yi-Cheng Zhang at the first Econophysics Conference, Bali 2002, in response to a question from economist Paul Ormerod • Because one dominant concept in modern physics is highly applicable to finance: uncertainty • Pre-Einsteinian physics based on uniformity & certainty • Newtonian “Laws of Motion” of the late 1600s • Supplemented by Maxwell’s equations for electromagnetic phenomena in the 1860s • “Laplace’s conceit”: “Give me the equations for the universe & I can predict not just the future but also the past”

  3. A Quick Physics Primer • By late 19th century, just two anomalies • Speed of light: • Light seen as “wave” • “Waves” presumed to move through medium • E.g., sound waves are cyclic compression/expansion of air molecules • Light thought to move through “aether” • Unobserved substance thought to permeate all space • Aether “fixed”, universe moves with respect to it • IF aether exists & Earth moving through it, THEN speed of light in one direction (forward into aether) should be slower than other (backwards with aether) • Michelson-Morley experiment: speed of light constant in all directions

  4. A Quick Physics Primer • The “black body radiation” problem • Atom known to exist • Model of atom was a positive nucleus orbited by negative electrons • By Maxwell’s & Newton’s equations, an orbiting charge should radiate energy • Electron should rapidly spiral into nucleus… • “Black bodies” (i.e. any object, not just heated ones) should radiate energy • Fitted to experimental data, theory predicted EITHER infinite energy at low frequency OR infinite energy at high frequency • Actual energy profile was a “hump” • Theory could fit one side or the other but not both

  5. A Quick Physics Primer • Einstein/Planck solutions to the dual problems • Light comes in small discrete packets called quanta • Energy not continuous but discrete, with minimum unit “Planck’s constant” • Probability & uncertainty became essential aspects of physics • Physics also accepted “Boltzmann’s Laws” after strong 19th century resistance • Progression of energy from highly ordered to disordered state; increase in “entropy” • All work involves generation of wasted energy: work (desired) necessitates heat (undesired but unavoidable) • Combination of these ideas later yields a measure of information known as Shannon’s entropy

  6. A Quick Physics Primer • Later refinements of Einstein-Planck physics • Deterministic general theory of relativity • Highly successful model of universe on large scale • Speed of light, relativistic mass effects, gravitational bending of light • Probabilistic quantum theory of matter • Bizarre experimental outcomes • “Double slit” experiment • Photons etc. “interfere” with each other even when emitted singly • Dominant “Copenhagen” interpretation—observer affects outcome—but many others • Essentially some form of uncertain “simultaneity” between quantum-entangled particles • Theorems/measurements derived from huge experimental base

  7. A Quick Physics Primer • Experiments involve massive “particle accelerators” • Electro-magnetic cylinders pushing particles in near vacuum to near light speed • Into collision with other particles • Massive sprays of “fundamental particles” (leptons, muons, bosons, quarks…) analysed by sensitive detectors • Heavy-duty statistical apparatus developed to cope with data (computer hardware & software, mathematical theorems…) • Many other areas of analysis opened up with computing (e.g., “Josephson junction” circuitry, “quantum tunnelling” circuitry, “quantum computing”…) but no breakthroughs • Physicists also develop “complexity theory” as explanation for large-scale phenomena (many standard deviations events) regularly seen in physical data (weather, earthquakes…)

  8. A Quick Physics Primer • Today’s unresolved boundaries • Conflict between relativity & quantum mechanics on the scale of the very small & very “new” • First microseconds of universe • Behavior of matter at black holes, etc. • Main theoretical development “string theory” • Matter as multi-dimensional vibrating “strings” • Standard models: universe 10- or 11-dimensional • Only 4 dimensions (space + time) visible to us • Tiny fraction of physicists now working on this at highly abstract level (but still with experimental-theoretical interplay) • Experiments needed to settle string theory controversies prohibitively expensive…

  9. Enter econophysics… • Some physicists (e.g., Yi-Cheng Zhang, Joseph McCauley, Constantino Tsallis) had an innate curiosity about economics & social phenomena • Large numbers of physics graduate students with little possibility of “experimental apprenticeship” • Huge body of “pure” financial data available for experimental analysis • Clear (and, to physicists, strange but not unfamiliar) signs of discord in economic/financial theory: • Planck on the acceptance of his ideas in physics: “An important scientific innovation rarely makes its way by gradually winning over and converting its opponents: it rarely happens that Saul becomes Paul. What does happen is that its opponents gradually die out, and that the growing generation is familiarised with the ideas from the beginning.” (M. Planck, in: G. Holton (Ed.), Thematic Origins of Scientific Thought, Harvard University Press, Cambridge, MA, 1973; in Scientific Autobiography, New York Philosophical Library, New York, 1949.)

  10. Enter econophysics… • A research paradigm develops: • “Why not apply the tools of theoretical physics to the large body of financial data & see what we find?” • Physicists saw many regularities familiar from advanced physics that, from the neoclassical economics point of view, were anomalies • Distributions of financial data follow Power / Zipf / Pareto Distributions • Standard characteristic of highly interacting nonlinear nonequilibrium processes • Versus neoclassical belief that data should follow innately random distributions, since markets assumed to be “rational”, rational defined as “all knowing”, system assumed stable… • Huge baggage of a priori assumptions in conflict with data…

  11. Enter econophysics… • Main areas of research • Statistical patterns in finance • Also income distribution, firm sizes, “extinction patterns” • Initially “chaos” (Mandelbrot etc.) but subsequently Power Laws, Zipf Laws, Pareto, Exponential, Lévy & Gamma distributions; now “Tsallis’s q”: “nonextensive statistical mechanics” • “Parsimonious” models of financial market behaviour • “El Farol” model; Minority Game • Little work to date on alternative economic foundations • May come with time, and will probably be radically different to either neoclassical or classical foundations

  12. Statistical Patterns • Perspective of econophysicists very different to neoclassical economists (and other victims of equilibrium thinking): • “Statistical physicists, myself included, are extremely interested in fluctuations. In the field of economics, we find ourselves surrounded by fluctuations—were it not for economic fluctuations, economists would have no work to do.” Gene Stanley (editor of Physica A, the journal of interdisciplinary physics), 2000, “Exotic statistical physics: Applications to biology, medicine, and economics”, Physica A 285: 1-17. • Versus the “mechanisms that achieve equilibrium” focus of the standard economic paradigm

  13. Statistical Patterns • “What do we do when we carry out research on economic fluctuations? Our approach has been to use our experience in critical phenomena research and assume that when we see fluctuations, correlations may be present.” (10) • A search for feedback effects between data rather than an assumption of independence • Main finding: “events that are rare by 8 orders of magnitude—events that occur once in every 100 million trades—fall on the same curve as everyday events.” (12) • Subsumes results with all distribution types • Inspiration for main theoretical development “nonextensive statistical mechanics” • Commenced with Mandelbrot’s work on fractals in the 1960s

  14. Mandelbrot, fractals, & chaos • Mandelbrot’s research in economics began with income distribution • Vilfredo Pareto in the late 1800s noticed a “Power Law” in income distribution • Number of people N earning more than x follows the formula • log N = log A + m log x (A, m constants) • In 1961 Mandelbrot by chance saw a graph of cotton prices that mirrored the data on income distribution • Noticed “scale invariance” as a feature of economic data; argued it is a fundamental feature of financial data • BUT ignored in favour of “The New Finance” of Sharpe & CAPM! • Shifted into geography & geometry; now his insights re-emerging as the foundation of a new approach to finance…
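
Pareto's rule lends itself to a quick numerical illustration. The Python sketch below is illustrative only: it uses synthetic Pareto-distributed "incomes" rather than real data, and the tail exponent and income floor are arbitrary choices. It recovers the relation log N = log A + m log x by least squares in log-log coordinates:

```python
import numpy as np

# Synthetic "incomes" drawn from a Pareto distribution (tail exponent 1.5,
# minimum income 20,000 -- both arbitrary illustration values).
rng = np.random.default_rng(0)
incomes = (1 + rng.pareto(1.5, size=100_000)) * 20_000

# N(x) = number of people earning more than x, for a range of thresholds x.
thresholds = np.logspace(np.log10(2e4), np.log10(2e6), 30)
counts = np.array([(incomes > x).sum() for x in thresholds])

# Fit Pareto's rule log N = log A + m log x by least squares on log-log data.
m, logA = np.polyfit(np.log(thresholds), np.log(counts), 1)
print(f"estimated slope m = {m:.2f} (negative: fewer people at higher incomes)")
print(f"estimated log A   = {logA:.2f}")
```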

  15. Power (and other) “Laws” • Power Laws, Zipf Laws, Pareto Laws all relate to distributions in which elements in the system affect each other very strongly and nonlinearly • Resulting patterns appear random but are not • Compared to truly random data, have many more extreme events • But random processes can be generated by strongly chaotic processes! • Difference appears to lie in “mixing”: “random” processes achieve strong mixing of elements; “chaotic” processes lead to patterns of self-similarity, not uniformity • Area still very speculative but clearly on track to much more successful theory of finance than CAPM
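
The "many more extreme events" point is easy to check numerically. The sketch below compares how often 5-standard-deviation events appear in a Gaussian sample versus a fat-tailed one; the Student-t distribution is used purely as a convenient heavy-tailed stand-in, not as a model of any particular market:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

gaussian = rng.standard_normal(n)
heavy = rng.standard_t(df=3, size=n)   # a simple fat-tailed stand-in
heavy = heavy / heavy.std()            # rescale to unit standard deviation

for name, data in [("Gaussian", gaussian), ("fat-tailed", heavy)]:
    extreme = np.abs(data) > 5         # events beyond 5 standard deviations
    print(f"{name:>10}: {extreme.sum():6d} events beyond 5 sigma out of {n:,}")
```

On a run like this the Gaussian sample produces essentially no 5-sigma events while the fat-tailed sample produces thousands, which is the gap an equilibrium-plus-Gaussian theory of finance has to explain away.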

  16. Non-extensive statistical mechanics • The basics: Entropy • Boltzmann-Gibbs statistical mechanics • Shannon information theory • Advanced: non-extensive statistical mechanics • After the hairy stuff, a quick survey of major trends in econophysics

  17. Entropy • Warning! Warning! Mind-bending material approaching! • Entropy is best introduced by jokes… • The 3 laws of thermodynamics are: • 1. You can't win. • 2. You can't even break even. • 3. You can't get out of the game. • A more informative version: • 1. You can't win, you can only break even. • 2. You can only break even at absolute zero. • 3. You can never reach absolute zero.

  18. Entropy: the 2nd law • Starting at the beginning… • Key concept in science in general is conservation law: some key entity is conserved through a series of transformations • In physics, it’s energy… • 1st law is “Law of Conservation of Energy”: Energy can change form but cannot be created or destroyed • Hence “You can’t win”… • Not a priori belief (like economics’ “law of one price” or other empirically false propositions) but expression of observed regularity • Form of energy can change but amount of energy in a system remains constant • Define overall energy as U and two transformations of it as Q (heat) & W (work)…

  19. Entropy • Then ΔU = Q + W, taking Q as heat added to the system and W as work done on it • Rule first developed in experiments with steam engines, where the focus was on inputting heat (burning coal) and getting out work (turning a shaft), so usually expressed as • ΔU = Q − W, with W now the work done by the system • Just a convention reflecting “heat in, work out” • Objective of engineers was to achieve maximum conversion of heat (Q) into work (W), but found waste heat always generated. • Puzzle became “why”? • Solved by imagining an “ideal” device that converted all energy of the system U into work W and then extending the analysis to non-ideal systems (compare this to economics…)

  20. Entropy • Basic model: a piston supporting a weight… • [Diagram: a weight at height H rests on a piston of area A; below it, a cylinder of volume V contains gas at pressure P] • If weight removed from piston, then gas would move piston to a new location where pressure in gas balanced weight of piston… • What if the weight consisted of many fine grains of sand & one was removed at a time? • How far could the piston raise the sand? • How much work could the piston do on the sand?

  21. Entropy • Upwards force from compressed gas equals Pressure P times area of piston A… • At start of process, weight stationary & gas at pressure P • Forces must then be in balance • Force of gas (P·A) just equals downwards force of gravity on weight • If weight moves a small distance dh then change in volume dV equals A times dh • Work done is the integral of force over the distance it operates: dW = F·dh = P·A·dh = P·dV, so that W = ∫ P dV

  22. Entropy • In an ideal cylinder (all energy converted into work), work equals the integral of pressure with respect to volume: W = ∫ P dV • In a 0% efficiency cylinder (all energy converted into heat), heat equals the integral of temperature with respect to … something we’ll label S for now: Q = ∫ T dS • In between, the rule “change in energy equals heat plus work” becomes dU = T·dS + P·dV • Change in energy = Temperature times change in Entropy + Pressure times change in Volume • (change in volume is work, the useful expenditure of energy) • So how efficient can a working engine be?

  23. Entropy • Basic cycle of an internal combustion engine is: • Piston at top of cylinder; pressure & temperature low • Call Volume V1, Pressure P1, Temperature T1 • Piston pushed by crankshaft; pressure increased • Temperature necessarily rises • Volume V2, Pressure P2, Temperature T2 • Gas ignited • Temperature rises dramatically at constant volume • Volume V3=V2, Pressure P3>P2, Temperature T3 • Piston pushed back to starting position • Temperature falls, volume rises, pressure drops • Volume V4=V1, Pressure P4, Temperature T4 • Hot gases expelled • Return to V1, P1, T1

  24. Entropy • Perfect efficiency engine now assumed: no friction losses etc., all processes involve only changes in pressure or temperature, not both at once • Change in energy of a perfect gas given by “heat capacity” times change in temperature; e.g. heat capacity = 5 units • Example temperatures of • T1=300K (Kelvin, or temperature above absolute zero) • T2=400 • T3=1600 • T4=600 • Can now apply dU = T·dS + P·dV, where at each stage either dS=0 or dV=0 (perfect efficiency)

  25. Entropy • Stage 1: all compression, dS=0. So: ΔU = 5 × (400−300) = 500 (“work in” so “negative work output” of −500) • Stage 2: all temperature change, dV=0. So: ΔU = 5 × (1600−400) = 6000 of heat input • Stage 3: all expansion, dS=0. So: ΔU = 5 × (600−1600) = −5000, i.e. work output of 5000 • Stage 4: all temperature change, dV=0. So: ΔU = 5 × (300−600) = −1500 of heat discharged • Work sum is -500+5000=4500 • Energy input is 6000 • Ratio is efficiency of perfect engine: 4500/6000=75%
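
The same arithmetic in a few lines of Python, purely as a check of the slide's idealised cycle (the heat capacity and temperatures are the example values above, not real engine data):

```python
# Each stage changes either entropy (dS = 0, pure work) or volume (dV = 0,
# pure heat), never both -- the slide's "perfect efficiency" assumption.
heat_capacity = 5                        # energy units per kelvin (example value)
T1, T2, T3, T4 = 300, 400, 1600, 600     # kelvin, as on the slide

work_in  = heat_capacity * (T2 - T1)     # stage 1: compression, work done ON the gas
heat_in  = heat_capacity * (T3 - T2)     # stage 2: ignition at constant volume
work_out = heat_capacity * (T3 - T4)     # stage 3: expansion, work done BY the gas
heat_out = heat_capacity * (T4 - T1)     # stage 4: exhaust at constant volume

net_work   = work_out - work_in          # 5000 - 500 = 4500
efficiency = net_work / heat_in          # 4500 / 6000 = 0.75
print(f"net work {net_work}, heat input {heat_in}, efficiency {efficiency:.0%}")
```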

  26. Entropy • Actual engine has lower efficiency • Conversion of some compression into temperature, some rise in temperature into rise in volume • As well as the usual suspects: friction, etc. • Typically achieves only half the ideal ratio. • What’s the problem? • A truly ideal engine design reveals part of the cause • Carnot (1824) imagined a perfect heat-exchange engine • Found the engine had to discharge heat to perform work • Efficiency a function of the discharge temperature • Only if discharge temperature was absolute zero could engine be 100% efficient

  27. Entropy: Carnot engine • During initial expansion phase, gas in cylinder kept at constant temperature TH: heat QH must be added • During work expansion phase, temperature drops because volume expands; W extracted • During initial contraction phase, gas in cylinder kept at constant temperature TC: heat QC must be extracted • During final contraction phase, temperature rises because volume contracts

  28. Entropy: Carnot engine • Since engine repeats cycle, energy change over whole cycle is zero: so work extracted must equal sum of heat input & extraction: W = QH − QC • Engine efficiency is ratio of work output to energy input: efficiency = W/QH = 1 − QC/QH • There is a simple relationship between Q and T: QC/QH = TC/TH, so efficiency = 1 − TC/TH • So energy efficiency can only be 100% if TC = 0 Kelvin • Also explains why high temperature engines are more efficient…
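
A minimal numerical check of Carnot's result (temperatures chosen to echo the earlier example, not measured from any real engine):

```python
# Carnot: the efficiency ceiling of any heat engine depends only on the hot
# and cold reservoir temperatures (in kelvin): efficiency = 1 - TC/TH.
def carnot_efficiency(t_hot: float, t_cold: float) -> float:
    return 1.0 - t_cold / t_hot

print(carnot_efficiency(1600, 300))   # 0.8125 -- hotter combustion raises the ceiling
print(carnot_efficiency(1600, 600))   # 0.625  -- warmer discharge lowers it
print(carnot_efficiency(1600, 0))     # 1.0 only if heat is discharged at absolute zero
```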

  29. Entropy • So “something” in nature means that no work process can occur without generating waste heat. • That “something” is the “2nd law of thermodynamics”: “entropy” increases, where entropy is the S in dU = T·dS + P·dV • General statement: “For any process by which a thermodynamic system is in interaction with the environment, the total change of entropy of system and environment can almost never be negative. If only reversible processes occur, the total change of entropy is zero; if irreversible processes occur as well, then it is positive.” • “S” taken as a measure of the disorder of a system, since related by Boltzmann & Gibbs to the number of distinguishable states W that a system can be in by S = k log W

  30. Entropy • Boltzmann’s formula linked to structure of matter by the concept of microstates • Overall state (temperature, pressure, etc.) of a given system reflects ensemble of states of constituents (atoms, molecules, etc.) • State of constituents reflects • How many ways constituents can be organised • Number of constituents having each possible state • E.g., consider placing coloured squares on a 4x4 grid • Say 1st square is red; can be placed in any of 16 locations • Next, e.g. blue, placed in any of 15… • 16! = 20,922,789,888,000 possible combinations!…

  31. Entropy • But say there are 1 red, 3 green, 5 blue, 7 orange squares in the ensemble. Then • Arrangements that merely swap squares of the same colour are different combinations but can’t be distinguished from each other • Ditto for rearrangements within every other colour • To compensate, have to divide 16! by the product of all possible ways of achieving identical microstates: • Divide by 1! x 3! x 5! x 7! = 3,628,800, leaving 5,765,760 distinct arrangements • General formula is: number of distinct arrangements = N! / (n1! × n2! × … × nk!), where the ni count the constituents of each type
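
The counting argument is easy to verify directly; a short sketch using only the Python standard library:

```python
from math import factorial

# Distinguishable arrangements of coloured squares on a 4x4 grid:
# 16! orderings, divided by the permutations within each colour group.
colour_counts = [1, 3, 5, 7]             # 1 red, 3 green, 5 blue, 7 orange

total_orderings = factorial(16)           # 20,922,789,888,000
indistinguishable = 1
for n in colour_counts:
    indistinguishable *= factorial(n)     # 1! * 3! * 5! * 7! = 3,628,800

print(total_orderings // indistinguishable)   # 5,765,760 distinct arrangements
```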

  32. Entropy • Can also be put as S = −k Σi pi log pi • Where W is the number of discrete microstates the system can be in, the sum runs over i = 1…W, and pi is the probability of the ith such state • Entropy as defined here applies to “ergodic” systems: • “dynamics whose time averages coincide with ensemble averages” (Tsallis et al. 2003, “Nonextensive statistical mechanics and economics”, Physica A 324: 89-100) • Colloquially, systems that converge to or orbit long run equilibrium values; that over time fill the entire phase space
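
A small sketch of the Boltzmann-Gibbs (equivalently Shannon) formula, with k set to 1 for simplicity. It shows that a uniform distribution over W microstates gives the maximum S = k log W, while a nearly certain state gives a much lower value:

```python
import math

def bg_entropy(probabilities, k=1.0):
    """Boltzmann-Gibbs / Shannon entropy: S = -k * sum(p_i * ln p_i)."""
    return -k * sum(p * math.log(p) for p in probabilities if p > 0)

W = 16
print(bg_entropy([1 / W] * W))            # uniform: ln 16, about 2.77
print(math.log(W))                        # the S = k ln W value from the previous slide
print(bg_entropy([0.97] + [0.002] * 15))  # highly ordered system: much lower entropy
```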

  33. Entropy • Consider our 4x4 grid • Imagine the squares represent entities in a dynamic system • E.g., gas molecules in a tiny container • Odds of the squares being in a highly ordered initial state (all similar colours next to each other) very low… • Many more ways for the squares to be in a “more disordered” arrangement, where the colours are all mixed up, than in an ordered one • Over time, each square will spend 1/16th of its time in each of the 16 possible positions (“time averages coincide with ensemble averages”) • But far from all (physical or social) systems have this characteristic…

  34. Entropy • For example, Lorenz’s model: • dx/dt = σ(y − x); dy/dt = x(ρ − z) − y; dz/dt = xy − βz • Complex dynamics means the time average is very different to the average over phase space, because the system never goes near its equilibria…
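
A minimal simulation makes the point concrete. This is a plain Euler integration of the Lorenz system with the classic parameter values (a rough sketch, not a production-quality integrator): the trajectory keeps orbiting the two unstable equilibria without ever settling on them, so its time average looks nothing like the state of the system at any moment.

```python
import numpy as np

sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0   # Lorenz's classic parameter values
dt, steps = 0.005, 200_000                 # crude fixed-step Euler integration

state = np.array([1.0, 1.0, 1.0])
trajectory = np.empty((steps, 3))
for i in range(steps):
    x, y, z = state
    state = state + dt * np.array([sigma * (y - x),
                                   x * (rho - z) - y,
                                   x * y - beta * z])
    trajectory[i] = state

# The two non-trivial equilibria sit at x = y = +/- sqrt(beta*(rho-1)), z = rho-1.
eq = np.sqrt(beta * (rho - 1))
print("time average of (x, y, z):", trajectory.mean(axis=0).round(2))
print("equilibria at x = y = +/-", round(eq, 2), ", z =", rho - 1)
```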

  35. Entropy • Problem of failure of deep, established concept like Boltzmann-Gibbs entropy to characterise many real world systems troubled physicists, statisticians • Ironically, CAPM analysis of derivatives (Black-Scholes) related to this area • Many alternative characterisations proposed • Power Laws • Hurst exponents • … • Best to date is revised version of Boltzmann-Gibbs entropy suggested by Tsallis in 1985 • Sheer intuition—not derived but guessed at. • Interesting example of how scientific advance can occur. In his words…

  36. Nonextensive Statistical Mechanics… • “A Mexican–French–Brazilian workshop entitled “First Workshop in Statistical Mechanics” was held in Mexico City, during 2–13 September 1985… That was the time of fashionable multifractals and related matters. During one of the coffee breaks, everybody went out from the lecture room, excepting Brezin, a Mexican student …, and myself… Brezin was explaining something to the student. At a certain moment, he addressed some point presumably related to multifractals—from my seat I could not hear their conversation, but I could see the equations Brezin was writing. He was using p^q, and it suddenly came to my mind—like a flash and without further intention—that, with powers of probabilities, one could generalize standard statistical mechanics, by generalizing the BG entropy itself and then following Gibbs’ path. Back to Rio de Janeiro, I wrote on a single shot the expression for the generalized entropy, namely…” (Tsallis 2004: 727) • That expression: Sq = k (1 − Σi pi^q) / (q − 1), which recovers the Boltzmann-Gibbs entropy in the limit q → 1

  37. Nonextensive Statistical Mechanics… • Why does it matter? • q = 1 returns standard distributions • q > 1 “privileges common events” • Common (near-mean) events occur more frequently than for Gaussian/standard entropy distributions; and… • “rare events will lead to large fluctuations, whereas more common events will result in more moderate fluctuations.” • “A concrete consequence of this is that the BG formalism yields exponential equilibrium distributions (and time behavior of typical relaxation functions), whereas nonextensive statistics yields (asymptotic) power-law distributions” (Tsallis et al. 2003: 91) • Tsallis’s “q” may capture the interactive instability of finance markets. Tsallis distributions fit finance data accurately with q ≈ 1.4
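
A minimal sketch of the generalised entropy itself (k = 1 and an arbitrary example distribution), showing that q = 1 recovers the Boltzmann-Gibbs value while q = 1.4, the value often quoted for finance data, weights the probabilities differently:

```python
import math

def tsallis_entropy(probabilities, q, k=1.0):
    """Tsallis entropy S_q = k * (1 - sum(p_i**q)) / (q - 1); BG form as q -> 1."""
    if abs(q - 1.0) < 1e-9:
        return -k * sum(p * math.log(p) for p in probabilities if p > 0)
    return k * (1.0 - sum(p ** q for p in probabilities)) / (q - 1.0)

p = [0.5, 0.25, 0.125, 0.125]            # arbitrary example distribution
print(tsallis_entropy(p, q=1.0))         # Boltzmann-Gibbs limit
print(tsallis_entropy(p, q=1.001))       # approaches the q = 1 value
print(tsallis_entropy(p, q=1.4))         # the q value often fitted to finance data
```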

  38. Nonextensive Statistical Mechanics… • E.g., [Figure: distribution of stock market returns for the top ten stocks on the NYSE; caption: “Dotted line is the Gaussian distribution; … 2- and 3-min curves are moved vertically for display purposes…”] • Far better fit to data than CAPM models… • Many other areas where “Tsallis’s q” enables accurate fit to data whereas standard extensive statistics models (Black-Scholes, CAPM, EMH etc.) do not…

  39. Econophysics • Tsallis’s analysis may become the foundation of all other statistical analysis by econophysicists • In the meantime, many other areas where the skills & technologies of physicists are being applied. E.g.: • Sornette’s analysis of asset bubbles and bursts • Minority Game “parsimonious model” of finance markets • Scafetta’s analysis of income distribution • Ponzi’s model of multi-sectoral dynamics • Many others can be found at: • http://www.unifr.ch/econophysics/

  40. Why Stock Markets Crash • Sornette a geophysicist • Studied the earth’s dynamics • Developed a theory of earthquakes as an extension of Per Bak’s theory of “self-organised criticality” • Classic model: the sand pile • Pour sand onto a surface one grain at a time • For a while, a pyramidal shape forms • Slope of pyramid gets steeper • Slope then collapses in an “avalanche” • One grain of sand causes more than one to fall in a chain reaction • Collapse of pyramid reduces slope below critical level • Pyramid reforms… process repeats

  41. Why Stock Markets Crash • Sornette’s earthquake model similar, with tectonic plates as the grains of sand and the motion of the earth’s core as the pouring force • Movement of molten core causes plates to move on surface • Increasing tension between plates causes vibrations that increase over time • Release in large scale earthquake • Decreasing tension between plates over time… process repeats • Pattern captured by a “log periodic” function • Applied to the stock market, where “collective interactions between agents leading to a cascade of amplifications” replace the movement of plates

  42. Why Stock Markets Crash • Basic function for change of index is of the log-periodic power-law form • I(t) ≈ A + B(tc − t)^β [1 + C cos(ω ln(tc − t) + φ)], where tc is the critical time • Predicts increasing frequency of fluctuations as critical time approaches • Problem is to identify the critical time!
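
A sketch of what such a function looks like. All parameter values below are invented for illustration (they are not fitted to any market); the point is simply that the oscillations speed up as t approaches the critical time tc:

```python
import numpy as np

def lppl(t, t_critical, A=100.0, B=-10.0, beta=0.5, C=0.1, omega=8.0, phi=0.0):
    """Illustrative log-periodic power-law curve; all parameters are made up."""
    remaining = t_critical - t
    return A + B * remaining**beta * (1.0 + C * np.cos(omega * np.log(remaining) + phi))

t = np.linspace(0.0, 9.9, 1000)          # critical time set at t_c = 10
index = lppl(t, t_critical=10.0)

# Count turning points early and late in the sample: fluctuations become
# more frequent as the critical time approaches.
turning = np.abs(np.diff(np.sign(np.diff(index)))) > 0
fifth = len(turning) // 5
print("turning points, earliest fifth:", int(turning[:fifth].sum()))
print("turning points, latest fifth:  ", int(turning[-fifth:].sum()))
```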

  43. Why Stock Markets Crash • On the other side of a crash, the critical time is known • Question now is: does the crash fit the log-periodic form? • [Graph: US S&P 500 fitted to the log-periodic function around a crash] • Problems develop when the fit is extended further in time • Tectonic plate dynamics don’t change on a human time scale; finance markets & economies do… • But clear relevance of the log periodic form to short-term market movements before/after a crash

  44. The Minority Game • Minority Game development begun by economist Brian Arthur • Model was “El Farol”, an Irish (yes, Irish!) pub in Santa Fe • Popular after-hours venue; but only pleasant when “neither empty nor full” • Problem: how to predict whether it’s worth attending on a given night? • Arthur’s model: 100 Irish music fans in Santa Fe; bar only enjoyable when fewer than 40 attend on a night • Fans decide whether or not to attend based on various strategies • A “minority game”: you win by being in the minority • Therefore no equilibrium: any winning strategy will break down as other agents adopt it

  45. The Minority Game • Extended by Yi-Cheng Zhang & others into the “Minority Game” • An artificial stock market in which the winning strategy is to sell when the majority is buying, buy when the majority is selling • Realisation that the MG isn’t a complete model: • “In financial trading, often it is convenient to join the majority trend, not to fight against the trend. During the Internet stock follies, it was possible to reap considerable profits by going along with the explosive boom, provided one got off the trend in time. There are many other situations where success is associated with conforming with the majority.” • But the minority proposition might still be “on the money” because…

  46. The Minority Game • “majority situations may actually have minority elements embedded in them. The real financial trading probably requires a mixed minority-majority strategy, in which timing is essential. The minority situations seem to prevail in the long run because speculators cannot all be winners. Indeed no boom is without end, being different from the crowd at the right time is the key to success. In a booming trend, it is the minority of those who get off first who win, the others lose.” (Damien Challet, Matteo Marsili & Yi-Cheng Zhang 2003, Minority Games, Oxford University Press, Oxford, pp. 12-13)
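
A stripped-down minority game can be simulated in a few dozen lines. The sketch below is loosely in the spirit of the Challet-Marsili-Zhang setup (N agents, two random strategy tables each, keyed on the last M winning sides) but with arbitrary parameter choices; it is an illustration, not a reproduction of their results:

```python
import random

random.seed(42)
N, M, ROUNDS = 101, 3, 2000          # odd N guarantees a strict minority each round
HISTORIES = 2 ** M                   # number of possible M-step histories

# Each agent: two fixed random strategies (history -> choice) plus virtual scores.
agents = [{"strategies": [[random.randint(0, 1) for _ in range(HISTORIES)]
                          for _ in range(2)],
           "scores": [0, 0]} for _ in range(N)]

history = random.randrange(HISTORIES)
attendance = []

for _ in range(ROUNDS):
    choices = []
    for agent in agents:
        best = 0 if agent["scores"][0] >= agent["scores"][1] else 1
        choices.append(agent["strategies"][best][history])
    ones = sum(choices)
    minority_side = 0 if ones > N - ones else 1
    attendance.append(ones)

    # Reward every strategy (played or not) that would have picked the minority side.
    for agent in agents:
        for s in range(2):
            if agent["strategies"][s][history] == minority_side:
                agent["scores"][s] += 1

    history = (history * 2 + minority_side) % HISTORIES   # roll the M-step memory

mean = sum(attendance) / ROUNDS
var = sum((a - mean) ** 2 for a in attendance) / ROUNDS
print(f"mean attendance {mean:.1f} (N/2 = {N / 2}), variance {var:.1f}")
```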

  47. A two-part income distribution model • Basic econophysics model is a Power Law • Implies increasing concentration all the way up • Actual empirics of US data show a low-income range that doesn’t fit a Power Law • “An empirical distribution of wealth shows an abrupt change between the low–medium range, that may be fitted by a non-monotonic function with an exponential-like tail such as a gamma distribution, and the high wealth range, that is well fitted by a Pareto or inverse power-law function.” (Nicola Scafetta, Sergio Picozzi and Bruce J West, 2004, “An out-of-equilibrium model of the distributions of wealth”, Quantitative Finance 4: 353)

  48. A two-part income distribution model • Scafetta et al. suggest • Top end (rich) due to investment • A Power Law wealth distribution generates a matching income distribution • Bottom end (poor) due to trade, which is biased in favour of the poor: • Hard to explain from a neoclassical foundation: • “Neoclassic economists do not expect trade to involve a transfer of wealth, but rather an increase in utility for both parties with a zero net transfer of wealth.”

  49. A two-part income distribution model • Scafetta et al. propose: • “(a) in trades there may be a transfer of wealth from one agent to the other because the price paid fluctuates around an equilibrium price (= value) and, therefore, the price may differ from the value of the commodity transferred; • (b) in a trade transaction the amount of wealth that may move from one agent to the other is bounded because the price and the value of a commodity cannot (usually) exceed the wealth of the poorer of the two traders; • (c) the price is socially determined in such a way that the trade is statistically biased in favour of the poorer trader.”
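
The three rules invite a toy simulation. The sketch below is emphatically not the Scafetta-Picozzi-West model itself; it just implements the listed ingredients (transfers bounded by the poorer trader's wealth, direction biased toward the poorer trader, with an arbitrary bias parameter f) to show how repeated exchanges of this kind redistribute an initially equal endowment:

```python
import random

random.seed(0)
N, ROUNDS, f = 1000, 200_000, 0.55   # f > 0.5: the poorer trader wins more often

wealth = [100.0] * N                 # everyone starts equal
for _ in range(ROUNDS):
    i, j = random.sample(range(N), 2)
    poorer, richer = (i, j) if wealth[i] <= wealth[j] else (j, i)
    # Rule (b): the stake is bounded by the poorer trader's wealth.
    transfer = random.uniform(0, 0.1) * wealth[poorer]
    # Rules (a) and (c): price-value divergence, statistically favouring the poorer.
    if random.random() < f:
        wealth[poorer] += transfer
        wealth[richer] -= transfer
    else:
        wealth[poorer] -= transfer
        wealth[richer] += transfer

wealth.sort()
decile = N // 10
print("poorest decile mean wealth:", round(sum(wealth[:decile]) / decile, 1))
print("richest decile mean wealth:", round(sum(wealth[-decile:]) / decile, 1))
```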

  50. A two-part income distribution model • In fact results validate my interpretation of Marx on value: • “Marx spoke of the relationship between the wage and the value of labour-power, he used the term ‘minimum wage’, that is, a subsistence payment [13–15], thus emphasizing that in practice he expected the wage to exceed this minimum and hence there to be a price-value divergence in favour of the working class at the expense of capitalists. These effects can be incorporated into the social equality index f of equation (14) that measures the statistical bias of the trade in favour of the poor.”
