
Physical Limits of Computing: A Brief Introduction


Presentation Transcript


  1. Physical Limits of Computing: A Brief Introduction. Dr. Michael P. Frank, mpf@cise.ufl.edu, Dept. of Computer & Information Science & Engineering (Affil. Dept. of Electrical & Computer Engineering), University of Florida, Gainesville, Florida. Presented at: 2004 Computing Beyond Silicon Summer School (Week 4), California Institute of Technology, Pasadena, California, July 6-8, 2004

  2. Abstract • Physical and computational systems share a number of common characteristics. • Both are special cases of the more general concept of dynamical systems. • In fact, we can even show that there is a fundamental underlying unity between the physical and computational domains. • E.g., in this talk, we will survey some ways to understand a variety of key physical concepts in computational terms. • Due to this underlying unity, physical systems have firm limits on their computational capabilities, since a computation embedded within a physical system clearly cannot exceed the raw computational capabilities of the physical system itself. • We review some of the known limits on information capacity, processing rate, and communication bandwidth.

  3. Physics as Computation • Preview: Most/all physical quantities can be validly reinterpreted in terms of information and computation. • Physical entropy is • Incompressible information. • Physical action is • Total amount of (quantum-physical) computation. • Physical energy is • The rate of physical computation. • The various different forms of energy correspond to physical computation that is occupied doing different kinds of things. • Physical temperature is (proportional to) • Physical rate of computing per bit of information capacity. • The “clock speed” for physical computation. • Physical momentum is • Amount of “motional” computation per unit distance translated… • There are analogous identities for angular momentum, velocity, etc. … • These identities can be made rigorous! • We will sketch the arguments later if there is time…

  4. Fundamental Physics Implies Various Firm Limits on Computing • [Diagram, reconstructed as lists:] • Thoroughly confirmed physical theories: Theory of Relativity, Quantum Theory, 2nd Law of Thermodynamics, Gravity. • Implied universal facts: Speed-of-Light Limit, Uncertainty Principle, Definition of Energy, Reversibility, Adiabatic Theorem. • Affected quantities in information processing: Communications Latency, Information Capacity, Information Bandwidth, Memory Access Times, Processing Rate, Energy Loss per Operation.

  5. Some limits… • Communications latency… • Over distance d is at least t = d/c. • Despite “spooky” non-local-seeming quantum statistics. • Information capacity… • For systems of given size & energy is finite. • Obtained by counting numbers of distinct quantum states. • Information bandwidth… • Limited for flows of given power and cross-sectional area. • Obtained from capacity and propagation velocity limits. • Memory access times… • Limited by information density & velocity… • Processing rate… • Limited by accessible energy, indirectly by size. • Also limited by power constraints & energy efficiency. • Energy efficiency… • Limited by the Landauer bound for irreversible computing; • No technology-independent limits for reversible computing yet known. (A quick latency calculation follows below.)
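As a minimal numeric illustration of the latency bullet, the sketch below evaluates t = d/c for a few representative distances; the example distances are illustrative assumptions, not figures from the talk.

```python
# Light-speed lower bound on communication latency: t = d / c.
# A minimal sketch; the example distances are assumptions, not from the talk.

C = 2.998e8  # speed of light in vacuum, m/s

def min_latency_seconds(distance_m: float) -> float:
    """Lower bound on one-way signal latency over a given distance."""
    return distance_m / C

for label, d in [("across a 30 cm circuit board", 0.30),
                 ("across a 10 nm transistor", 10e-9),
                 ("New York to London (~5,570 km)", 5.57e6)]:
    print(f"{label}: {min_latency_seconds(d):.3e} s")
```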

  6. Entropy and Information • The following definitions of the entropy content of a given physical system can all be shown to be essentially equivalent… • Expected logarithm of the state improbability 1/p. • Given a probability distribution over system states. • Expected size of the smallest compressed description of the system’s state. • Using the best available description language & compressor. • Expected amount of information in the state that cannot be reversibly decomputed. • Using the best available mechanism. • Expected amount of a system’s information capacity that is in use, • i.e., that cannot be used to store newly-computed information for later retrieval. (The first definition is illustrated in the sketch below.)
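The first definition, entropy as the expected log-improbability E[log 1/p], is easy to compute directly; a minimal sketch, with assumed example distributions:

```python
import math

def entropy_bits(probs):
    """Entropy as the expected logarithm of state improbability 1/p,
    i.e. H = sum_i p_i * log2(1/p_i), in bits."""
    assert abs(sum(probs) - 1.0) < 1e-9
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# A fair coin holds exactly 1 bit of entropy; a biased one holds less.
print(entropy_bits([0.5, 0.5]))   # 1.0
print(entropy_bits([0.9, 0.1]))   # ~0.469
```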

  7. Action and Amount of Computation • In quantum mechanics, • States are represented as complex-valued vectors v, • & temporal transformations are represented by unitary operators (generalized rotations) U on the vector space. • The U’s may be parameterized as e^(iHθ) (H Hermitian, θ real). • We can characterize the magnitude of a given vector rotation Uv = e^(iHθ)v by • The area swept out in the complex plane by the normalized vector components as θ is swept from 0 to a given value. • Important conjecture: This quantity is basis-independent! • We can characterize the action performed by a given unitary transform operating on a set of possible v’s as the maximum rotation magnitude over the v’s. • Or, if we have a probability distribution over initial vectors, we can define an expected action accordingly. • The connection with computation is provided by showing that it takes a minimum area (action) of π/4 to flip a bit. • I.e., a minimum angle of π/2 to rotate to an orthogonal vector (see the sketch below). • It takes a minimum action of π/2 (annihilate/create pair) to move a state forward by 1 position along an unbounded chain. • The total action of a transform gives the total number of such operations.
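A minimal numerical check of the π/2-rotation claim, assuming the simplest possible case: a single qubit rotated by U(θ) = e^(iXθ) with X the Pauli-X matrix. The overlap with the initial state falls as cos θ and first reaches zero (an orthogonal state) exactly at θ = π/2.

```python
import numpy as np

# U(θ) = e^(iXθ) = cos(θ)·I + i·sin(θ)·X, since X² = I (Pauli-X).
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)

def U(theta: float) -> np.ndarray:
    return np.cos(theta) * I2 + 1j * np.sin(theta) * X

ket0 = np.array([1, 0], dtype=complex)  # initial state |0>

for theta in [0.0, np.pi / 4, np.pi / 2]:
    overlap = abs(np.vdot(ket0, U(theta) @ ket0))
    print(f"theta = {theta:.4f}: |<0|U|0>| = {overlap:.4f}")
# At theta = π/2 the overlap is 0: the state is orthogonal to where it
# started — the minimum rotation angle needed to "flip a bit".
```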

  8. Energy and Rate of Computation • The energy of an eigenvector of H is the corresponding eigenvalue. • The average energy of a general quantum state follows directly from the eigenstate probabilities. • The average energy is exactly the rate at which complex-plane area is swept out (action accumulated). • In the energy basis, and also in other bases. • Thus, if action is amount of computation, then energy is rate of computation.

  9. Generalized Temperature • The concept of temperature can be generalized to apply even to non-equilibrium systems, • where entropy is less than the maximum. • Example: Consider an ideal Fermi gas. • Heat capacity per fermion is C = π²k²T/2μ. • μ = Fermi energy; k = log e; T = temperature. • Equilibrium temperature turns out to be: • T = (2/πk)(Ex·μ)^(1/2), thus C = πk(Ex/μ)^(1/2), where Ex = E − E0 is the average energy excess per fermion relative to T = 0. • Equilibrium (max) entropy per fermion is: • Smax = ∫dS = ∫d′Q/T = ∫dEx/T = πk(Ex/μ)^(1/2) = C. • Consider this to be the total information content Smax = Itot = S + X (entropy plus extropy). • We thus have: T = 2(Ex/Itot). • The temperature is simply 2× the excess energy per unit of total information content. • Note that the expression Ex/Itot is well-defined even for non-equilibrium states, where the entropy is S < Smax = Itot. • Thus, we can validly ascribe a (generalized) temperature to such states. (A numeric consistency check follows below.)
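A minimal sketch checking the algebra above in natural units (k = 1): starting from an assumed Fermi energy μ and temperature T, it computes Ex, Smax, and C from the given formulas and confirms T = 2·Ex/Itot. The sample values of μ and T are assumptions for illustration only.

```python
import math

k = 1.0    # work in natural units where Boltzmann's constant k = 1
mu = 7.0   # assumed Fermi energy (e.g., in eV); illustrative only
T = 0.5    # assumed temperature, in the same energy units

# Invert T = (2/(π k)) * sqrt(Ex · μ) to get the excess energy per fermion:
Ex = (math.pi * k * T / 2) ** 2 / mu

# Equilibrium (maximum) entropy and heat capacity per fermion:
S_max = math.pi * k * math.sqrt(Ex / mu)   # = Itot, total info content
C = math.pi * k * math.sqrt(Ex / mu)       # equals S_max for this system

# Generalized temperature: twice the excess energy per unit info content.
T_check = 2 * Ex / S_max
print(f"T = {T}, 2*Ex/Itot = {T_check:.6f}")   # these agree
```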

  10. Generalized Temperature as “Clock Speed” • Consider systems such as the Fermi gas, where T = cE/I, • where c is a constant of integration, • E is excess energy above the ground state, • I is total physical info. content. • For such systems, we can say that the generalized temperature gives a measure of the energy content per bit of physical information content: Eb = c⁻¹Tb = c⁻¹kBT ln 2. • Since energy (we saw) gives the rate of computing, the temperature therefore gives the rate of computing per bit. • In other words, the clock frequency! • For our case c = 2, room temperature corresponds to a max frequency of: fmax = 2c⁻¹Tb/h = kB(300 K)(ln 2)/h ≈ 4.3 THz. • Comparable to the frequency of room-temperature IR photons. • A computational subsystem that is at a generalized temperature equal to room temperature can never update its digital state at a higher frequency than this! (This number is reproduced in the sketch below.)
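A one-line check of the quoted figure, evaluating fmax = kB·T·ln 2/h at T = 300 K:

```python
import math

K_B = 1.3807e-23   # Boltzmann's constant, J/K
H = 6.6261e-34     # Planck's constant, J·s

def f_max(T_kelvin: float) -> float:
    """Max update frequency per bit at generalized temperature T
    (with the slide's c = 2, so f_max = k_B * T * ln2 / h)."""
    return K_B * T_kelvin * math.log(2) / H

print(f"{f_max(300.0)/1e12:.2f} THz")  # ≈ 4.33 THz, matching the slide
```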

  11. Information Limits

  12. Some Quantities of Interest • We would like to know if there are limits on: • Infropy density • = Bits per unit volume • Affects physical size and thus propagation delay across memories and processors. Also affects cost. • Infropy flux • = Bits per unit area per unit time • Affects cross-sectional bandwidth, data I/O rates, rates of standard-information input & effective entropy removal • Rate of computation • = Number of distinguishable-state changes per unit time • Affects rate of information processing achievable in individual devices

  13. Bit Density: No classical limit • In classical (continuum) physics, even a single particle has a real-valued position+momentum. • All such states are considered physically distinct. • Each position & momentum coordinate in general requires an infinite string of digits to specify: • x = 4.592181950149194019240194209490124… meters • p = 2.393492340940140914291029091230103… kg m/s • Even the smallest system contains an infinite amount of information! ⇒ No limit to bit density. • This picture is the basis for various analog computing models studied by some theoreticians. • Wee problem: Classical physics is dead wrong!

  14. The Quantum “Continuum” • In QM, there are still uncountably many describable states (mathematically possible wavefunctions). • It can theoretically take infinite info. to describe one. • But not all this info has physical relevance! • States are only physically distinguishable when their state vectors are orthogonal. • States that are only indistinguishably different can only lead to indistinguishably different consequences (resulting states), • due to the linearity of quantum physics. • There is no physical consequence from presuming an infinite # of bits in one’s wavefunction.

  15. Quantum Particle-in-a-Box • Uncountably many continuous wavefunctions? • No: we can express the wave as a vector over countably many orthogonal normal modes. • Fourier transform. • High-frequency modes have higher energy (E = hf); energy limits imply they are unlikely. (The sketch below tabulates the mode energies for a concrete box.)
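A minimal sketch of the standard particle-in-a-box mode energies, E_n = n²h²/(8mL²), for an electron in an assumed 10 nm box (the box size is an illustrative assumption). Energy grows as n², so high-order modes quickly become energetically inaccessible.

```python
# Energy of the n-th normal mode of a particle in a 1-D box:
#   E_n = n² h² / (8 m L²)
# Electron in an assumed 10 nm box; the box size is illustrative only.

H = 6.6261e-34      # Planck's constant, J·s
M_E = 9.1094e-31    # electron mass, kg
L = 10e-9           # box width, m
EV = 1.6022e-19     # joules per electron-volt

def mode_energy_eV(n: int) -> float:
    return n**2 * H**2 / (8 * M_E * L**2) / EV

for n in [1, 2, 10, 100]:
    print(f"n = {n:>3}: E = {mode_energy_eV(n):.4g} eV")
# n=1 gives ~0.0038 eV; n=100 already needs ~38 eV — high modes are costly.
```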

  16. Ways of Counting States • The entire field of quantum statistical mechanics is about this, but here are some simple ways: • For a system with a constant # of particles: • # of states = numerical volume of position-momentum configuration space (phase space), • in units where h = 1. • Approached in the macroscopic limit. • Unfortunately, the # of particles is not usually constant! • Quantum field theory bounds: • Smith-Lloyd bound. Still ignores gravity. • General relativistic bounds: • Bekenstein bound, holographic bound.

  17. Smith-Lloyd Bound (Smith ’95, Lloyd ’00) • Based on counting field modes. • S = entropy, M = mass, V = volume, • q = number of distinct particle types. • Lloyd’s bound is tighter by a constant factor. • Note: • Entropy density scales with the 3/4 power of mass-energy density. • E.g., increasing entropy density by a factor of 1,000 requires increasing energy density by 10,000×.

  18. Examples w. Smith-Lloyd Bound • For systems at the density of water (1 g/cm³), composed only of photons: • Smith’s example: a 1 m³ box holds 6×10³⁴ bits • = 60 kb/Å³ • Lloyd’s example: 1 liter “ultimate laptop”, 2×10³¹ bits • = 21 kb/Å³ • Cool, but what’s wrong with this picture? • The example requires very high temperature + pressure! • Temperature around 1/2 billion Kelvins!! • Photonic pressure on the order of 10¹⁶ psi!! • “Like a miniature piece of the big bang.” -Lloyd • Probably not feasible to implement any time soon!

  19. More Normal Temperatures • Let’s pick a more reasonable temperature: 1356 K (melting point of copper): • Entropy density of light is then only ~0.74 bits/μm³! • Less than the bit density in a DRAM today! • Bit size comparable to the wavelength of the optical-frequency light emitted by melting copper. • Lesson: Photons are not a viable information storage medium at ordinary temperatures. • Not dense enough. • CPUs that do logic with optical photons can’t have logic devices packed very densely.

  20. Entropy Density of Solids • Can easily calculate from standard empirical thermochemical data. • Obtain entropy by integrating heat capacity ÷ temperature as temperature increases (see the sketch below). • Example result, for copper: • Has one of the highest entropy densities among pure elements at atmospheric pressure. • @ room temperature: 6 bits/atom, 0.5 b/Å³ • At boiling point: 1.5 b/Å³ • That’s ~10¹²× denser than its light! Related to conductivity? • Cesium has one of the highest #bits/atom at room temperature, about 15. But only 0.13 b/Å³. • Lithium has a high #bits/mass, 0.7 bits/amu.
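A minimal sketch of the "integrate C/T" recipe, assuming the Debye model for copper's heat capacity (Debye temperature ≈ 343 K) in place of empirical tables; it lands in the ballpark of the ~6 bits/atom quoted above.

```python
import numpy as np
from scipy.integrate import quad

THETA_D = 343.0  # assumed Debye temperature of copper, in kelvins

def debye_C(T: float) -> float:
    """Heat capacity per atom in units of k_B, from the Debye model."""
    f = lambda x: x**4 * np.exp(x) / np.expm1(x)**2
    integral, _ = quad(f, 1e-9, THETA_D / T)
    return 9.0 * (T / THETA_D)**3 * integral

def entropy_bits(T_max: float) -> float:
    """S(T_max) = ∫ C/T dT from ~0 up, converted from nats (k_B=1) to bits."""
    S_nats, _ = quad(lambda T: debye_C(T) / T, 1e-3, T_max, limit=200)
    return S_nats / np.log(2)

# ≈ 5.3 bits/atom with this simple model — close to the ~6 quoted; the gap
# comes from contributions the Debye model omits (e.g., conduction electrons).
print(f"{entropy_bits(300.0):.1f} bits/atom")
```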

  21. Some Quantities of Interest • We would like to know if there are limits on: • Infropy density • = Bits per unit volume • Affects physical size and thus propagation delay across memories and processors. Also affects cost. • Infropy flux • = Bits per unit area per unit time • Affects cross-sectional bandwidth, data I/O rates, rates of standard-information input & effective entropy removal • Rate of computation • = Number of distinguishable-state changes per unit time • Affects rate of information processing achievable in individual devices

  22. Smith-Lloyd Bound (Smith ’95, Lloyd ’00) • Based on counting orthogonal field modes. • S = entropy, M = mass, V = volume, • q = number of distinct particle types. • Lloyd’s bound is tighter by a constant factor. • Note: • Entropy density scales with the 3/4 power of mass-energy density. • E.g., increasing entropy density by a factor of 1,000 requires increasing energy density by 10,000×.

  23. Whence this scaling relation? • Note that in the field theory limit, S ∝ E^(3/4). • Where does this come from? • Consider a typical frequency ν (wavelength λ = c/ν) in the field spectrum. • Note that the minimum size of a given wavelet is ~its wavelength λ. • # of distinguishable wave-packet location states in a given volume ∝ 1/λ³. • Each such state carries a little entropy, • the occupation number of that state (# of photons in it). • ∝1/λ³ particles, each of energy ∝ 1/λ ⇒ energy E ∝ 1/λ⁴. • S ∝ 1/λ³ & E ∝ 1/λ⁴ ⇒ S ∝ E^(3/4). (The worked equations below spell this out.)
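A compact restatement of the argument in equations (a sketch of the scaling only; constant factors are omitted):

```latex
% Scaling argument for S \propto E^{3/4} (constants omitted).
\begin{align*}
  \#\text{states in volume } V &\sim \frac{V}{\lambda^3}
      && \text{(each wavelet occupies } \sim\lambda^3\text{)} \\
  E &\sim \frac{V}{\lambda^3}\cdot\frac{hc}{\lambda} \propto \lambda^{-4}
      && \text{(each photon carries } E \sim hc/\lambda\text{)} \\
  S &\propto \lambda^{-3} = \left(\lambda^{-4}\right)^{3/4}
      \propto E^{3/4} \\
  \Rightarrow\ & S \to 1000\,S \text{ requires } E \to 1000^{4/3} E = 10^4\,E.
\end{align*}
```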

  24. Whence the distribution? • Could the use of more particles (with less energy per particle) yield greater entropy? • What frequency spectrum (power level or particle number density as a function of frequency) gives the largest # of states? • Note there is a minimum particle energy due to the box size. • No: the Smith-Lloyd bound is based on the blackbody radiation spectrum. • We know this spectrum has the maximum infropy among abstract states, because it’s the equilibrium state. • Empirically verified in hot ovens, etc.

  25. General-Relativistic Bounds • The Smith-Lloyd bound does not take into account the effect of gravity. • Earlier bound from Bekenstein derives a limit on entropy from black-hole physics: S ≤ 2πkER/ħc, where E = total energy, R = radius of system. • The limit is only attained by black holes! • Black holes have 1/4 nat of entropy per square Planck length of surface (event horizon) area. • Minimum size of a nat: a square 2 Planck lengths on a side. • That’s an average entropy density of 4×10³⁹ b/Å³ for a 1-m radius black hole (mass ≈ Saturn’s)! (Verified numerically below.)
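A minimal numeric check of the margin figure, using the 1/4-nat-per-square-Planck-length rule stated above; the physical constants are standard values.

```python
import math

G = 6.674e-11      # gravitational constant, m³/(kg·s²)
C = 2.998e8        # speed of light, m/s
HBAR = 1.055e-34   # reduced Planck constant, J·s

R = 1.0                                  # black hole radius, m
M = C**2 * R / (2 * G)                   # Schwarzschild mass: ~6.7e26 kg,
print(f"mass = {M:.2e} kg")              # comparable to Saturn (~5.7e26 kg)

l_p = math.sqrt(HBAR * G / C**3)         # Planck length, ~1.6e-35 m
area = 4 * math.pi * R**2                # event horizon area, m²
S_bits = area / (4 * l_p**2) / math.log(2)  # 1/4 nat per Planck area, in bits

volume_A3 = (4/3) * math.pi * R**3 * 1e30   # enclosed volume in Å³
print(f"avg density = {S_bits / volume_A3:.1e} bits/Å³")  # ~4e39, as quoted
```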

  26. The Holographic Bound • Based on the Bekenstein black-hole bound. • The maximum entropy within any surface of area A (independent of energy!) is A/(2LP)² nats. • LP is the Planck length (see lecture on units). • Implies any 3D object (of any size) could be completely defined via a flat (2D) “hologram” on its surface having Planck-scale resolution. • The bound is only really achieved by a black hole with event horizon = that surface.

  27. Do Black Holes Destroy Information? • Currently, it seems that no one completely understands how information is preserved during black hole accretion for later re-emission as Hawking radiation. • Via infinite time dilation at the surface? • Some researchers (e.g., Hawking) claimed that black holes must be doing something irreversible in their interior (destroying information). • The arguments for this seem not very rigorous... • The issue is not completely resolved, but I have many papers on it if you’re interested. • Incidentally, Hawking recently conceded a bet on this.

  28. Implications of Density Limits • Minimum device size • thus minimum communication latency (as per earlier). • Minimum device cost, given a minimum cost of matter/energy. • Implications for communications bandwidth limits (coming up)

  29. Communication Limits • Latency (propagation-time delay) limit from earlier, due to speed of light. • Teaches us scalable interconnection technologies • Bandwidth (infropy rate) limits: • Classical information-theory limit (Shannon) • Limit, per-channel, given signal bandwidth & SNR. • Limits based on field theory (Smith/Lloyd) • Limit given only area and power. • Applies to I/O, cross-sectional bandwidths in parallel machines, and entropy removal rates.

  30. Hartley-Shannon Law • The maximum information rate (capacity) of a single wave-based communication channel is: C = B log₂(1 + S/N), where • B = bandwidth of channel in frequency units, • S = signal power level, • N = noise power level. • This law is not sufficiently powerful for our purposes! • It does not tell us how many effective channels are possible, given available power and/or area. • It does not give us any limit if we are allowed to increase bandwidth or decrease noise arbitrarily. (A quick numeric example follows below.)
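A minimal sketch evaluating the Hartley-Shannon capacity; the channel parameters are illustrative assumptions, not figures from the talk.

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Assumed example: a 1 MHz channel with a 30 dB signal-to-noise ratio.
snr = 10 ** (30 / 10)                                     # 30 dB -> 1000x
print(f"{shannon_capacity_bps(1e6, snr)/1e6:.2f} Mbps")   # ≈ 9.97 Mbps
```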

  31. Density & Flux • Note that any time you have: • a limit ρ on the density (per volume) of something, • & a limit v on its propagation velocity, • this automatically implies: • a limit F = ρv on the flux, • by which I mean rate per unit time per unit area. • Note also we always have a limit c on velocity! • At speeds near c we must account for relativistic effects. • Slower velocities are also relevant: • electron saturation velocity in various materials, • velocity of air or liquid coolant in a cooling system. • Thus a density limit ρ implies a flux limit F = ρc. • [Figure: flux ρv through a cross-section moving at velocity v.]

  32. Relativistic Effects • For normal matter (bound massive-particle states) moving at a velocity v near c: • Entropy density increases by the factor γ = (1 − (v/c)²)^(−1/2), • due to relativistic length contraction. • But energy density increases by the factor γ², • from both length contraction & mass amplification. • ⇒ entropy density scales up only with the square root (1/2 power) of the energy density gained from high velocity. • Note that light travels at c already, • & its entropy density scales with energy density to the 3/4 power. ⇒ Light wins as v→c, • if you want to maximize entropy flux per unit energy flux.

  33. Entropy Flux Using Light (Smith ’95) • FS = entropy flux, • FE = energy flux, • σSB = Stefan-Boltzmann constant, π²kB⁴/(60c²ħ³). • Derived from the same field-theory arguments as the density bound. • Again, the blackbody spectrum optimizes entropy flux for a given energy flux: • it is the equilibrium spectrum.

  34. Entropy Flux Examples • Consider a 10 cm-wide, flat, square wireless tablet with a 10 W power supply. • What’s its maximum rate of bit transmission? • Independent of spectrum used, noise floor, etc. • Answer: • Energy flux 10 W/(2·(10 cm)²) (using both sides). • Smith’s formula gives 2.2×10²¹ bps. • What’s the rate per square nanometer of surface? • Only 109 kbps! (ISDN speed, in a 100 GHz CPU?) • 100 Gbps/nm² would require nearly 1 GW of power! • Light is not infropically dense enough for high-BW comms between densely packed nanometer-scale devices at reasonable power levels!!!

  35. Entropy Flux w. Atomic Matter • Consider liquid copper (~1.5 b/Å³) moving along at a leisurely 10 cm/s… • BW = 1.5×10²⁷ bps through the 10-cm wide square! • A million times higher BW than with 10 W of light! • 150 Gbps/nm² entropy flux! • Plenty for nano-scale devices to talk to their neighbors. • Most of this entropy is in the conduction electrons... • Less conductive materials have much less entropy. • Lesson: • For maximum bandwidth density at realistic power levels, encode information using states of matter (electrons) rather than states of radiation (light). • Exercise: Kinetic energy flux? (The F = ρv arithmetic is checked below.)
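A minimal check of the copper numbers via the F = ρv rule from slide 31; the input values are those given above.

```python
# Bit flux through a cross-section: F = ρ · v (density × velocity),
# then total bandwidth = F · area. Input values are from the slide.

rho = 1.5e30      # liquid copper entropy density: 1.5 bits/Å³ = 1.5e30 bits/m³
v = 0.10          # flow velocity, m/s
area = 0.10 ** 2  # 10 cm × 10 cm cross-section, m²

flux = rho * v                                   # bits per second per m²
print(f"total BW: {flux * area:.1e} bps")        # ~1.5e27 bps, as quoted
print(f"per nm²: {flux / 1e18 / 1e9:.0f} Gbps")  # ~150 Gbps/nm²
```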

  36. Some Quantities of Interest • We would like to know if there are limits on: • Infropy density • = Bits per unit volume • Affects physical size and thus propagation delay across memories and processors. Also affects cost. • Infropy flux • = Bits per unit area per unit time • Affects cross-sectional bandwidth, data I/O rates, rates of standard-information input & effective entropy removal • Rate of computation • = Number of distinguishable-state changes per unit time • Affects rate of information processing achievable in individual devices

  37. Speed Limits

  38. The Margolus-Levitin Bound • The maximum rate ν at which a system can transition between distinguishable (orthogonal) states is: ν ≤ 4(E − E0)/h, • where: • E = average energy (expectation value of energy over all states, weighted by their probability), • E0 = energy of the lowest-energy or ground state of the system, • h = Planck’s constant (converts energy to frequency). • Implication for computing: • A circuit node can’t switch between 2 logic states faster than this frequency determined by its energy.

  39. Example of Frequency Bound • Consider Lloyd’s 1 liter, 1 kg “ultimate laptop” • Total gravitating mass-energy E of 9×10¹⁶ J. • Gives a limit of 5×10⁵⁰ bit-operations per second! • If the laptop contains 2×10³¹ bits (the photonic maximum), • each bit can change state at a frequency of 2.5×10¹⁹ Hz (25 EHz), • 12 billion times higher-frequency than today’s 2 GHz Intel processors, • 250 million times higher-frequency than today’s 100 GHz superconducting logic. • But the Margolus-Levitin limit may be far from achievable!

  40. More Realistic Estimates • Most of the energy in complex stable structures is not accessible for computational purposes... • It is tied up in the rest masses of atomic nuclei, • which form anchor points for electron orbitals, • in the mass & energy of “core” atomic electrons, • which fill up low-energy states not involved in bonding, • & in electrons involved in atomic bonds, • needed to hold the structure together. • Conjecture: One can obtain tighter valid quantum bounds on infropy densities & state-transition rates by considering only the accessible energy: • energy whose state-infropy is manipulable.

  41. More Realistic Examples • Suppose the following system is accessible: 1 electron confined to a (10 nm)³ volume, at an average potential of 10 V above the ground state. • Accessible energy: 10 eV. • Accessible-energy density: 10 eV/(10 nm)³. • Maximum entropy in Smith bound: 1.4 bits? • Not clear whether the bound is applicable to this case. • Maximum rate of change: 9.7 PHz, • 5 million × typical frequencies in today’s CPUs, • 100,000 × frequencies in today’s superconducting logics. (Both Margolus-Levitin figures are reproduced below.)
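A minimal check of the Margolus-Levitin figures from slides 38, 39, and 41, using ν = 4(E − E0)/h with E0 = 0:

```python
H = 6.6261e-34    # Planck's constant, J·s
EV = 1.6022e-19   # joules per electron-volt

def ml_max_rate(energy_joules: float) -> float:
    """Margolus-Levitin bound: max orthogonal-transition rate = 4E/h,
    with E measured above the ground state."""
    return 4.0 * energy_joules / H

# Slide 39: Lloyd's 1 kg "ultimate laptop", E = mc² ≈ 9e16 J.
print(f"laptop: {ml_max_rate(9e16):.1e} ops/s")     # ~5.4e50, as quoted
print(f"per bit: {ml_max_rate(9e16)/2e31:.1e} Hz")  # ~2.7e19 ≈ 2.5e19 Hz

# Slide 41: one electron with 10 eV of accessible energy.
print(f"10 eV system: {ml_max_rate(10*EV)/1e15:.1f} PHz")  # ≈ 9.7 PHz
```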

  42. Summary of Fundamental Limits
