
What is information? Insights from Quantum Physics


Presentation Transcript


  1. What is information? Insights from Quantum Physics. Benjamin Schumacher, Department of Physics, Kenyon College

  2. Translation comedy. English: “The spirit is willing but the flesh is weak.” Furbish: “Too-wee sah mo-ko gah no-tay fah-so-so.” English: “The wine is acceptable but the meat is underdone.” Lunchtime conversation: physics of information (QIT/QC, thermo., black holes, etc.) → what I tell my friends → ??

  3. What we’re up to • We wish to identify universal ideas about “information” • Parallel to category theory (“general nonsense”) • A reasonable question: Is this useful? • We are not necessarily trying to quantify “information” • (Not yet, anyway) • A single quantity may not be enough to capture every aspect of “information” • Nevertheless, we may find some useful quantities that help describe the “information structure”

  4. Three heuristics. Information is . . . • Physical. Landauer: Information is always associated with the state of a physical system. • Relational. Information refers to the relations among subsystems of a composite physical system. • Fungible. Bennett: Information can be transformed from one representation to another. Information is a property that is invariant under such transformations.

  5. Invariance. [Figure: a visual analogy: an equality familiar from topology, and a corresponding equality in information theory involving the Kenyon Q-News.]

  6. Reading the newspaper online. What I get: an electrical signal (the signal A). What it means: today’s newspaper (the referent B). Many different “messages” (referent states) are possible; the a priori situation is described by a probability distribution. Information resides in the correlations of signal and referent. [Figure: possible states of the Q-News referent: YES! / NO! / MAYBE!]

  7. Communication theory. [Figure: signal A and referent B; the signal suffers distortion and noise due to the environment, and undergoes signal processing, error correction, digitization, and decoding.] Key point: all of this stuff happens to the signal only.

  8. Signal and referent. [Figure: 3×3 table of the joint distribution, rows A = 1, 2, 3 and columns B = 1, 2, 3.] • Random variables A and B • A = “signal” • B = “referent” • “A carries information about B.” • Key points: • Information resides in the correlation of A and B. • In communication processes, only A is affected by operations.

  9. Local A-operations. [Figure: the joint table, with the conditional probabilities in each column.] “Local” A-operations: the same operation on each column.

  10. Local A-operations. [Figure: the joint table of conditional probabilities.] “Local” A-operations: the same operation on each column. Row permutation: reversible!

  11. Local A-operations. [Figure: the joint table of conditional probabilities.] “Local” A-operations: the same operation on each column. Row “blurring”: irreversible!
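
A minimal NumPy sketch (not from the talk) of slides 8–11: the joint distribution is a matrix P[a, b], and a local A-operation applies the same stochastic matrix K to every column, P' = K P. The particular numbers and the names K_permute and K_blur are illustrative.

```python
import numpy as np

# Joint distribution P[a, b]: rows = values of A (signal), columns = values of B (referent).
P = np.array([[0.30, 0.05, 0.05],
              [0.05, 0.30, 0.05],
              [0.05, 0.05, 0.10]])

# A "local" A-operation applies the same stochastic matrix K (columns summing to 1)
# to every column of P at once:  P' = K @ P.  The referent B is untouched.
K_permute = np.array([[0, 1, 0],
                      [0, 0, 1],
                      [1, 0, 0]], dtype=float)   # row permutation: reversible
K_blur = np.full((3, 3), 1 / 3)                  # row "blurring": irreversible

print(K_permute @ P)   # same correlations with B, rows merely relabelled
print(K_blur @ P)      # every column flattened: the correlation with B is destroyed
```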

  12. “Information structure”. Here “states” = joint AB distributions. • The set T of possible “operations” includes all local A-operations. • P → P′ means that P′ = K(P) for some K ∈ T. • P and P′ are equivalent (have “the same information”) iff P → P′ and P′ → P. • Natural “information structure”: a partial ordering on states (really on equivalence classes), with reversible and irreversible operations within T.

  13. Monotones. A functional f is called a monotone iff f(P) ≥ f(K(P)) for all states P and operations K ∈ T. Entropy (Shannon): H(A) = -Σa p(a) log p(a). Mutual information: I(A:B) = H(A) + H(B) - H(AB). The mutual information is a monotone: I(P) ≥ I(K(P)). An operation is reversible if and only if I is unchanged. Mutual information I is an “expert” monotone.
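
Continuing the sketch above, the Shannon entropy and the mutual information can be computed directly, and checking them before and after the two local operations illustrates the monotone property: unchanged under the reversible permutation, strictly smaller after blurring. The function names are mine.

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) in bits, ignoring zero entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(P):
    """I(A:B) = H(A) + H(B) - H(AB) for a joint distribution P[a, b]."""
    return entropy(P.sum(axis=1)) + entropy(P.sum(axis=0)) - entropy(P.ravel())

P = np.array([[0.30, 0.05, 0.05],
              [0.05, 0.30, 0.05],
              [0.05, 0.05, 0.10]])
K_permute = np.array([[0, 1, 0], [0, 0, 1], [1, 0, 0]], dtype=float)
K_blur = np.full((3, 3), 1 / 3)

print(mutual_information(P))               # the original correlation
print(mutual_information(K_permute @ P))   # unchanged: the operation is reversible
print(mutual_information(K_blur @ P))      # 0, strictly smaller: information was lost
```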

  14. Quantum communication theory. A and B are quantum systems (A = signal, B = referent). The composite system AB is described by a state ρAB (a density operator: a state vector for pure joint states, a mixed state in general). Restrict T to local A-operations (maps of the form E ⊗ 1). “A carries quantum information about B.”
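
A sketch (my own, not from the slides) of what a local A-operation E ⊗ 1 looks like concretely for two qubits: each Kraus operator of E acts on A alone and is extended by the identity on B. The depolarizing channel is just an illustrative choice of E; note that ρB = trA ρAB is unchanged by any such map.

```python
import numpy as np

# Two-qubit state ρAB: here the pure state (|00> + |11>)/sqrt(2) as a density operator.
phi = np.array([1, 0, 0, 1]) / np.sqrt(2)
rho_AB = np.outer(phi, phi)

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]])

def depolarize_A(rho, p):
    """Apply a depolarizing channel of strength p to subsystem A only (an E ⊗ 1 map)."""
    kraus = [np.sqrt(1 - 3 * p / 4) * I2, np.sqrt(p / 4) * X,
             np.sqrt(p / 4) * Y,          np.sqrt(p / 4) * Z]
    return sum(np.kron(K, I2) @ rho @ np.kron(K, I2).conj().T for K in kraus)

def reduced_B(rho):
    """ρB = trA ρAB for a two-qubit state (A is the first tensor factor)."""
    return rho.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)

rho_noisy = depolarize_A(rho_AB, p=0.3)
print(np.allclose(reduced_B(rho_AB), reduced_B(rho_noisy)))   # True: B is untouched
```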

  15. “Information structure”. T = { E ⊗ 1 }. A single “leaf”: all AB states with a given ρB = trA ρAB. (States stay on the leaf under any A-operation.)

  16. The lay of the leaf. T = { E ⊗ 1 }. • Maximal information states: pure joint states |ψ⟩ of AB. |ψ⟩ → |φ⟩ for all |ψ⟩, |φ⟩ on the leaf, and |ψ⟩ → ρAB for all ρAB on the leaf. The maximal states may include other (mixed) states as well. • Minimal information: product states ρAB = ρA ⊗ ρB.

  17. Reversibility. T = { E ⊗ 1 }. [Figure: contours of I on the leaf, from high to low.] Quantum entropy (von Neumann): S(ρ) = -tr(ρ log ρ). Coherent information: I = S(ρB) - S(ρAB). • For any operation, I is non-increasing (I is a monotone). • An operation is reversible if and only if I is unchanged (I is expert). • If we start with a maximal state and I → I - ε, then we can approximately reverse the operation.
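
A sketch computing the two quantities named on this slide: the von Neumann entropy S(ρ) = -tr(ρ log2 ρ) and the coherent information I = S(ρB) - S(ρAB), evaluated for a maximal state (a pure entangled state) and a minimal state (a product state). The helper functions are mine; base-2 logarithms are an arbitrary convention.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(ρ) = -tr(ρ log2 ρ), computed from the eigenvalues of ρ."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

def reduced_B(rho):
    """ρB = trA ρAB for a two-qubit state."""
    return rho.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)

def coherent_information(rho_AB):
    """I = S(ρB) - S(ρAB)."""
    return von_neumann_entropy(reduced_B(rho_AB)) - von_neumann_entropy(rho_AB)

phi = np.array([1, 0, 0, 1]) / np.sqrt(2)       # pure entangled state (maximal)
pure_entangled = np.outer(phi, phi)

ket00 = np.zeros(4); ket00[0] = 1.0             # product state |0>|0> (minimal)
product = np.outer(ket00, ket00)

print(coherent_information(pure_entangled))     # 1.0
print(coherent_information(product))            # 0.0
```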

  18. The most important slide in this talk • Our concept of information depends on: • The set of possible states of our system. • The set T of “possible operations” on our system. (T should be a semigroup with identity.) • The set T will determine what we mean by “information”. • In a given situation, it will be the limitations imposed on the set T that make things interesting.

  19. Three different information theories • I. A pair of systems AB with only local A-operations • Communication theory (message A + referent B) • II. Large systems with operations that only affect a few macroscopic degrees of freedom • Thermodynamics • III. A pair of quantum systems AB with local operations and classical communication (LOCC) • Quantum entanglement

  20. Local operations, classical communication. Composite quantum system AB; subsystems A and B are located in “separate laboratories”. • Operations in LOCC: • We may perform any quantum operations (including any measurement processes) on A and B separately. • We may exchange ordinary (classical) messages about the results of measurements. • If A and B were classical systems, these would be enough to do any operation at all, but not for quantum systems . . . .

  21. Entangled states. T = LOCC. • Minimal states: product states ρAB = ρA ⊗ ρB and, more generally, separable states ρAB = Σi pi ρiA ⊗ ρiB. • States that are not separable are called entangled states (example: a pure entangled state). • Bell’s theorem (J. Bell, 1964): the statistical correlations between entangled systems cannot be simulated by any separated classical systems (“quantum non-locality”).
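
The slides do not give a test for entanglement, but for two qubits the Peres-Horodecki (positive partial transpose) criterion decides it exactly, which makes a small illustrative check possible. The criterion itself is standard but is my addition here, not part of the talk.

```python
import numpy as np

def partial_transpose_B(rho):
    """Transpose subsystem B of a two-qubit density operator ρAB."""
    return rho.reshape(2, 2, 2, 2).transpose(0, 3, 2, 1).reshape(4, 4)

def is_entangled(rho):
    """Peres-Horodecki test: a two-qubit state is entangled iff its
    partial transpose has a negative eigenvalue."""
    return bool(np.min(np.linalg.eigvalsh(partial_transpose_B(rho))) < -1e-12)

phi = np.array([1, 0, 0, 1]) / np.sqrt(2)           # a pure entangled state
pure_entangled = np.outer(phi, phi)

separable = np.diag([0.5, 0.0, 0.0, 0.5])           # mixture of |00><00| and |11><11|

print(is_entangled(pure_entangled))   # True
print(is_entangled(separable))        # False
```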

  22. Monogamy of entanglement. Classical systems: the fact that B is correlated to A does not prevent B from being correlated to other systems. Many copies of A may exist, each with the same relation to the referent system B; we can even make copies of A. Quantum systems: if A and B are in a pure entangled state, then we know that there can be no other system in the whole Universe that is entangled with either A or B. Entanglement is monogamous. The fundamental difference between classical and quantum information?

  23. Copyable states. Start with the initial joint state ρAB (here A = A1). Introduce A2 in a standard state. Operate only on A1 and A2. Final state: a joint state of A1A2B whose A1B marginal and A2B marginal both equal the original ρAB. If we can do this, then we say that ρAB is “copyable” (on A). All copyable states are separable . . . but not all separable states are copyable! Some states are copyable on B but not on A, or vice versa.
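
A sketch of the copying procedure for one copyable example: a classically correlated (diagonal, separable) two-qubit state ρAB. A2 is adjoined in the standard state |0>, a CNOT acting only on A1A2 broadcasts the value of A1, and both the A1B and A2B marginals of the final state reproduce the original. The construction and helper names are mine.

```python
import numpy as np

ket0, ket1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
P0, P1 = np.outer(ket0, ket0), np.outer(ket1, ket1)
I2 = np.eye(2)

# Initial joint state on (A1, B): classically correlated, hence separable.
rho_A1B = 0.5 * np.kron(P0, P0) + 0.5 * np.kron(P1, P1)

# Adjoin A2 in the standard state |0>, ordering the factors as A1 ⊗ A2 ⊗ B.
rho_A1A2B = 0.5 * np.kron(P0, np.kron(P0, P0)) + 0.5 * np.kron(P1, np.kron(P0, P1))

# Operate only on A1 and A2: a CNOT (control A1, target A2), identity on B.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
U = np.kron(CNOT, I2)
rho_final = U @ rho_A1A2B @ U.T

def trace_out_one(rho, sys, n=3):
    """Trace out qubit number `sys` (0-based) from an n-qubit density operator."""
    rho = rho.reshape((2,) * (2 * n))
    rho = np.trace(rho, axis1=sys, axis2=sys + n)
    d = 2 ** (n - 1)
    return rho.reshape(d, d)

print(np.allclose(trace_out_one(rho_final, sys=1), rho_A1B))  # True: A1B marginal unchanged
print(np.allclose(trace_out_one(rho_final, sys=0), rho_A1B))  # True: the copy A2B matches too
```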

  24. Sharable states. Does there exist a joint state of A1 . . . An B whose AkB marginal equals ρAB for every k = 1, . . . , n? If this is possible, then we say that ρAB is “n-sharable” (on A). If this is possible for every integer n, we say that ρAB is ∞-sharable (or just plain “sharable”) on A.

  25. Sharable states. Copyability concerns our ability to make a copy; sharability concerns the possible existence of a copy. [Figure: nested sets of AB states: 1-sharable (all states) ⊃ 2-sharable ⊃ 3-sharable ⊃ . . . ⊃ ∞-sharable (“sharable”) ⊃ copyable.]

  26. Copyable, sharable, separable • All copyable quantum states are also sharable. (Pretty obvious; to show the existence of copies, we can simply make them.) • All separable AB states are sharable: if ρAB = Σi pi ρiA ⊗ ρiB, then Σi pi ρiA ⊗ · · · ⊗ ρiA ⊗ ρiB (n copies of ρiA) is an n-sharable extension for every n. • Two remarkable facts (BS & R. Werner; A. Doherty & F. Spedalieri): For any n, there is an n-sharable state that is not (n+1)-sharable. All sharable (∞-sharable) states are separable!
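
The construction behind “all separable states are sharable” can be spelled out numerically (a sketch under my own choice of separable state): place a copy of ρiA in each of A1 . . . An and keep ρiB on B; every AkB marginal then reproduces the original ρAB.

```python
import numpy as np

def tensor(*ops):
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

# A separable state ρAB = Σi pi ρiA ⊗ ρiB built from two (non-orthogonal) pure terms.
ket0 = np.array([1.0, 0.0])
ket_plus = np.array([1.0, 1.0]) / np.sqrt(2)
P0, Pplus = np.outer(ket0, ket0), np.outer(ket_plus, ket_plus)
p = [0.5, 0.5]
terms = [(P0, P0), (Pplus, Pplus)]                     # pairs (ρiA, ρiB)

rho_AB = sum(w * np.kron(a, b) for w, (a, b) in zip(p, terms))

# n-sharable extension on A1 ... An B:  Σi pi ρiA ⊗ ... ⊗ ρiA ⊗ ρiB  (n copies of ρiA).
n = 3
rho_ext = sum(w * tensor(*([a] * n + [b])) for w, (a, b) in zip(p, terms))

def marginal_AkB(rho, k, n):
    """Reduced state on (Ak, B) from a state on the n+1 qubits A1 ... An, B."""
    rho = rho.reshape((2,) * (2 * (n + 1)))
    for sys in reversed([j for j in range(n) if j != k]):
        half = rho.ndim // 2
        rho = np.trace(rho, axis1=sys, axis2=sys + half)
    return rho.reshape(4, 4)

print(all(np.allclose(marginal_AkB(rho_ext, k, n), rho_AB) for k in range(n)))   # True
```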

  27. Mappa mundi. [Figure: a “map” of AB states, with regions labelled: not 2-sharable on A or B (“really quantum”); finite sharability; ∞-sharable; copyable on A only; copyable on B only; copyable on both A and B (“really classical”).] • We must distinguish between: the ability to create copies (“copyability”); the possible existence of copies (“sharability”); finite and infinite sharability. • These distinctions are richer and far more interesting than simply “classical” versus “quantum”.

  28. What is computation? • Information processing (“computation”) is a physical process – that is, it is always realized by the dynamical evolution of a physical system. • How do we classify different computation processes? • When can we say that two evolutions “do the same computation”? • Key idea: One process can simulate another.

  29. Simulation. [Figure: ρ → E → E(ρ), and the same map realized as ρ → C → F → D.] We say that F simulates E on G if there exist C and D such that E(ρ) = D ∘ F ∘ C (ρ) for all ρ ∈ G.

  30. Simulation. [Figure: commutative diagram: G → E → E(G) along the top, and G → C → F → D → E(G) along the bottom.] We say that F simulates E on G if there exist C and D such that the above diagram commutes. N.B. This is a very “primitive” idea of simulation. It will require refinement for many specific applications!
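
A toy instance of the definition (my own example, not from the slides): let F be dephasing in the computational basis and E be dephasing in the |+>/|-> basis. With coding and decoding chosen as conjugation by the Hadamard gate, E = D ∘ F ∘ C on the set G of all single-qubit states, so F simulates E.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)        # Hadamard
P0, P1 = np.diag([1.0, 0.0]), np.diag([0.0, 1.0])
Pplus = np.array([[0.5, 0.5], [0.5, 0.5]])
Pminus = np.array([[0.5, -0.5], [-0.5, 0.5]])

def F(rho):                       # available process: dephasing in the Z basis
    return P0 @ rho @ P0 + P1 @ rho @ P1

def E(rho):                       # target process: dephasing in the X basis
    return Pplus @ rho @ Pplus + Pminus @ rho @ Pminus

C = lambda rho: H @ rho @ H       # coding: rotate the X basis onto the Z basis
D = lambda rho: H @ rho @ H       # decoding: rotate back

rng = np.random.default_rng(0)
M = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
rho = M @ M.conj().T
rho /= np.trace(rho)              # a random single-qubit state in G

print(np.allclose(E(rho), D(F(C(rho)))))   # True: the diagram commutes
```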

  31. Abstract and physical computation. [Figure: a physical computation realizes an abstract one: input of the abstract computer → state preparation of the physical device → dynamical evolution of the device → measurement on the final device state → result of the computation.]

  32. Communication. [Figure: the same simulation diagram, now on joint AB states ρAB; the AB interaction F accomplishes some communication task.] • It would be cheating to hide additional communication in “coding” and “decoding”. • Require C = CA ⊗ CB, D = DA ⊗ DB.

  33. Complexity. [Figure: the same simulation diagram.] • We wish to compare the “length” or “cost” of the processes. • Require that C and D be “short” or “cheap”.

  34. Computations and translations. [Figure: the same simulation diagram.] • Require that E, F ∈ C (“computations”). • Require that C, D ∈ T ⊆ C (“coding” and “decoding” operations). • Given C and T, when can F simulate E on G?

  35. Maximal and minimal operations. Simplest case: T = C = all quantum operations on a particular system. When can F simulate E? • Maximal operations: if F is unitary, then it can simulate any operation E. • Minimal operations: if E is constant (i.e., E(ρ) = σ for all ρ ∈ B), then it can be simulated by any F.
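
A sketch of the “maximal operations” claim: if F is unitary it is reversible, so for any target E we may take C to be the identity and D = E ∘ F⁻¹, and then D ∘ F ∘ C = E. The particular rotation and target channel below are illustrative choices of mine.

```python
import numpy as np

theta = 0.7
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

F = lambda rho: U @ rho @ U.T            # a unitary process (reversible)
F_inv = lambda rho: U.T @ rho @ U        # its inverse

def E(rho):
    """An arbitrary target operation; here, dephasing in the Z basis."""
    P0, P1 = np.diag([1.0, 0.0]), np.diag([0.0, 1.0])
    return P0 @ rho @ P0 + P1 @ rho @ P1

C = lambda rho: rho                      # trivial coding
D = lambda rho: E(F_inv(rho))            # decoding undoes F, then applies E

rho = np.array([[0.7, 0.2], [0.2, 0.3]])         # a test state
print(np.allclose(E(rho), D(F(C(rho)))))         # True: a unitary F simulates any E
```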

  36. Simulation monotones. Suppose X is a function of ρ and E such that X is a monotone for processes in C, and define X*(E) accordingly. Moral: F simulates E only if X*(E) ≤ X*(F).

  37. Some intuition. Information: states {ρ, σ, . . .}; allowed operations T; monotone M(ρ) (non-increasing under T). Computation: “computations” C; coding and decoding operations (T ⊆ C); simulation monotone X* (non-increasing under C). M is something like the “information content” of ρ with respect to T. X* is something like the “information capacity” of E with respect to C and T.

  38. Summing up • Information is physical, relational and fungible. • Our concept of information depends on the set T of operations that we may perform. • Information may be “preserved” (reversible) or “lost” (irreversible). Monotones can help us distinguish these situations. • Computation is based on the idea that we can simulate one process by another. “Capacity” quantities can help us distinguish whether this is possible.

  39. A few things not addressed • Asymptotic limits (large N, F → 1) • Quantifying resources required to perform “information” tasks • The “CC” part of LOCC • Measures of entanglement, fidelity and “nearness”, complexity of operations, etc. • “It from bit”, Bayesian approaches, etc. • Thermodynamics! • How I’m really going to explain all this to my friends.

  40. Finis
