
Quantum Shannon Theory





  1. Quantum Shannon Theory Patrick Hayden (McGill) http://www.cs.mcgill.ca/~patrick/QLogic2005.ppt 17 July 2005, Q-Logic Meets Q-Info

  2. Overview • Part I: • What is Shannon theory? • What does it have to do with quantum mechanics? • Some quantum Shannon theory highlights • Part II: • Resource inequalities • A skeleton key

  3. Information (Shannon) theory • A practical question: • How best to make use of a given communications resource? • A mathematico-epistemological question: • How to quantify uncertainty and information? • Shannon: • Solved the first by considering the second • A Mathematical Theory of Communication [1948]

  4. Quantifying uncertainty • Entropy: H(X) = -Σ_x p(x) log_2 p(x) • Proportional to the entropy of statistical physics • Term suggested by von Neumann (more on him soon) • Can arrive at the definition axiomatically: H(X,Y) = H(X) + H(Y) for independent X, Y, etc. • Operational point of view…
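
A minimal numerical illustration of the definition (my own sketch, not from the slides; assumes NumPy):

```python
import numpy as np

def shannon_entropy(p):
    """H(X) = -sum_x p(x) log2 p(x); zero-probability outcomes contribute nothing."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# A fair coin carries one full bit of uncertainty; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```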

  5. Compression • Source of independent copies of X: X_1, X_2, …, X_n • If X is binary: 0000100111010100010101100101, with about nP(X=0) 0's and nP(X=1) 1's • Of the 2^n possible strings in {0,1}^n, only ~2^{nH(X)} are typical • Can compress n copies of X to a binary string of length ~nH(X)
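
The counting claim can be checked by brute force for small n. A sketch of my own (the parameters n, p1, and the tolerance eps are arbitrary choices, not from the slides):

```python
from math import comb, log2

n, p1, eps = 20, 0.25, 0.05   # 20 i.i.d. bits with P(X=1) = 0.25; eps = typicality tolerance
H = -(p1 * log2(p1) + (1 - p1) * log2(1 - p1))

# Weak typicality: x^n is typical if -(1/n) log2 p(x^n) is within eps of H(X).
# All strings with the same number k of 1s share one probability.
typical = 0
for k in range(n + 1):
    logp = (k * log2(p1) + (n - k) * log2(1 - p1)) / n
    if abs(-logp - H) <= eps:
        typical += comb(n, k)

print(f"all strings:     2^n       = {2**n}")
print(f"typical strings:             {typical}")
print(f"target scale:    2^(nH(X)) ≈ {2**(n*H):.0f}")
```

The typical strings are exponentially fewer than all 2^n strings, which is exactly the room the compressor exploits.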

  6. Quantifying information • Information is that which reduces uncertainty • H(X|Y): uncertainty in X when the value of Y is known • H(X|Y) = H(X,Y) - H(Y) = E_Y H(X|Y=y) • I(X;Y) = H(X) - H(X|Y) = H(X) + H(Y) - H(X,Y) • [Figure: Venn diagram of H(X), H(Y), H(X,Y), H(X|Y), H(Y|X), I(X;Y)]
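
These identities are one-liners once a joint distribution is in hand. A sketch with a made-up joint distribution (uniform X, with Y a copy of X flipped with probability 0.1 — my example, not the slides'):

```python
import numpy as np

def H(p):
    """Shannon entropy of any array of probabilities (joint or marginal)."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Joint distribution p(x, y): rows index x, columns index y.
pxy = np.array([[0.45, 0.05],
                [0.05, 0.45]])

HX, HY, HXY = H(pxy.sum(axis=1)), H(pxy.sum(axis=0)), H(pxy)
print("H(X|Y) =", HXY - HY)        # chain rule: H(X|Y) = H(X,Y) - H(Y)
print("I(X;Y) =", HX + HY - HXY)   # information = reduction in uncertainty
```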

  7. Sending information through noisy channels • Statistical model of a noisy channel N: the conditional distribution p(y|x) • [Figure: m → Encoding → N → Decoding → m′] • Shannon's noisy coding theorem: in the limit of many uses, the optimal rate at which Alice can send messages reliably to Bob through N is given by the formula C(N) = max_{p(X)} I(X;Y)
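
For the binary symmetric channel the maximization can be done by brute force, recovering the textbook value C = 1 - h(p). A sketch of my own (the crossover probability 0.1 is an arbitrary example):

```python
import numpy as np

def H(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def mutual_info(px, channel):
    """I(X;Y) for input distribution px and channel matrix channel[x, y] = p(y|x)."""
    pxy = px[:, None] * channel
    return H(px) + H(pxy.sum(axis=0)) - H(pxy)

flip = 0.1
bsc = np.array([[1 - flip, flip],
                [flip, 1 - flip]])

# Grid-search the input distribution; the optimum is the uniform input.
qs = np.linspace(0.001, 0.999, 999)
print("C ≈", max(mutual_info(np.array([q, 1 - q]), bsc) for q in qs))  # ≈ 1 - h(0.1) ≈ 0.531
```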

  8. Shannon theory provides • Practically speaking: • A holy grail for error-correcting codes • Conceptually speaking: • An operationally-motivated way of thinking about correlations • What's missing (for a quantum mechanic)? • Features from linear structure: entanglement and non-orthogonality

  9. Quantum Shannon Theory provides • General theory of interconvertibility between different types of communications resources: qubits, cbits, ebits, cobits, sbits… • Relies on: • A major simplifying assumption: computation is free • A minor simplifying assumption: noise and data have regular structure

  10. Quantifying uncertainty • Let ρ = Σ_x p(x)|x⟩⟨x| be a density operator • von Neumann entropy: H(ρ) = -tr[ρ log ρ] • Equal to the Shannon entropy of the eigenvalues of ρ • Analog of a joint random variable: φ^{AB} describes a composite system A ⊗ B • H(A)_φ = H(φ^A) = H(tr_B φ^{AB})
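
A minimal sketch of the definition via eigenvalues (mine, assuming NumPy):

```python
import numpy as np

def von_neumann_entropy(rho):
    """H(rho) = -tr[rho log2 rho], computed from the eigenvalues of rho."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]            # zero eigenvalues contribute nothing
    return float(-np.sum(ev * np.log2(ev)))

print(von_neumann_entropy(np.diag([1.0, 0.0])))  # pure state: 0 bits
print(von_neumann_entropy(np.eye(2) / 2))        # maximally mixed qubit: 1 bit
```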

  11. Compression • No statistical assumptions: just quantum mechanics! • Source of independent copies of φ^{AB}: φ^{AB} ⊗ φ^{AB} ⊗ … ⊗ φ^{AB} • dim(effective support of φ_B^{⊗n}) ~ 2^{nH(B)} (aka the typical subspace) • Can compress n copies of B to a system of ~nH(B) qubits while preserving correlations with A [Schumacher, Petz]

  12. Quantifying information • H(A|B): uncertainty in A when the value of B is known? • H(A|B) = H(AB) - H(B) • Example: |Φ⟩^{AB} = (|0⟩^A|0⟩^B + |1⟩^A|1⟩^B)/√2, for which φ^B = I/2 • H(A|B) = 0 - 1 = -1 • Conditional entropy can be negative!
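
A sketch verifying the -1 numerically for the Bell state (my own, assuming NumPy):

```python
import numpy as np

def S(rho):
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))

def trace_out_A(rho_ab, dA=2, dB=2):
    """Partial trace over the first factor of a (dA*dB)-dimensional density matrix."""
    return np.einsum('ijik->jk', rho_ab.reshape(dA, dB, dA, dB))

phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)          # (|00> + |11>)/sqrt(2)
rho_ab = np.outer(phi, phi)

print("H(AB)  =", S(rho_ab))              # 0: the global state is pure
print("H(B)   =", S(trace_out_A(rho_ab))) # 1: the marginal is maximally mixed
print("H(A|B) =", S(rho_ab) - S(trace_out_A(rho_ab)))  # -1
```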

  13. Quantifying information • Information is that which reduces uncertainty • H(A|B) = H(AB) - H(B) • I(A;B) = H(A) - H(A|B) = H(A) + H(B) - H(AB) ≥ 0 • [Figure: Venn diagram of H(A), H(B), H(AB), H(A|B), H(B|A), I(A;B)]

  14. Data processing inequality (strong subadditivity) • [Figure: Alice and Bob share a state ρ; as time passes, Bob applies a local operation U, yielding a state σ] • Local processing cannot increase correlations: I(A;B)_ρ ≥ I(A;B)_σ
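
A sketch checking the inequality for one concrete case (my example: Bob's half of a Bell pair through a depolarizing channel; assumes NumPy):

```python
import numpy as np

def S(rho):
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))

def mutual_info(rho_ab, dA=2, dB=2):
    r = rho_ab.reshape(dA, dB, dA, dB)
    rho_a = np.einsum('ijkj->ik', r)      # trace out B
    rho_b = np.einsum('ijik->jk', r)      # trace out A
    return S(rho_a) + S(rho_b) - S(rho_ab)

def depolarize_B(rho_ab, p, dA=2, dB=2):
    """Local channel on B: keep with probability 1-p, replace by I/2 with probability p."""
    rho_a = np.einsum('ijkj->ik', rho_ab.reshape(dA, dB, dA, dB))
    return (1 - p) * rho_ab + p * np.kron(rho_a, np.eye(dB) / dB)

phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)           # Bell pair
rho = np.outer(phi, phi)

print(mutual_info(rho))                    # 2 bits for a Bell pair
print(mutual_info(depolarize_B(rho, 0.5))) # strictly smaller, never larger
```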

  15. Sending classical information through noisy channels • Physical model of a noisy channel: a trace-preserving, completely positive (TPCP) map N • [Figure: m → Encoding (ρ state) → N → Decoding (measurement) → m′] • HSW noisy coding theorem: in the limit of many uses, the optimal rate at which Alice can send messages reliably to Bob through N is given by the (regularization of the) formula C(N) = max_{{p(x),ρ_x}} I(X;B), where I(X;B) is evaluated on the classical-quantum state σ^{XB} = Σ_x p(x)|x⟩⟨x|^X ⊗ N(ρ_x)^B
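
On a cq state, I(X;B) reduces to the Holevo quantity χ = H(average output) - average H(output). A sketch evaluating it for one ensemble (my example: |0⟩, |1⟩ equiprobable through a depolarizing channel; assumes NumPy):

```python
import numpy as np

def S(rho):
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))

def depolarize(rho, p):
    return (1 - p) * rho + p * np.eye(2) / 2

# Ensemble {p(x), rho_x}: |0><0| and |1><1| with equal probability.
probs = [0.5, 0.5]
outputs = [depolarize(np.diag([1.0, 0.0]), 0.2),
           depolarize(np.diag([0.0, 1.0]), 0.2)]

avg = sum(p * r for p, r in zip(probs, outputs))
chi = S(avg) - sum(p * S(r) for p, r in zip(probs, outputs))
print("I(X;B) =", chi)   # ≈ 0.531 = 1 - h(0.1): on basis states this channel acts like a BSC(0.1)
```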

  16. Sending classical information through noisy channels • [Figure: packing argument for the HSW theorem] • The codeword outputs for X_1, X_2, …, X_n live in B^{⊗n}, whose effective support has size ~2^{nH(B)}; each codeword's output occupies a subspace of size ~2^{nH(B|A)} • Packing disjoint output blobs therefore allows ~2^{n(H(B)-H(B|A))} = 2^{nI(A;B)} distinguishable messages

  17. Sending quantum information through noisy channels • Physical model of a noisy channel: a trace-preserving, completely positive (TPCP) map N • [Figure: |ψ⟩ ∈ C^d → Encoding (TPCP map) → N → Decoding (TPCP map) → ψ′] • LSD noisy coding theorem: in the limit of many uses, the optimal rate at which Alice can reliably send qubits to Bob ((1/n) log d) through N is given by the (regularization of the) formula Q(N) = max_φ [-H(A|B)] — a conditional entropy! — where H(A|B) is evaluated on σ^{AB} = (I ⊗ N)(φ^{AA′}) for a pure input φ^{AA′}
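
A sketch evaluating the one-shot quantity -H(A|B) (the coherent information), ignoring regularization, for the amplitude-damping channel (my example; the damping strength 0.2 and the diagonal input family are arbitrary choices; assumes NumPy):

```python
import numpy as np

def S(rho):
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))

def coherent_info(kraus, q):
    """I_c = H(B) - H(AB) = -H(A|B): purify the input diag(1-q, q) with a
    reference qubit A, send A' through the channel, leave A untouched."""
    psi = np.diag([np.sqrt(1 - q), np.sqrt(q)])   # amplitudes on A ⊗ A'
    rho_ab = np.zeros((4, 4))
    for K in kraus:                               # rho_AB = sum_k (I⊗K)|psi><psi|(I⊗K)†
        v = (psi @ K.T).reshape(4)
        rho_ab = rho_ab + np.outer(v, v.conj())
    rho_b = np.einsum('ijik->jk', rho_ab.reshape(2, 2, 2, 2))
    return S(rho_b) - S(rho_ab)

gamma = 0.2                                       # amplitude-damping strength
kraus = [np.array([[1.0, 0.0], [0.0, np.sqrt(1 - gamma)]]),
         np.array([[0.0, np.sqrt(gamma)], [0.0, 0.0]])]

qs = np.linspace(0.001, 0.999, 999)
print("max_q I_c ≈", max(coherent_info(kraus, q) for q in qs))
```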

  18. Entanglement and privacy: More than an analogy • How to send a private message from Alice to Bob? • Wiretap channel p(y,z|x): Alice sends x = x_1 x_2 … x_n, Bob receives y = y_1 y_2 … y_n, the eavesdropper receives z = z_1 z_2 … z_n • [Figure: from all strings x, choose ~2^{n(I(X;Y)-ε)} random codewords, grouped into sets of size 2^{n(I(X;Z)+ε)}] • Can send private messages at rate I(X;Y) - I(X;Z) [AC93]
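
A sketch evaluating the rate formula for a toy degraded wiretap channel of my own choosing (Bob sees X through a BSC(0.05), Eve through a BSC(0.2); the input is fixed uniform rather than optimized; assumes NumPy):

```python
import numpy as np

def H(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def mutual_info(px, channel):
    pxy = px[:, None] * channel          # channel[x, y] = p(y|x)
    return H(px) + H(pxy.sum(axis=0)) - H(pxy)

def bsc(f):
    return np.array([[1 - f, f], [f, 1 - f]])

px = np.array([0.5, 0.5])
rate = mutual_info(px, bsc(0.05)) - mutual_info(px, bsc(0.2))
print("private rate I(X;Y) - I(X;Z) ≈", rate)
```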

  19. Entanglement and privacy: More than an analogy • Quantum version: the channel's isometric extension U^{A′→BE} produces the outputs |ψ_x⟩^{BE} = (U^{A′→BE})^{⊗n}|x⟩^{A′} • [Figure: from all strings x, choose ~2^{n(I(X:A)-ε)} random codewords, grouped into sets of size 2^{n(I(X:E)+ε)}] • Can send private messages at rate I(X:A) - I(X:E) [D03]

  20. Entanglement and privacy: More than an analogy • Run the same construction coherently on a superposition: Σ_x p_x^{1/2}|x⟩^A|x⟩^{A′} → (U^{A′→BE})^{⊗n} → Σ_x p_x^{1/2}|x⟩^A|ψ_x⟩^{BE} • Since the global state is pure, H(E) = H(AB) • Can send private messages at rate I(X:A) - I(X:E) = H(A) - H(E) [SW97, D03]

  21. Notions of distinguishability • Basic requirement: quantum channels do not increase "distinguishability" • Fidelity: F(ρ,σ) = {tr[(ρ^{1/2} σ ρ^{1/2})^{1/2}]}^2 = max |⟨φ_ρ|φ_σ⟩|^2, maximized over purifications • F = 0 for perfectly distinguishable states, F = 1 for identical states • Trace distance: T(ρ,σ) = ‖ρ - σ‖_1 = 2 max |p(k=0|ρ) - p(k=0|σ)|, maximized over POVMs {M_k} • T = 2 for perfectly distinguishable states, T = 0 for identical states • Monotonicity: F(N(ρ),N(σ)) ≥ F(ρ,σ) and T(ρ,σ) ≥ T(N(ρ),N(σ)) • Statements made today hold for both measures
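
A sketch computing both measures and checking monotonicity under a depolarizing channel (my example; assumes NumPy):

```python
import numpy as np

def psd_sqrt(m):
    """Square root of a positive semidefinite matrix via its eigendecomposition."""
    w, v = np.linalg.eigh(m)
    return (v * np.sqrt(np.clip(w, 0, None))) @ v.conj().T

def fidelity(rho, sigma):
    s = psd_sqrt(rho)
    return float(np.real(np.trace(psd_sqrt(s @ sigma @ s))) ** 2)

def trace_distance(rho, sigma):
    return float(np.sum(np.abs(np.linalg.eigvalsh(rho - sigma))))

def depolarize(rho, p):
    return (1 - p) * rho + p * np.eye(2) / 2

rho = np.diag([1.0, 0.0])         # |0><0|
sigma = np.full((2, 2), 0.5)      # |+><+|

# A channel can only blur the two states together: F rises, T falls.
print(fidelity(rho, sigma), trace_distance(rho, sigma))
print(fidelity(depolarize(rho, 0.3), depolarize(sigma, 0.3)),
      trace_distance(depolarize(rho, 0.3), depolarize(sigma, 0.3)))
```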

  22. Conclusions: Part I • Information theory can be generalized to analyze quantum information processing • Yields a rich theory of surprising conceptual simplicity • Provides an operational approach to thinking about quantum mechanics: compression, data transmission, superdense coding, subspace transmission, teleportation

  23. Some references: Part I: Standard textbooks: * Cover & Thomas, Elements of information theory. * Nielsen & Chuang, Quantum computation and quantum information. (and references therein) Part II: Papers available at arxiv.org: * Devetak, The private classical capacity and quantum capacity of a quantum channel, quant-ph/0304127 * Devetak, Harrow & Winter, A family of quantum protocols, quant-ph/0308044. * Horodecki, Oppenheim & Winter, Quantum information can be negative, quant-ph/0505062
