
# Claude Shannon meets Quantum Mechanics: An Introduction to Quantum Shannon Theory

Mark M. Wilde, George Washington University, Math Department Seminar, Friday, September 18, 2009

## The Quantum Revolution

Quantum theory was developed from 1900 to 1925 (pictured: the Solvay Conference in Brussels, 1927). Ideas such as indeterminism, Heisenberg uncertainty, superposition, interference, and entanglement are part of quantum theory.

## The Information Revolution

In 1948, Claude Shannon revolutionized the theory of information storage and transmission with a breakthrough publication, earning him the title "Einstein of the Information Age."

## The Quantum Information Revolution

Shor, Bennett, Holevo, Schumacher, Westmoreland. This is called "The Second Quantum Revolution" or "The Second Information Revolution." Ideas such as teleportation, superdense coding, the Schumacher qubit, quantum compression, and the capacity of a quantum channel are important here. The field is named Quantum Shannon Theory.

## Overview

- The physical bit vs. the Shannon bit
- The idea of typical sequences
- Shannon's noiseless and noisy coding theorems
- Quantum information
- The density operator formalism
- The physical qubit vs. the Schumacher qubit
- The idea of typical subspaces
- The quantum noiseless and noisy coding theorems

## Physical Bit vs. Shannon Bit

Examples of physical bits: the voltage level in a transistor, or the collective spin state on a magnetic disk. The Shannon bit, by contrast, is:

- Independent of the physical medium
- A measure of the uncertainty of a random variable (units are bits)

## Information and Entropy

Information content is a measure of surprise. Given a random variable $X$ with outcome $x$, the surprise is

$$i(x) = -\log p(x),$$

and the entropy is the expected surprise:

$$H(X) = \mathbb{E}[i(X)] = -\sum_x p(x) \log p(x).$$

## Entropic Quantities

Given two random variables $X$ and $Y$:

- Joint entropy: $H(X,Y) = -\sum_{x,y} p(x,y) \log p(x,y)$
- Conditional entropy: $H(X|Y) = H(X,Y) - H(Y)$
- Mutual information: $I(X;Y) = H(X) + H(Y) - H(X,Y)$

## Why are Entropies Important?

Entropies are the answer to operational questions, such as classical compression and classical channel coding.

## What is Capacity?

The ultimate rate at which two parties can communicate or perform some given task.
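Before turning to the capacity theorem, the entropic quantities above can be checked numerically. A minimal sketch in plain Python (the joint distribution below, a uniform input over a binary symmetric channel with crossover probability 0.1, is an illustrative choice, not from the talk):

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# Joint distribution p(x, y): uniform input over a binary
# symmetric channel with crossover probability 0.1.
eps = 0.1
joint = {(0, 0): 0.5 * (1 - eps), (0, 1): 0.5 * eps,
         (1, 0): 0.5 * eps,       (1, 1): 0.5 * (1 - eps)}

# Marginals p(x) and p(y).
px = [sum(v for (x, _), v in joint.items() if x == b) for b in (0, 1)]
py = [sum(v for (_, y), v in joint.items() if y == b) for b in (0, 1)]

H_X = entropy(px)                     # 1 bit (uniform input)
H_Y = entropy(py)                     # 1 bit (symmetric channel)
H_XY = entropy(list(joint.values()))  # joint entropy H(X, Y)

# Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y).
I_XY = H_X + H_Y - H_XY
print(I_XY)  # ≈ 0.531 bits, i.e. 1 - H2(0.1)
```

For this symmetric channel the uniform input is optimal, so the printed value is also the channel capacity discussed in the next slides.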
The capacity theorem has two parts:

- Direct coding theorem: for any rate below capacity (above the compression rate), there exists a coding scheme that achieves that rate with vanishing error.
- Converse theorem: for any coding scheme with vanishing error, its rate is below capacity (above the compression rate).

## The Idea of Typical Sequences

Model an information source as a large number of IID random variables $X_1, \ldots, X_n$, and suppose the source outputs a particular realization $x^n = x_1 \cdots x_n$. The sample entropy is

$$-\frac{1}{n} \log p(x^n),$$

and a typical sequence is one for which the sample entropy is close to the true entropy $H(X)$.

## Asymptotic Equipartition Property

By the law of large numbers, the sample entropy of a random sequence approaches the true entropy as $n$ becomes large:

$$-\frac{1}{n} \log p(X^n) \to H(X).$$

Important implications: the typical set carries almost all of the probability, and each typical sequence has probability approximately $2^{-nH(X)}$.

## Shannon's Noiseless Coding Theorem

The typical set has size approximately $2^{nH(X)}$. Encoding is a mapping that is invertible on the typical set, so the compression rate is the entropy $H(X)$. Proof idea: keep only the typical sequences (throw away the atypical sequences).

## Simple Noisy Channel Model

Alice inputs a letter $x$, and Bob gets letter $y$ with probability $p_{Y|X}(y|x)$. Model the channel as this conditional probability density. How much information can Alice transmit to Bob?

## The Idea of Conditional Typicality

Similar to the idea of typical sequences: a given input sequence stochastically maps to an output sequence in a conditionally typical set with high probability.

## Shannon's Channel Code Idea

Alice chooses codewords $x^n$ randomly according to a distribution $p_X(x)$. The induced distribution for Bob is $p_Y(y) = \sum_x p_X(x)\, p_{Y|X}(y|x)$. Bob's output sequences are typical according to $p_Y(y)$; given a particular sequence $x^n$, the output sequence is conditionally typical according to $p_{Y|X}(y|x)$.

## A Random Packing Argument

Bob can distinguish about $2^{nI(X;Y)}$ signals.

## Achieving Capacity

The rate $R$ of the code is the number of message bits per channel use, and we have freedom in choosing the distribution $p_X(x)$ for the code. The best code then has rate

$$C = \max_{p_X(x)} I(X;Y).$$

## The Physical Quantum Bit

A quantum bit can be $|0\rangle$ or $|1\rangle$ (like a classical bit), but it can also be any superposition of these states (unlike a classical bit):

$$|\psi\rangle = \alpha|0\rangle + \beta|1\rangle.$$

Physical examples: electron spin, photon polarization.
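The physical quantum bit above can be sketched with plain Python complex arithmetic. An illustrative aside (the specific state $|+\rangle$ is my choice, not from the slides):

```python
import math

# A qubit |psi> = alpha|0> + beta|1> as a 2-component complex vector.
# Here: the |+> state, an equal superposition (alpha = beta = 1/sqrt(2)).
alpha = 1 / math.sqrt(2)
beta = 1 / math.sqrt(2)
psi = [complex(alpha), complex(beta)]

# A valid state is normalized: |alpha|^2 + |beta|^2 = 1.
norm = sum(abs(a) ** 2 for a in psi)

# Measuring in the computational basis gives outcome 0 or 1 with
# probabilities |alpha|^2 and |beta|^2 (the Born rule).
p0 = abs(psi[0]) ** 2
p1 = abs(psi[1]) ** 2
print(norm, p0, p1)  # 1.0, 0.5, 0.5 (up to floating-point rounding)
```

Unlike a classical bit, the amplitudes may be any complex pair with $|\alpha|^2 + |\beta|^2 = 1$, which is what makes superposition and interference possible.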
## Physical Qubit vs. Schumacher Qubit

The Schumacher qubit is:

- Independent of the physical medium
- A measure of the uncertainty of a quantum state (units are qubits)

## A Qubit Ensemble

A quantum information source outputs quantum states with a certain probability: $\{p_x, |\psi_x\rangle\}$.

## The Density Operator Formalism

We can calculate all physical quantities (including probabilities) using the density operator formalism:

$$\rho = \sum_x p_x |\psi_x\rangle\langle\psi_x|.$$

Different ensembles can have the same density operator.

## The Spectral Decomposition

The density operator is Hermitian, so we can diagonalize it:

$$\rho = \sum_z \lambda_z |z\rangle\langle z|.$$

The canonical ensemble is $\{\lambda_z, |z\rangle\}$, and this ensemble is essentially classical!

## Von Neumann Entropy

The von Neumann entropy gives the uncertainty of a state:

$$S(\rho) = -\mathrm{Tr}(\rho \log \rho).$$

It is equal to the Shannon entropy of the canonical distribution $\{\lambda_z\}$: a formal generalization of the Shannon entropy.

## Quantum Information Source

Suppose a quantum information source outputs a large number of quantum states. Describe the state as $\rho^{\otimes n}$.

## The Idea of Typical Subspaces

Borrow Shannon's idea of typical sequences and apply it to a quantum information source. The typical subspace is supported by all vectors with indices lying in the typical set; the projector onto the typical subspace is

$$\Pi^n_\delta = \sum_{z^n \in T^n_\delta} |z^n\rangle\langle z^n|.$$

The projector defines a quantum measurement.

## Schumacher Compression

An encoding isometry maps the typical subspace into a smaller space, and the compression rate is the von Neumann entropy $S(\rho)$.

## Noisy Quantum Channel Model

Alice inputs a density operator $\rho$, and Bob gets the density operator $\mathcal{N}(\rho)$. Model the channel as a completely positive, trace-preserving map $\mathcal{N}$. How much information can Alice transmit to Bob?

## Sending Classical Information over a Quantum Channel

("Hey, that's my idea!") The coding strategy is similar to that for the classical case:

- Code randomly with an ensemble of the form $\{p_x, \rho_x\}$; channel input states are product states
- Use the channel many times so that the law of large numbers comes into play
- Allow for a small error, but show that the error vanishes with large block length

Holevo, IEEE Trans. Inf. Theory, 44, 269-273 (1998).
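The density operator formalism and von Neumann entropy above can be computed for a small example. A sketch in plain Python (the ensemble $\{|0\rangle, |+\rangle\}$ with equal probabilities is an illustrative choice, not from the slides):

```python
import math

def outer(v):
    """Outer product |v><v| of a 2-component real vector."""
    return [[v[i] * v[j] for j in range(2)] for i in range(2)]

# Ensemble: |0> with probability 1/2, |+> = (|0> + |1>)/sqrt(2) with 1/2.
ket0 = [1.0, 0.0]
ketp = [1 / math.sqrt(2), 1 / math.sqrt(2)]

# Density operator rho = sum_x p_x |psi_x><psi_x|.
rho = [[0.5 * outer(ket0)[i][j] + 0.5 * outer(ketp)[i][j]
        for j in range(2)] for i in range(2)]

# Spectral decomposition of a 2x2 Hermitian matrix via trace and
# determinant: the eigenvalues are (T ± sqrt(T^2 - 4D)) / 2.
T = rho[0][0] + rho[1][1]
D = rho[0][0] * rho[1][1] - rho[0][1] * rho[1][0]
r = math.sqrt(T * T - 4 * D)
lam = [(T + r) / 2, (T - r) / 2]

# Von Neumann entropy S(rho) = -sum_z lambda_z log2 lambda_z:
# the Shannon entropy of the canonical (eigenvalue) distribution.
S = -sum(l * math.log2(l) for l in lam if l > 0)
print(S)  # ≈ 0.601 qubits
```

Note that $S \approx 0.601$ is strictly less than the 1 bit of Shannon entropy in the classical label, because the two states are non-orthogonal; this gap is what Schumacher compression exploits.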
Schumacher & Westmoreland, PRA, 56, 131-138 (1997).

## Sending Classical Information over a Quantum Channel (ctd.)

The encoder just maps a classical signal to a tensor product state; the decoder performs a measurement over all the output states to determine the transmitted classical signal.

One can achieve the following rate (in bits per channel use), the Holevo information of the channel:

$$\chi(\mathcal{N}) = \max_{\{p_x, \rho_x\}} \left[ S\!\left(\mathcal{N}\!\left(\sum_x p_x \rho_x\right)\right) - \sum_x p_x\, S(\mathcal{N}(\rho_x)) \right].$$

The capacity of the channel with product input states is $\chi(\mathcal{N})$. Single-letterize! The capacity with entangled input states is the regularization $\lim_{n \to \infty} \frac{1}{n}\, \chi(\mathcal{N}^{\otimes n})$.

## Breaking the Additivity Conjecture

Can entanglement help for encoding classical information? Yes!

## Other Notions of Capacity

- Quantum capacity (the LSD, i.e., Lloyd-Shor-Devetak, coding theorem)
- Entanglement-assisted classical and quantum capacities
- Trade-off problems

THANK YOU!
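As a closing worked example, the Holevo information of a fixed ensemble can be computed directly. A sketch in plain Python (the ensemble below, a mixed diagonal state together with the pure state $|+\rangle\langle+|$ sent through a noiseless channel, is an illustrative choice, not from the talk):

```python
import math

def eig2(m):
    """Eigenvalues of a 2x2 Hermitian matrix (real entries here)."""
    T = m[0][0] + m[1][1]
    D = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    r = math.sqrt(max(T * T - 4 * D, 0.0))
    return [(T + r) / 2, (T - r) / 2]

def vn_entropy(m):
    """Von Neumann entropy in bits via the eigenvalues."""
    return -sum(l * math.log2(l) for l in eig2(m) if l > 1e-12)

# Ensemble {p_x, rho_x}: a mixed state and the pure state |+><+|,
# each with probability 1/2 (identity channel, for illustration).
rho0 = [[0.9, 0.0], [0.0, 0.1]]
rho1 = [[0.5, 0.5], [0.5, 0.5]]
p = [0.5, 0.5]

# Average state rho_bar = sum_x p_x rho_x.
rho_bar = [[p[0] * rho0[i][j] + p[1] * rho1[i][j]
            for j in range(2)] for i in range(2)]

# Holevo information chi = S(rho_bar) - sum_x p_x S(rho_x):
# an achievable rate, in bits per channel use, for sending
# classical information with this fixed ensemble.
chi = vn_entropy(rho_bar) - (p[0] * vn_entropy(rho0) + p[1] * vn_entropy(rho1))
print(chi)  # ≈ 0.445 bits per channel use
```

Maximizing this quantity over all input ensembles would give the product-state capacity $\chi(\mathcal{N})$ from the slides; here we only evaluate one ensemble.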