
Entropy in the Quantum World



  1. Entropy in the Quantum World Panagiotis Aleiferis, EECS 598, Fall 2001

  2. Outline • Entropy in the classical world • Theoretical background • Density matrix • Properties of the density matrix • The reduced density matrix • Shannon’s entropy • Entropy in the quantum world • Definition and basic properties • Some useful theorems • Applications • Entropy as a measure of entanglement • References

  3. Entropy in the classical world • Murphy’s Laws: If something can go wrong, it will! The more we complicate the plan, the greater the chance of failure. Nothing is ever so bad that it can't get worse.

  4. Why does heat always flow from warm to cold? 1st law of thermodynamics: ΔU = ΔQ − ΔW. 2nd law of thermodynamics: “There is some degradation of the total energy U in the system, some non-useful heat, in any thermodynamic process.” Rudolf Clausius (1822 - 1888)

  5. The more disordered the energy, the less useful it can be! “When energy is degraded, the atoms become more disordered, the entropy increases!” “At equilibrium, the system will be in its most probable state and the entropy will be maximum.” Ludwig Boltzmann (1844 - 1906)

  6. All possible microstates of 4 coins: four heads (1 way), three heads/one tail (4 ways), two heads/two tails (6 ways), one head/three tails (4 ways), four tails (1 way). The two-heads/two-tails macrostate has the most microstates, so it is the most probable.
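One quick way to see this is to enumerate the microstates directly. A minimal Python sketch (illustrative, not from the slides) that counts the multiplicity W of each macrostate and evaluates Boltzmann's S = kB ln(W), in units of kB:

```python
from itertools import product
from collections import Counter
from math import log

# Enumerate all 2^4 = 16 microstates of 4 coins and group them by
# macrostate (number of heads); W is the multiplicity of each macrostate.
counts = Counter(sum(flips) for flips in product((0, 1), repeat=4))

for heads in sorted(counts):
    W = counts[heads]
    # Boltzmann's formula S = k_B ln W, here in units of k_B.
    print(f"{heads} heads: W = {W:2d}, S/k_B = {log(W):.3f}")
```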

  7. Boltzmann statistics – 5 dipoles in external field

  8. General relations of Boltzmann statistics • For a system in equilibrium at temperature T, the probability of a microstate with energy Ei is Pi = e^(−Ei/kBT)/Z, where Z = Σi e^(−Ei/kBT) is the partition function. • Statistical entropy: S = −kB Σi Pi ln(Pi).
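These relations are easy to check numerically. A sketch for the 5-dipole system of the previous slide, assuming an arbitrary field strength μB/(kBT) = 1 (the actual parameters from the slides are not in this transcript):

```python
from itertools import product
from math import exp, log

# Boltzmann statistics for 5 dipoles in an external field: each dipole
# has energy -mu*B (aligned) or +mu*B (anti-aligned). The ratio
# x = mu*B/(k_B*T) = 1.0 is an illustrative assumption.
x = 1.0

# Enumerate all 2^5 = 32 microstates; s = +1 aligned, -1 anti-aligned.
energies = [-x * sum(state) for state in product((+1, -1), repeat=5)]

Z = sum(exp(-E) for E in energies)       # partition function
P = [exp(-E) / Z for E in energies]      # P_i = e^{-E_i/(k_B T)} / Z
S = -sum(p * log(p) for p in P)          # S/k_B = -sum_i P_i ln P_i
print(f"Z = {Z:.3f}   S/k_B = {S:.3f}")
```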

  9. Theoretical Background • The density matrix ρ • In most cases we do NOT completely know the exact state of the system. We can estimate the probabilities Pi that the system is in the states |ψi>. • Our system is in an “ensemble” of pure states {Pi,|ψi>}.

  10. Define the density matrix of the ensemble: ρ ≡ Σi Pi |ψi><ψi|. Since the probabilities sum to one, tr(ρ) = Σi Pi = 1.

  11. Properties of the density matrix • tr(ρ)=1 • ρ is a positive operator (positive means that <φ|ρ|φ> is real and non-negative for every |φ>) • if a unitary operator U is applied, the density matrix transforms as ρ → UρU† • ρ corresponds to a pure state, if and only if tr(ρ²)=1 • ρ corresponds to a mixed state, if and only if tr(ρ²)<1
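A minimal numpy sketch of these properties, for an illustrative two-state ensemble (the ensemble {3/4: |0>, 1/4: |+>} is an assumption made up for the example):

```python
import numpy as np

# Build rho = sum_i P_i |psi_i><psi_i| for the ensemble {3/4: |0>, 1/4: |+>}.
ket0 = np.array([1, 0], dtype=complex)
ketp = np.array([1, 1], dtype=complex) / np.sqrt(2)   # |+>

rho = 0.75 * np.outer(ket0, ket0.conj()) + 0.25 * np.outer(ketp, ketp.conj())

print(np.trace(rho).real)                    # 1.0   -> tr(rho) = 1
print(np.all(np.linalg.eigvalsh(rho) >= 0))  # True  -> positive operator
print(np.trace(rho @ rho).real)              # < 1   -> mixed state
```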

  12. If we choose the energy eigenfunctions as our basis set, then H and ρ are both diagonal, i.e. ρnm = Pn δnm • In any other representation ρ may or may not be diagonal, but it is always Hermitian, i.e. ρmn = ρnm*. Detailed balance is essential so that equilibrium is maintained (i.e. the probabilities do NOT explicitly depend on time).

  13. The reduced density matrix • What happens if we want to describe a subsystem of a composite system? • Divide our system AB into parts A and B. • Reduced density matrix for the subsystem A: ρA = trB(ρAB), where trB is the “partial trace over subsystem B”, i.e. the trace over the subspace of system B.
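A small numpy sketch of the partial trace for two qubits; the helper function and the Bell-state example are illustrative assumptions, not from the slides:

```python
import numpy as np

def partial_trace_B(rho_AB, dA=2, dB=2):
    """rho_A = tr_B(rho_AB) for a (dA*dB x dA*dB) density matrix."""
    rho = rho_AB.reshape(dA, dB, dA, dB)
    return np.trace(rho, axis1=1, axis2=3)   # sum over the B indices

# Bell state (|00> + |11>)/sqrt(2): a pure, maximally entangled state.
bell = np.zeros(4, dtype=complex)
bell[0] = bell[3] = 1 / np.sqrt(2)
rho_AB = np.outer(bell, bell.conj())

print(partial_trace_B(rho_AB).real)   # -> I/2, the completely mixed state
```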

  14. Shannon’s entropy • Definition • How much information do we gain, on average, when we learn the value of a random variable X? Or, equivalently: what is the uncertainty, on average, about X before we learn its value? • If {p1, p2, …, pn} is the probability distribution of the n possible values of X: H(X) = −Σi pi log2(pi)

  15. By definition: 0 log2(0) ≡ 0 (events with zero probability do not contribute to the entropy). • The entropy H(X) depends only on the respective probabilities of the individual events xi! • Why is the entropy defined this way? Because it gives the minimal physical resources required to store information so that the information can be reconstructed at a later time - “Shannon’s noiseless coding theorem”.
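As a sketch, the definition translates directly into Python (with the 0 log2(0) = 0 convention built in):

```python
from math import log2

def shannon_entropy(probs):
    """H(X) = -sum p_i log2 p_i, with the convention 0 log2 0 = 0."""
    s = sum(p * log2(p) for p in probs if p > 0)
    return -s if s else 0.0

print(shannon_entropy([0.5, 0.5]))            # 1 bit: a fair coin
print(shannon_entropy([1.0, 0.0]))            # 0 bits: no uncertainty
print(shannon_entropy([1/2, 1/4, 1/8, 1/8]))  # 1.75 bits (next slide)
```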

  16. Example of Shannon’s noiseless coding theorem Code 4 symbols {1, 2, 3, 4} with probabilities {1/2, 1/4, 1/8, 1/8}. Code without compression: 2 bits per symbol (e.g. 1→00, 2→01, 3→10, 4→11), so the average string length is 2 bits. But what happens if we use a variable-length code instead, e.g. 1→0, 2→10, 3→110, 4→111? Average string length for the second code: (1/2)·1 + (1/4)·2 + (1/8)·3 + (1/8)·3 = 1.75 bits. Note: this is exactly H(X) = 1.75 bits!!!
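A quick numerical check of this example; the specific codewords are one illustrative prefix-free choice with lengths 1, 2, 3, 3:

```python
from math import log2

# Average codeword length vs. Shannon entropy for the slide's example.
probs = {"1": 1/2, "2": 1/4, "3": 1/8, "4": 1/8}
code = {"1": "0", "2": "10", "3": "110", "4": "111"}  # prefix-free

avg_len = sum(p * len(code[s]) for s, p in probs.items())
H = -sum(p * log2(p) for p in probs.values())
print(f"average length = {avg_len} bits, H(X) = {H} bits")   # both 1.75
```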

  17. Joint and Conditional Entropy • A pair (X,Y) of random variables. • Joint entropy of X and Y: H(X,Y) = −Σx,y p(x,y) log2 p(x,y) • Entropy of X conditional on knowing Y: H(X|Y) = H(X,Y) − H(Y) • Mutual Information • How much do X and Y have in common? • Mutual information of X and Y: H(X:Y) = H(X) + H(Y) − H(X,Y)

  18. [Venn diagram: two overlapping circles H(X) and H(Y); the left region is H(X|Y), the overlap is H(X:Y), the right region is H(Y|X)] • H(Y|X) ≥ 0, with equality when Y = f(X) • Subadditivity: H(X,Y) ≤ H(X) + H(Y), with equality when X, Y are independent variables.
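A toy Python check of these quantities; the joint distribution of two perfectly correlated fair bits is an illustrative assumption:

```python
from math import log2
from collections import Counter

def H(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

# Joint distribution of (X, Y) where Y = X for a fair bit.
joint = {(0, 0): 0.5, (1, 1): 0.5}

pX, pY = Counter(), Counter()
for (x, y), p in joint.items():
    pX[x] += p
    pY[y] += p

H_XY = H(joint.values())
H_X, H_Y = H(pX.values()), H(pY.values())
print("H(X|Y) =", H_XY - H_Y)        # 0: X is determined by Y
print("H(X:Y) =", H_X + H_Y - H_XY)  # 1 bit of mutual information
```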

  19. Entropy in the quantum world • Von Neumann’s entropy • Probability distributions are replaced by the density matrix ρ. Von Neumann’s definition: S(ρ) = −tr(ρ log2 ρ) • If λi are the eigenvalues of ρ, use the equivalent definition: S(ρ) = −Σi λi log2(λi)
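A minimal numpy sketch of the eigenvalue form of the definition:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -sum_i lam_i log2(lam_i) over the eigenvalues of rho."""
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]                 # 0 log 0 = 0 convention
    return float(-np.sum(lam * np.log2(lam))) + 0.0   # +0.0 avoids -0.0

print(von_neumann_entropy(np.diag([1.0, 0.0])))  # 0.0: pure state
print(von_neumann_entropy(np.eye(2) / 2))        # 1.0: completely mixed
```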

  20. Basic properties of Von Neumann’s entropy • S(ρ) ≥ 0, with equality if and only if in a “pure state”. • In a d-dimensional Hilbert space: S(ρ) ≤ log2(d), with equality if and only if in a completely mixed state, i.e. ρ = I/d. • If the system AB is in a “pure state”, then: S(A) = S(B).

  21. Triangle inequality and subadditivity: |S(A) − S(B)| ≤ S(A,B) ≤ S(A) + S(B), with S(A) = S(ρA) and S(B) = S(ρB) the entropies of the reduced density matrices. Both these inequalities hold for Shannon’s entropy H as well.

  22. Strong subadditivity: S(A) + S(B) ≤ S(A,C) + S(B,C) and S(A,B,C) + S(B) ≤ S(A,B) + S(B,C). The first inequality also holds for Shannon’s entropy H, since H(X) ≤ H(X,Z) and H(Y) ≤ H(Y,Z). BUT, for Von Neumann’s entropy it is possible that S(A) > S(A,C) or S(B) > S(B,C). However, somehow nature “conspires” so that these two inequalities are NOT true simultaneously!
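The Bell state gives a concrete instance of this quantum peculiarity; a sketch reusing the partial trace of slide 13 (the code itself is illustrative, not from the slides):

```python
import numpy as np

def S(rho):
    """Von Neumann entropy in bits."""
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]
    return float(-np.sum(lam * np.log2(lam))) + 0.0

bell = np.zeros(4, dtype=complex)
bell[0] = bell[3] = 1 / np.sqrt(2)       # (|00> + |11>)/sqrt(2)
rho_AB = np.outer(bell, bell.conj())
rho_A = np.trace(rho_AB.reshape(2, 2, 2, 2), axis1=1, axis2=3)  # tr_B

# S(A) > S(A,B): impossible classically, where H(X) <= H(X,Y).
print(S(rho_AB), S(rho_A))   # 0.0 (pure joint state)  1.0 (mixed marginal)
```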

  23. Applications • Entropy as a measure of entanglement • Entropy is a measure of the uncertainty about a quantum system before we make a measurement of its state. • For a d-dimensional Hilbert space: Pure state: S = 0. Completely mixed state: S = log2(d).

  24. Example: Consider two 4-qubit systems with initial states |ψ1> and |ψ2>. Which one is more entangled?

  25. Partial measurement randomizes the initially pure states. • The entropy of the resulting mixed states measures the amount of this randomization! • The larger the entropy, the more randomized the state is after the measurement, and the more entangled the initial state was! • We have to evaluate the density matrices of the randomized states.

  26. System 1: initially in a pure state, so S = 0. Trace over (any) 1 qubit: the two nonzero eigenvalues of the reduced density matrix are 1/2, so S = 1. Trace over (any) 2 qubits: λ1,2 = 0, λ3,4 = 1/2, so again S = 1.

  27. Trace over (any) 3 qubits: λ1,2 = 1/2, so S = 1. Summary: 1. initially: S = 0 2. measure (any) 1 qubit: S = 1 3. measure (any) 2 qubits: S = 1 4. measure (any) 3 qubits: S = 1

  28. System 2: Trace over (any) 1 qubit: the reduced density matrix is diagonal.

  29. Trace over (any) 2 qubits: λ1 = 0, λ2,3 = 1/6, λ4 = 2/3. Trace over (any) 3 qubits: λ1,2 = 1/2.

  30. Summary: 1. initially: S = 0 2. measure (any) 1 qubit: S = 1 3. measure (any) 2 qubits: S = −2·(1/6)·log2(1/6) − (2/3)·log2(2/3) ≈ 1.25 4. measure (any) 3 qubits: S = 1. At the two-qubit cut the entropy of |ψ2> (≈ 1.25) exceeds that of |ψ1> (exactly 1). Therefore, ψ2 is more entangled than ψ1.
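The slide images with the explicit state vectors are lost from this transcript. Assuming, purely for illustration, that |ψ1> is the GHZ-like state (|0000>+|1111>)/√2 and |ψ2> is the equal superposition of the six two-excitation basis states (a Dicke state) - both choices consistent with the eigenvalues quoted on slides 26-29 - the whole calculation can be sketched in a few lines:

```python
import numpy as np
from itertools import combinations

def S(rho):
    """Von Neumann entropy in bits."""
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]
    return float(-np.sum(lam * np.log2(lam))) + 0.0

def entropy_of_subsystem(psi, keep, n=4):
    """S of the reduced state on the qubits listed in `keep`."""
    t = psi.reshape((2,) * n)
    order = list(keep) + [q for q in range(n) if q not in keep]
    m = t.transpose(order).reshape(2 ** len(keep), -1)
    return S(m @ m.conj().T)              # rho_keep = tr_rest |psi><psi|

# Candidate states (assumptions, see above): GHZ-like for psi1 and the
# two-excitation Dicke state for psi2.
psi1 = np.zeros(16)
psi1[0] = psi1[15] = 1 / np.sqrt(2)
psi2 = np.zeros(16)
for i, j in combinations(range(4), 2):    # all two-excitation kets
    psi2[(1 << i) | (1 << j)] = 1 / np.sqrt(6)

for name, psi in (("psi1", psi1), ("psi2", psi2)):
    ents = [entropy_of_subsystem(psi, list(range(4 - k))) for k in (1, 2, 3)]
    print(name, [round(e, 3) for e in ents])   # after tracing 1, 2, 3 qubits
```

With these candidates the printout reproduces the entropies above: 1 bit at every cut for ψ1, and ≈ 1.25 bits at the two-qubit cut for ψ2.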

  31. “Ludwig Boltzmann, who spent much of his life studying statistical mechanics, died in 1906, by his own hand. Paul Ehrenfest, carrying on the work, died similarly in 1933. Now it is our turn to study statistical mechanics.” - “States of Matter”, D. Goodstein

  32. References • “Quantum Computation and Quantum Information”, Nielsen & Chuang, Cambridge Univ. Press, 2000 • “Quantum Mechanics”, Eugen Merzbacher, Wiley, 1998 • Lecture notes by C. Monroe (PHYS 644, Univ. of Michigan) coursetools.ummu.umich.edu/2001/fall/physics/644/001.nsf • Lecture notes by J. Preskill (PHYS 219, Caltech) www.theory.caltech.edu/people/preskill/ph229
