
Entropy and Majorisation

Entropy and Majorisation in probabilistic theories - an introduction. oscar.dahlsten@physics.ox.ac.uk


Presentation Transcript


  1. Entropy and Majorisation in probabilistic theories - an introduction. oscar.dahlsten@physics.ox.ac.uk

  2. Entropy and Majorisation

  3. Entropy and Majorisation - why care? One of the greatest technology challenges is that current nano-electronics gets as hot as a light-bulb filament; this has to do with work and heat.

  4. Why care about Entropy and Majorisation

  5. General overview

  6. Part 1: Information Entropy. "If the number of messages in the set is finite then this number or any monotonic function of this number can be regarded as a measure of the information produced when one message is chosen from the set, all choices being equally likely. As was pointed out by Hartley the most natural choice is the logarithmic function." (Shannon 1948) "The great advance provided by information theory lies in the discovery that there is a unique, unambiguous criterion for the 'amount of uncertainty' represented by a discrete probability distribution [...]" (Jaynes 1956)
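
Shannon's logarithmic measure from the quote above can be sketched directly; a minimal Python illustration (the function name is mine):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(p) = -sum_i p_i log2 p_i, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform choice among 8 equally likely messages carries log2(8) = 3 bits,
# matching Hartley's logarithmic measure for equiprobable choices.
print(shannon_entropy([1/8] * 8))  # 3.0
```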

  7. Probabilities

  8. Conceptual: probabilities are subjective. Entropy S is associated with both an object and an observer.

  9. Examples of Entropy measures
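
The transcript does not list the slide's entropy measures; assuming the usual candidates (max-entropy, Shannon, collision, min-entropy), they all sit inside the Rényi family, sketched here:

```python
import math

def renyi_entropy(probs, alpha):
    """Rényi entropy H_alpha(p) in bits; alpha = 1 recovers Shannon."""
    probs = [p for p in probs if p > 0]
    if alpha == 1:                     # Shannon limit of the family
        return -sum(p * math.log2(p) for p in probs)
    if math.isinf(alpha):              # min-entropy H_inf
        return -math.log2(max(probs))
    return math.log2(sum(p ** alpha for p in probs)) / (1 - alpha)

p = [0.5, 0.25, 0.25]
print(renyi_entropy(p, 0))            # max-entropy: log2(support size)
print(renyi_entropy(p, 1))            # Shannon entropy: 1.5
print(renyi_entropy(p, 2))            # collision entropy
print(renyi_entropy(p, math.inf))     # min-entropy: 1.0
```

Min-entropy governs how many uniform bits can be extracted, and max-entropy how large a memory is needed, which connects to the "how large a memory?" and "how many uniformly random bits?" questions on the following slides.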

  10. Which entropy is best?

  11. Question, how large a memory?

  12. Question, how many uniformly random bits?

  13. Shannon entropy and the large-n i.i.d. limit

  14. Beyond the Shannon entropy and the large-n i.i.d. limit (general case vs. i.i.d. large n)
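
The large-n i.i.d. limit on these slides can be illustrated by simulation: the per-symbol surprisal of a typical i.i.d. sequence concentrates on the Shannon entropy H, so roughly nH bits of memory suffice for n symbols. A sketch (parameters are my own illustrative choices):

```python
import math
import random

random.seed(0)
p = 0.1                                   # biased coin; H(p) ≈ 0.469 bits
H = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

n = 100000
flips = [random.random() < p for _ in range(n)]
k = sum(flips)                            # number of heads observed

# Per-symbol surprisal -log2 P(sequence) / n of the observed sequence
surprisal = -(k * math.log2(p) + (n - k) * math.log2(1 - p)) / n
print(H, surprisal)                       # both close to 0.469
```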

  15. Question, how much work? A quick rough argument
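
The "quick rough argument" itself is not reproduced in the transcript; if it is the standard Landauer-type bound (plausible given the heat/work motivation of slide 3, but an assumption on my part), erasing one bit at temperature T costs at least k_B T ln 2 of work:

```python
import math

k_B = 1.380649e-23        # Boltzmann constant, J/K
T = 300.0                 # room temperature, K

# Minimum work to erase one bit (Landauer bound): k_B * T * ln 2
landauer = k_B * T * math.log(2)
print(landauer)           # ≈ 2.9e-21 J per bit erased
```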

  16. Other entropic quantities

  17. Useful way of showing non-negativity [Figure: convexity - the chord of a convex function f lies above the curve, so p f(a) + (1-p) f(b) ≥ f(pa + (1-p)b)]

  18. Coarse-graining reduces classical entropy [Figure: outcomes with probabilities pi and pi' merged into a single outcome with probability pi + pi']
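
Merging two outcomes as on this slide can only lower the Shannon entropy; a quick numerical check (the example distribution is mine):

```python
import math

def H(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

p = [0.5, 0.3, 0.2]
coarse = [p[0], p[1] + p[2]]   # coarse-grain: merge outcomes 2 and 3
print(H(p), H(coarse))         # entropy drops under the merge
```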

  19. Coarse-graining reduces classical entropy Next slide: a more general statement

  20. Data processing inequality (DPI). Research question: what is the relation to Baez et al.'s axiomatisation of Shannon entropy?
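
One standard form of the DPI: for a Markov chain X → Y → Z (Z obtained by pushing Y through a noisy channel), I(X;Z) ≤ I(X;Y). A numerical sketch, with an illustrative joint distribution and channel of my own:

```python
import numpy as np

def mutual_info(pxy):
    """Mutual information I(X;Y) in bits from a joint distribution matrix."""
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0
    return float((pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])).sum())

# Joint distribution of (X, Y): correlated bits
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])

# Column-stochastic channel N[z, y] = P(Z = z | Y = y)
N = np.array([[0.9, 0.2],
              [0.1, 0.8]])

pxz = pxy @ N.T                # joint distribution of (X, Z)
print(mutual_info(pxy), mutual_info(pxz))  # processing cannot increase I
```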

  21. Part 2: Quantum Entropy

  22. Quantum entropy

  23. Quantum entropy as minimal Shannon entropy

  24. Quantum entropy as minimal Shannon entropy PhET
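
The statement on these slides, that the von Neumann entropy is the minimal Shannon entropy of measurement outcomes (attained by measuring in the eigenbasis of the state), can be checked numerically; a sketch with numpy, using an example qubit density matrix of my own:

```python
import numpy as np

def shannon(p):
    """Shannon entropy in bits of a probability vector."""
    p = np.asarray(p, dtype=float)
    p = p[p > 1e-12]
    return float(-(p * np.log2(p)).sum())

# A qubit density matrix with off-diagonal coherences
rho = np.array([[0.7, 0.2],
                [0.2, 0.3]])

evals = np.linalg.eigvalsh(rho)
S = shannon(evals)                 # von Neumann entropy (eigenbasis outcomes)

# Measuring in the computational basis instead: outcome probs = diagonal
print(S, shannon(np.diag(rho)))    # the eigenbasis measurement gives less H
```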

  25. Why coarse graining can increase Quantum entropy

  26. Aside: quantifying quantum correlations, discord. See Modi et al. for a review on discord: http://arxiv.org/pdf/1112.6238v3.pdf

  27. What about that data processing inequality? Data processing can only degrade data

  28. DPI holds also in quantum theory

  29. DPI bounds communication rate

  30. Part III: Post-quantum entropy. If information theory is a meta-theory, it should hold in any probabilistic theory(?)

  31. Entropy of a data table

  32. Data table as state of a system

  33. Data table as state of a system: probabilistic theories

  34. Interesting example: the "square bit", also known as a "gbit" [Figure: square state space with axes x and y]

  35. Violating DPI? Research question: why does S(A|B) := S(AB) - S(B) satisfy the DPI in the quantum case but not in general?

  36. Summary

  37. Preview: majorisation

  38. Entropy and Majorisation in probabilistic theories-an introductionPART II: MAJORISATIONoscar.dahlsten@physics.ox.ac.uk

  39. Majorisation: general intro

  40. Majorisation: presentation overview First, a story about wealth inequality

  41. Majorisation and income inequality (Lorenz 1905). Gini index = % of the area between the equality line and the Lorenz curve (shaded in the figure). Source: World Bank
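
The Lorenz/Gini construction on this slide can be sketched in a few lines (the trapezoid-rule estimator is my own minimal version):

```python
def lorenz_points(values):
    """Cumulative-share points of the Lorenz curve, poorest first."""
    v = sorted(values)
    total = sum(v)
    cum, pts = 0.0, [0.0]
    for x in v:
        cum += x
        pts.append(cum / total)
    return pts

def gini(values):
    """Gini index: twice the area between equality line and Lorenz curve."""
    pts = lorenz_points(values)
    n = len(values)
    # Trapezoid rule for the area under the Lorenz curve
    area = sum((pts[i] + pts[i + 1]) / (2 * n) for i in range(n))
    return 1 - 2 * area

print(gini([1, 1, 1, 1]))   # perfect equality  -> 0.0
print(gini([0, 0, 0, 4]))   # one person has everything -> 0.75
```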

  42. Lorenz curves of probabilities [Figure: Lorenz curve L(x) for x from 0 to 1]

  43. Majorisation in terms of Lorenz curves [Figures: one pair of Lorenz curves L(x) where the black curve lies above the red (black majorises red), and one pair that cross (no majorisation)]
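
The Lorenz-curve criterion is equivalent to dominance of sorted partial sums; a sketch:

```python
def majorises(p, q, tol=1e-12):
    """True if p majorises q: every partial sum of p, sorted in
    decreasing order, dominates the corresponding partial sum of q."""
    p = sorted(p, reverse=True)
    q = sorted(q, reverse=True)
    sp = sq = 0.0
    for a, b in zip(p, q):
        sp += a
        sq += b
        if sp < sq - tol:
            return False
    return True

uniform = [1/3, 1/3, 1/3]
peaked = [0.7, 0.2, 0.1]
print(majorises(peaked, uniform))   # True: every distribution majorises uniform
print(majorises(uniform, peaked))   # False
```

When neither dominance holds the two distributions are incomparable, as the crossing Lorenz curves on the slide show.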

  44. Doubly stochastic matrices

  45. Doubly stochastic matrices: majorisation equiv.

  46. Doubly stochastic matrices
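
The equivalence on these slides (Hardy–Littlewood–Pólya): q is majorised by p exactly when q = Dp for some doubly stochastic D. A numerical check of the easy direction, with an example matrix of my own:

```python
import numpy as np

# A doubly stochastic matrix: every row and every column sums to 1
D = np.array([[0.5, 0.3, 0.2],
              [0.3, 0.4, 0.3],
              [0.2, 0.3, 0.5]])

def partial_sums(p):
    """Partial sums of the components sorted in decreasing order."""
    return np.cumsum(np.sort(p)[::-1])

p = np.array([0.7, 0.2, 0.1])
q = D @ p                       # mixing with D spreads the distribution out
print(q)
print(bool(np.all(partial_sums(p) >= partial_sums(q) - 1e-12)))  # p majorises Dp
```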

  47. Visualising majorisation geometrically [Figure: in the (P(0), P(1)) plane, the region accessible with DS matrices]

  48. Why there may be no majorisation in 3D

  49. Majorisation vs entropy

  50. Not the other way around! (an entropy ordering does not imply majorisation)
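
Majorisation implies an entropy ordering (Schur concavity), but the converse fails; a counterexample pair of my own choosing, with ordered entropies yet no majorisation either way:

```python
import math

def H(probs):
    """Shannon entropy in bits."""
    return -sum(x * math.log2(x) for x in probs if x > 0)

def majorises(p, q, tol=1e-12):
    """Dominance of sorted partial sums."""
    p, q = sorted(p, reverse=True), sorted(q, reverse=True)
    sp = sq = 0.0
    for a, b in zip(p, q):
        sp += a
        sq += b
        if sp < sq - tol:
            return False
    return True

p = [0.5, 0.25, 0.25]
q = [0.45, 0.45, 0.1]
print(H(p), H(q))                        # H(q) < H(p) ...
print(majorises(p, q), majorises(q, p))  # ... yet neither majorises the other
```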
