
Locking of correlations Debbie Leung U. Waterloo



Presentation Transcript


  1. From: Charles Bennett. Date: Sept 06, 2001. Subject: Pictures from Huangshan, China. "Dear Friends, Here is a picture of the famous mountain view called Cloud-Dispelling Pavilion, and the curious lovers' locks, which I used in my talk this morning to illustrate the locking and unlocking of quantum information. Lovers' locks are a custom in China and Japan wherein lovers leave a padlock locked onto the infrastructure at some scenic location, signifying eternal love ... Thinking about this custom, I wondered if anyone had thought of designing a special lovers' padlock, without a keyhole or unlocking mechanism. Such a lock would be cheaper to manufacture, and would express greater commitment (no danger of some faithless guy returning at midnight to retrieve the lock and maybe use it with someone else) ..." Locking of correlations. Debbie Leung, U. Waterloo.

  2. Alice ↔ Bob: can we increase a correlation a lot using little communication? State after the k-th message: ρ_AB^(k); f: a correlation function. Question: is (*) possible?
  (*) f(ρ_AB^(l)) − f(ρ_AB^(k)) ≫ total size of the messages taking ρ_AB^(k) to ρ_AB^(l).
  WLOG ρ_AB^(0) is uncorrelated; a reasonable normalization is f(ρ_AB^(k)) ≤ total communication up to the k-th message. (*) is impossible classically, but possible for some f quantumly. Interpreting (*): the correlation f already exists yet is inaccessible in ρ_AB^(k), and the little extra communication unlocks it!

  3. Locking of correlations -- Survey
  - Entropic uncertainty relations [Maassen, Uffink '88]
  - Locking classical mutual information [DiVincenzo, M. Horodecki, Leung, Smolin, Terhal quant-ph/0303088; Hayden, Leung, Shor, Winter quant-ph/0307104; see also Buhrman & Christandl]
  - Extension 1: locking entanglement; application: near-maximal entanglement deficit of state preparation [K., M., and P. Horodecki, Oppenheim quant-ph/0404096]
  - Extension 2: multi-round locking; application: separation of capacities [Bennett, Devetak, Shor, Smolin quant-ph/0406086; DHLST]
  - Application: key uncertainty [Damgaard, Pedersen, Salvail quant-ph/0407066]

  4. Definition: classical mutual information Ic. Alice and Bob share ρ_AB and apply local measurements M_A and M_B with outcomes X_A, X_B (random variables; outcomes x_a, x_b). Def: Ic(ρ_AB) := max_{M_A ⊗ M_B} I(X_A : X_B).
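The maximization is over local measurements, but the resulting quantity I(X_A : X_B) is ordinary classical mutual information of the outcome distribution. A short numerical sketch (not from the talk; it assumes only numpy) computes it from a joint outcome table:

```python
import numpy as np

def mutual_information(p_xy):
    """I(X_A : X_B) in bits for a joint probability table p_xy[x, y]."""
    px = p_xy.sum(axis=1, keepdims=True)   # marginal of X_A (column vector)
    py = p_xy.sum(axis=0, keepdims=True)   # marginal of X_B (row vector)
    mask = p_xy > 0                        # skip zero entries (0 log 0 = 0)
    return float((p_xy[mask] * np.log2(p_xy[mask] / (px @ py)[mask])).sum())

# Perfectly correlated uniform bits: I = 1 bit.
perfect = np.array([[0.5, 0.0], [0.0, 0.5]])
# Independent uniform bits: I = 0.
indep = np.full((2, 2), 0.25)
print(mutual_information(perfect))  # 1.0
print(mutual_information(indep))    # 0.0
```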

  5. Special case: accessible information Iacc. X, Y: random variables; x, y: outcomes. As before, Ic(ρ_AB) := max_{M_A ⊗ M_B} I(X : Y). Special case: ρ_AB = Σ_x p_x |x⟩⟨x| ⊗ ρ_x for {|x⟩} a basis. Alice's part is classical, and her optimal measurement M_A is along {|x⟩}. Bob is given ρ_x with probability p_x (call E := {p_x, ρ_x} an "ensemble"). M_B should maximize Bob's information about x; this maximal information is the accessible information of the ensemble, Iacc(E).

  6. Locking Ic, classically. E.g. x = exam questions; the exam will be on Wednesday. X ∈_R {1, …, d}. (1) Wed: Alice sends x to Bob, message size m = log d.

  7. Locking Ic, quantumly. X ∈_R {1, …, d}, m = log d; T ∈_R {1, …, r}, k = log r. Alice holds x and t together. (1) Sat: Alice sends Bob U_t|x⟩. (2) Wed: Alice sends Bob t. Each ρ_xt = U_t|x⟩⟨x|U_t† occurs with probability 1/(rd).
  Alice's wish: k ≪ m, and Bob learns little about x on Sat. This is impossible classically:
  I(A1A2 : B) − I(A1 : B) = H(A1A2) + H(B) − H(A1A2B) − [H(A1) + H(B) − H(A1B)]
  = H(A1A2) − H(A1) + H(A1B) − H(A1A2B)
  ≤ H(A2) + 0.

  8. Locking Ic. (1) Sat: U_t|x⟩; (2) Wed: t. X ∈_R {1, …, d}, m = log d; T ∈_R {1, …, r}, k = log r.

  | Scheme | k | Iacc(w/o t) | Iacc(given t) | Difference |
  |---|---|---|---|---|
  | r = 2, U_1 = I, U_2 = FT | 1 | = ½ log d | log d + 1 | = ½ log d |
  | r = (log d)³, U_t ∈_R Haar | 3 log log d | ≤ ε log d + 3 | log d + 3 log log d | ≥ (1−ε) log d |
  | r = (log d)⁵, U_t ∈_R Haar | 5 log log d | ≤ const | log d + 5 log log d | ≥ log d − const |

  The set {U_t} is picked i.i.d. and pre-agreed upon; t is chosen randomly by Alice during the run. "d large" means log d ≥ O[(1/ε) log(1/ε)].
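The first row of the table can be spot-checked numerically. The sketch below (not from the talk; d = 16 and all names are illustrative) builds the ensemble {1/(2d), U_t|x⟩} for U_1 = I, U_2 = FT and computes how much information a computational-basis measurement extracts when t is unknown; the table's ½ log d is the optimal Iacc, while the code only verifies that this particular measurement achieves it:

```python
import numpy as np

d = 16                                   # dimension; m = log2(d) = 4 bits
F = np.exp(2j * np.pi * np.outer(np.arange(d), np.arange(d)) / d) / np.sqrt(d)
bases = [np.eye(d), F]                   # U_1 = I, U_2 = Fourier transform (r = 2)

def H(p):
    """Shannon entropy in bits of a probability vector."""
    p = p[p > 1e-12]
    return float(-(p * np.log2(p)).sum())

# Bob measures U_t|x> in the computational basis; outcome J = j.
# prob(j | x, t) = |<j| U_t |x>|^2, with (x, t) uniform on {0..d-1} x {1, 2}.
p_j_given_xt = [np.abs(U) ** 2 for U in bases]             # [t][j, x]
p_j = sum(P.sum(axis=1) for P in p_j_given_xt) / (2 * d)   # marginal of J
H_cond = sum(H(P[:, x]) for P in p_j_given_xt for x in range(d)) / (2 * d)
I_without_t = H(p_j) - H_cond            # I(J : XT) = H(J) - H(J | XT)
print(I_without_t, 0.5 * np.log2(d))     # both ~ 2.0 = (1/2) log2(d)
```

When t is revealed on Wednesday, Bob can measure in the correct basis and recover x exactly, so the correlation jumps to log d + 1 at the cost of the single bit k = 1.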

  9. Why is Iacc(w/o t) bounded? On Sat, Bob holds U_t|x⟩ with X ∈_R {1, …, d} (m = log d), T ∈_R {1, …, r} (k = log r); Ic(ρ_AB) = Iacc({1/(rd), U_t|x⟩}).
  WLOG M_B = {λ_j |j⟩⟨j|} with Σ_j λ_j = d; let J := the random variable for the outcome. Then prob(j | x, t) = λ_j |⟨j|U_t|x⟩|² and prob(j) = λ_j/d, so
  Iacc = I(J : XT) = H(J) − H(J | XT)
  = −Σ_j (λ_j/d) log(λ_j/d) + (1/(rd)) Σ_{x,t,j} λ_j |⟨j|U_t|x⟩|² log( λ_j |⟨j|U_t|x⟩|² )
  = log d + Σ_j (λ_j/d) · (1/r) Σ_{t,x} |⟨j|U_t|x⟩|² log |⟨j|U_t|x⟩|².

  10. Why is Ic(ρ_AB on Sat) bounded? Ic(ρ_AB) = Iacc({1/(rd), U_t|x⟩}); WLOG M_B = {λ_j |j⟩⟨j|}.
  Ic(ρ_AB) = max_{M_B} [ log d + Σ_j (λ_j/d) · (1/r) Σ_{x,t} |⟨j|U_t|x⟩|² log |⟨j|U_t|x⟩|² ]
  ≤ max_{|φ⟩} [ log d − (1/r) Σ_t ( −Σ_x |⟨φ|U_t|x⟩|² log |⟨φ|U_t|x⟩|² ) ]   [the λ_j/d form a distribution; convexity]
  = log d − min_{|φ⟩} (1/r) Σ_t H_t,
  where H_t is the entropy of the outcome x when measuring |φ⟩ along the basis {U_t|x⟩}; (1/r) Σ_t H_t is the average uncertainty of the outcome when |φ⟩ is measured along a basis chosen randomly from the set.
  Goal: prove min_{|φ⟩} (1/r) Σ_t H_t ≥ Δ; then Ic(ρ_AB) ≤ log d − Δ. Any such lower bound (holding for all |φ⟩) is called an "entropic uncertainty relation" (EUR) for the set of bases.
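For the two-basis set {I, FT}, the Maassen-Uffink relation gives H_1 + H_2 ≥ log d for every state, i.e. an EUR with Δ = ½ log d. The sketch below (not from the talk; numpy assumed, and random sampling only spot-checks the minimum over |φ⟩ rather than proving it) evaluates the average measurement entropy on random states:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 32
F = np.exp(2j * np.pi * np.outer(np.arange(d), np.arange(d)) / d) / np.sqrt(d)

def meas_entropy(U, phi):
    """H_t: entropy (bits) of the outcome x when measuring |phi> along {U_t|x>}."""
    p = np.abs(U.conj().T @ phi) ** 2
    p = p[p > 1e-12]
    return float(-(p * np.log2(p)).sum())

# Spot-check the EUR (1/2)(H_1 + H_2) >= (1/2) log2(d) on random states.
worst = np.inf
for _ in range(500):
    v = rng.standard_normal(d) + 1j * rng.standard_normal(d)
    phi = v / np.linalg.norm(v)
    worst = min(worst, 0.5 * (meas_entropy(np.eye(d), phi) + meas_entropy(F, phi)))
print(worst >= 0.5 * np.log2(d))  # True for every sampled state
```

The bound is tight: a computational-basis state |x⟩ has H_1 = 0 and H_2 = log d, so the true minimum of the average is exactly ½ log d, matching scheme 1.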

  11. Ic(ABSat) · ½ logd ·  logd + 3 const Ic ¸ ½ log d ¸ (1-) logd ¸ logd - const k Scheme r=2, U1=I, U2=FT r=(logd)3, Ut2RHaar r=(logd)5, Ut2RHaar Ic(ABWed) logd + 1 logd + 3 loglogd logd + 5 loglogd Ht = entropy of outcome “x” when measuring |i along {Ut|xi} Proving min|i1/r tHt¸ (so that Ic(AB) · log d - )

  12. Proving min_{|φ⟩} (1/r) Σ_t H_t ≥ Δ (so that Ic(ρ_AB) ≤ log d − Δ), where H_t = entropy of the outcome x when measuring |φ⟩ along {U_t|x⟩}:
  - Scheme 1 (r = 2, U_1 = I, U_2 = FT): min_{|φ⟩} (1/r) Σ_t H_t ≥ ½ log d.
  - Scheme 2 (r = (log d)³, U_t ∈_R Haar): min_{|φ⟩} (1/r) Σ_t H_t ≥ (1−ε) log d − 3.
  - Scheme 3 (r = (log d)⁵, U_t ∈_R Haar): min_{|φ⟩} (1/r) Σ_t H_t ≥ log d − const, for log d ≥ (const/ε) log(20/ε).
  These yield the Ic(AB on Sat) bounds in the table on slide 11.

  13. (details) H_t,|φ⟩ = entropy of the outcome x when measuring |φ⟩ along {U_t|x⟩}. Proving min_{|φ⟩} (1/r) Σ_t H_t,|φ⟩ ≥ Δ (so that Ic(ρ_AB) ≤ log d − Δ).
  Proof ideas for: min_{|φ⟩} (1/r) Σ_t H_t,|φ⟩ ≥ (1−ε)(log d − 3).
  • Pick 1 random state |φ⟩:
    - Levy's lemma: Pr[ H_|φ⟩ < log d − 2 ] ≤ exp(−∼d/(log d)²)
    - Chernoff bound: Pr[ (1/r) Σ_t H_t,|φ⟩ ≤ (1−ε/2)(log d − 2) ] ≤ exp(−∼rd/(log d)²)
  • Union bound over N states |φ_j⟩:
    - Pr[ min_j (1/r) Σ_t H_t,|φ_j⟩ ≤ (1−ε/2)(log d − 2) ] ≤ N exp(−∼rd/(log d)²)
    - Take N = (10/ε)^{2d}, with {|φ_j⟩} such that any |φ⟩ is ε/2-close to some |φ_j⟩
  • For all |φ⟩:
    - Continuity: ‖ |φ⟩⟨φ| − |φ_j⟩⟨φ_j| ‖_tr ≤ ε/2 ⇒ |H_t,|φ⟩ − H_t,|φ_j⟩| ≤ (ε/2) log d + 1
    - Pr[ min_{|φ⟩} (1/r) Σ_t H_t,|φ⟩ ≤ (1−ε)(log d − 3) ] ≤ (10/ε)^{2d} exp(−∼rd/(log d)²)

  14. Ht|i = entropy of outcome “x” when measuring |i along {Ut|xi} Proving min|i1/r tHt¸ (so that Ic(AB) · logd - ) Pf ideas for: min|i1/r t Ht|i¸ (1-) logd+3 - Pr[min|i 1/r t Ht|i· (1-)(log d -3)] · (10/)2d exp(-» r d/(log d)2) if Pr <  1, green statement is sometimes false, and the black statement is sometimes true. i.e. 9 {Ut} s.t. the claimed EUR holds r » 1/ (log d)2 log(1/) sufficient which r = (log d)3 OK for  const r = (log d)5 OK for  = 1/log d

  15. Punchline. (1) Sat: Alice sends Bob U_t|x⟩, with X ∈_R {1, …, d}, m = log d and T ∈_R {1, …, r}, k = log r.

  | Scheme | Ic(AB on Sat) |
  |---|---|
  | r = 2, U_1 = I, U_2 = FT | ≤ ½ log d |
  | r = (log d)³, U_t ∈_R Haar | ≤ ε log d + 3 |
  | r = (log d)⁵, U_t ∈_R Haar | ≤ const |
