
Approximate Privacy: Foundations and Quantification






Presentation Transcript


  1. Approximate Privacy: Foundations and Quantification Michael Schapira (Yale and UC Berkeley) Joint work with Joan Feigenbaum (Yale) and Aaron D. Jaggard (DIMACS)

  2. Starting Point: Agents’ Privacy in MD • Traditional goal of mechanism design: Incent agents to reveal private information that is needed to compute “good” outcomes. • Complementary, newly important goal: Enable agents not to reveal private information that is not needed to compute “good” outcomes. • Example (Naor-Pinkas-Sumner, EC ’99): It’s undesirable for the auctioneer to learn the winning bid in a 2nd–price Vickrey auction.

  3. Privacy is Important! • Sensitive Information: Information that can harm data subjects, data owners, or data users, if it is mishandled • There’s a lot more of it than there used to be! • Increased use of computers and networks • Increased processing power and algorithmic knowledge • Decreased storage costs • “Mishandling” can be very harmful. • ID theft • Loss of employment or insurance • “You already have zero privacy. Get over it.” (Scott McNealy, 1999)

  4. Private, Multiparty Function Evaluation
  y = f(x1, …, xn), where each input xi is held privately by agent i.
  • Each i learns y.
  • No i can learn anything about xj (except what he can infer from xi and y).
  • Very general positive results.

  5. Drawbacks of PMFE Protocols • Information-theoretically private MFE: Requires that a substantial fraction of the agents be obedient rather than strategic. • Cryptographically private MFE: Requires (plausible but) currently unprovable complexity-theoretic assumptions and (usually) heavy communication overhead. • Not used in many real-life environments • Brandt and Sandholm (TISSEC ’08): Which auctions of interest are unconditionally privately computable?

  6. Minimum Knowledge Requirements for 2nd-Price Auction
  A(f) for bids in {0, …, 3}; entries are (winner, price), with ties assumed broken in favor of bidder 1:

             bidder 2:  0      1      2      3
  bidder 1 = 0:       (1,0)  (2,0)  (2,0)  (2,0)
  bidder 1 = 1:       (1,0)  (1,1)  (2,1)  (2,1)
  bidder 1 = 2:       (1,0)  (1,1)  (1,2)  (2,2)
  bidder 1 = 3:       (1,0)  (1,1)  (1,2)  (1,3)

  Perfect privacy ≈ the auctioneer learns only which region corresponds to the bids (e.g., the region containing the input (2, 0)).
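
The matrix is mechanical to regenerate. A minimal Python sketch (not from the deck; the tie-break in favor of bidder 1 is an assumption chosen to match the entries above):

```python
# Outcome matrix of a 2-bidder 2nd-price auction with bids in {0,...,3}.
# Assumption: ties are broken in favor of bidder 1, matching the slide.

def second_price(b1: int, b2: int) -> tuple[int, int]:
    """Return (winner, price): the winner pays the losing bid."""
    return (1, b2) if b1 >= b2 else (2, b1)

for b1 in range(4):
    print([second_price(b1, b2) for b2 in range(4)])
```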

  7. Ascending-Price English Auction
  [Pictured: the 4 × 4 grid of bids 0-3 for bidders 1 and 2.] Same execution for the inputs (1,1), (2,1), and (3,1).
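
One concrete realization of this protocol, sketched under my own assumptions about the round structure (the price rises one unit at a time, bidder 2 answers first, so ties go to bidder 1):

```python
def english_auction(b1: int, b2: int):
    """Ascending-price English auction; returns (transcript, winner, price)."""
    transcript, p = [], 1
    while True:
        stay2 = b2 >= p                      # bidder 2 answers first (assumption)
        transcript.append(stay2)
        if not stay2:
            return transcript, 1, p - 1      # bidder 1 wins at the last price
        stay1 = b1 >= p
        transcript.append(stay1)
        if not stay1:
            return transcript, 2, p - 1
        p += 1

# Same transcript for (1,1), (2,1), (3,1): the winner's bid stays hidden.
print(english_auction(1, 1)[0] == english_auction(2, 1)[0]
      == english_auction(3, 1)[0])
```

Because the protocol halts as soon as bidder 2 drops out, all three inputs above produce the identical transcript.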

  8. Perfect Privacy for 2nd-Price Auction [Brandt and Sandholm (TISSEC ’08)] • The ascending-price, English-auction protocol is perfectly private. • It is essentially the only perfectly private protocol for 2nd-price auctions. • Note the exponential communication cost of perfect privacy!

  9. Worse Yet… (The Millionaires’ Problem)
  f(x1, x2) = 1 if x1 ≥ x2; else f(x1, x2) = 2. [Pictured: the 4 × 4 matrix for x1, x2 ∈ {0, …, 3}.]
  The Millionaires’ Problem is not perfectly privately computable. [Kushilevitz (SJDM ’92)]

  10. So, What Can We Do? • Insist on achieving perfect privacy. • sometimes there is no reasonable alternative • can be costly (communication, PKI, etc.) • Treat privacy as a design goal. • alongside complexity, optimization, etc. • We need a way to quantify privacy.

  11. Privacy Approximation Ratios (PARs) • Intuitively, a PAR captures the indistinguishability of inputs. • natural first step • general distributed function computation • Other possible definitions: • Semantic (context-specific) • Entropy-based

  12. Outline • Background • Two-party communication (Yao) • “Tiling” characterization of privately computable functions (Chor + Kushilevitz) • Privacy Approximation Ratios (PARs) • Bisection auction protocol: exponential gap between worst-case and average-case PARs • Summary of Our Results • Open Problems

  13. Two-Party Communication Model
  • Party 1 holds x1 ∈ {0,1}^k; Party 2 holds x2 ∈ {0,1}^k; f: {0,1}^k × {0,1}^k → {0,1}^m.
  • The parties exchange bits q1, q2, …, qr, where each qj ∈ {0,1} is a function of (q1, …, q(j−1)) and one player’s private input, and qr = f(x1, x2).
  • s(x1, x2) ≜ (q1, …, qr) is the transcript of the execution.
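
To make the model concrete, here is a small sketch (the toy protocol and all names are mine) that also checks Yao's basic fact that every set of inputs sharing a transcript is a combinatorial rectangle:

```python
from collections import defaultdict
from itertools import product

K = 2
INPUTS = range(2 ** K)

def s(x1: int, x2: int) -> tuple:
    """Transcript of a trivial protocol: party 1 reveals x1 bit by bit,
    then party 2, who now knows both inputs, announces f(x1, x2)."""
    bits = tuple((x1 >> i) & 1 for i in reversed(range(K)))
    return bits + (int(x1 >= x2),)

classes = defaultdict(set)
for x1, x2 in product(INPUTS, INPUTS):
    classes[s(x1, x2)].add((x1, x2))

for cells in classes.values():              # rectangle property check
    xs = {a for a, _ in cells}
    ys = {b for _, b in cells}
    assert cells == {(a, b) for a in xs for b in ys}
print(len(classes), "transcript classes, all combinatorial rectangles")
```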

  14. Example: Millionaires’ Problem
  f(x1, x2) = 1 if x1 ≥ x2; else f(x1, x2) = 2
  A(f) for x1, x2 ∈ {0, …, 3}:

            millionaire 2:  0  1  2  3
  millionaire 1 = 0:        1  2  2  2
  millionaire 1 = 1:        1  1  2  2
  millionaire 1 = 2:        1  1  1  2
  millionaire 1 = 3:        1  1  1  1
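
A two-line sketch regenerating this matrix (k = 2 assumed):

```python
f = lambda x1, x2: 1 if x1 >= x2 else 2    # millionaires' problem

for x1 in range(4):                        # rows: millionaire 1
    print([f(x1, x2) for x2 in range(4)])  # columns: millionaire 2
# [1, 2, 2, 2]
# [1, 1, 2, 2]
# [1, 1, 1, 2]
# [1, 1, 1, 1]
```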

  15. Monochromatic Tilings • A region of A(f) is any subset of entries (not necessarily a submatrix). A partition of A(f) is a set of disjoint regions whose union is A(f). • A rectangle in A(f) is a submatrix. A tiling is a partition into rectangles. • A region is monochromatic if all of its entries contain the same value of f; monochromatic partitions and tilings are defined analogously.

  16. Bisection Protocol
  In each round, a player “bisects” the current interval. [Pictured: the 4 × 4 grid for millionaires 1 and 2 with values 0-3; example input: f(2, 3).] A communication protocol “zeroes in” on a monochromatic rectangle.
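
A sketch of the bisection protocol for the millionaires' problem; the exact round structure (both parties answer against the midpoint in each round) is my assumption:

```python
def bisection(x1: int, x2: int, k: int):
    """Returns (transcript, f(x1, x2)) for the millionaires' problem."""
    lo, hi, transcript = 0, 2 ** k - 1, []
    while lo < hi:
        mid = (lo + hi + 1) // 2             # bisect the current interval
        a1, a2 = x1 >= mid, x2 >= mid        # one bit from each party
        transcript += [a1, a2]
        if a1 != a2:                         # answers differ: f is decided
            return transcript, (1 if a1 else 2)
        lo, hi = (mid, hi) if a1 else (lo, mid - 1)
    return transcript, 1                     # x1 == x2, so x1 >= x2

print(bisection(2, 3, 2))   # the slide's example input
```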

  17. Perfectly Private Protocols
  • Protocol P for f is perfectly private with respect to party 1 if f(x1, x2) = f(x’1, x2) ⟹ s(x1, x2) = s(x’1, x2).
  • Similarly, perfectly private wrt party 2.
  • P achieves perfect subjective privacy if it is perfectly private wrt both parties.
  • P achieves perfect objective privacy if f(x1, x2) = f(x’1, x’2) ⟹ s(x1, x2) = s(x’1, x’2).

  18. Ideal Monochromatic Partitions
  • The ideal monochromatic partition of A(f) consists of the maximal monochromatic regions.
  • This partition is unique.
  [Pictured: the ideal monochromatic partition of the millionaires’ matrix A(f) from slide 14: the 1-region {(x1, x2) : x1 ≥ x2} and the 2-region {(x1, x2) : x1 < x2}.]
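
Since regions may be arbitrary subsets of entries, the maximal monochromatic regions are exactly the level sets of f, so the ideal partition can be computed by grouping cells by value. A sketch:

```python
from collections import defaultdict
from itertools import product

def ideal_partition(f, inputs):
    """Maximal monochromatic regions of A(f): group cells by f-value."""
    regions = defaultdict(set)
    for x1, x2 in product(inputs, inputs):
        regions[f(x1, x2)].add((x1, x2))
    return regions

mill = lambda x1, x2: 1 if x1 >= x2 else 2
for value, region in ideal_partition(mill, range(4)).items():
    print(f"value {value}: region of size {len(region)}")
# value 1: region of size 10;  value 2: region of size 6
```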

  19. Characterization of Perfect Privacy
  • Protocol P for f is perfectly privacy-preserving iff the tiling induced by P is the ideal monochromatic partition of A(f).
  [Pictured: the ideal monochromatic partition of the 2nd-price auction matrix from slide 6, with entries (winner, price).]
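
The characterization is mechanical to test: group inputs by transcript, group them by output, and compare the two partitions. A self-contained sketch for the English auction of slide 7 (my round-structure and tie-break assumptions as before):

```python
from collections import defaultdict
from itertools import product

def partition_by(key_fn, cells):
    """Partition of the input matrix induced by grouping on key_fn."""
    groups = defaultdict(set)
    for c in cells:
        groups[key_fn(c)].add(c)
    return {frozenset(g) for g in groups.values()}

def outcome(c):
    b1, b2 = c
    return (1, b2) if b1 >= b2 else (2, b1)  # tie to bidder 1 (assumption)

def english_transcript(c):
    b1, b2 = c
    t, p = [], 1
    while True:
        t.append(b2 >= p)                    # bidder 2 answers first
        if not t[-1]:
            return tuple(t)
        t.append(b1 >= p)
        if not t[-1]:
            return tuple(t)
        p += 1

cells = list(product(range(4), repeat=2))
# True: transcript classes == maximal monochromatic regions, i.e. perfect privacy
print(partition_by(english_transcript, cells) == partition_by(outcome, cells))
```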

  20. Objective PAR (1)
  • Privacy with respect to an outside observer (e.g., the auctioneer).
  • Worst-case objective PAR of protocol P for function f:
    MAX over (x1, x2) of |R^I(x1, x2)| / |R^P(x1, x2)|,
    where R^I(x1, x2) and R^P(x1, x2) are the regions containing (x1, x2) in the ideal monochromatic partition of A(f) and in the tiling induced by P, respectively.
  • Worst-case PAR of f is the minimum, over all P for f, of the worst-case PAR of P.

  21. Objective PAR (2)
  • Average-case objective PAR of P for f wrt distribution D on {0,1}^k × {0,1}^k:
    E_D [ |R^I(x1, x2)| / |R^P(x1, x2)| ]
  • Average-case PAR of f is the minimum, over all P for f, of the average-case PAR of P.
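
For small k, both PARs can be evaluated by brute force. A sketch that computes the objective PARs of the bisection protocol (slide 16) for the millionaires' problem with k = 2:

```python
from collections import Counter
from fractions import Fraction
from itertools import product

def bisection_transcript(x1, x2, k=2):
    lo, hi, t = 0, 2 ** k - 1, []
    while lo < hi:
        mid = (lo + hi + 1) // 2
        a1, a2 = x1 >= mid, x2 >= mid
        t += [a1, a2]
        if a1 != a2:
            break
        lo, hi = (mid, hi) if a1 else (lo, mid - 1)
    return tuple(t)

mill = lambda x1, x2: 1 if x1 >= x2 else 2
cells = list(product(range(4), repeat=2))
ideal = Counter(mill(*c) for c in cells)                  # |R^I(x1, x2)|
tiles = Counter(bisection_transcript(*c) for c in cells)  # |R^P(x1, x2)|
ratios = [Fraction(ideal[mill(*c)], tiles[bisection_transcript(*c)])
          for c in cells]
print(max(ratios), sum(ratios) / len(cells))  # worst 10, average 11/2 for k = 2
```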

  22. Bisection Auction Protocol (BAP) [Grigorieva, Herings, Muller, & Vermeulen (ORL ’06)] • Bisection protocol on [0, 2^k − 1] to find an interval [L, H] that contains the lower bid but not the higher bid. • Bisection protocol on [L, H] to find the lower bid p. • Sell the item to the higher bidder for price p.

  23. Bisection Auction Protocol (BAP)
  [Pictured: A(f) for bids 0-7 (k = 3); example input: f(7, 4).]
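
A sketch of BAP under my reading of the two phases above (ties go to bidder 1), run on the slide's example:

```python
def bap(b1: int, b2: int, k: int):
    """Bisection auction; returns (winner, price)."""
    lo, hi = 0, 2 ** k - 1
    while True:                              # phase 1: find the winner
        if lo == hi:
            return 1, lo                     # equal bids: tie to bidder 1
        mid = (lo + hi + 1) // 2
        a1, a2 = b1 >= mid, b2 >= mid
        if a1 != a2:
            winner = 1 if a1 else 2
            loser, hi = (b2 if a1 else b1), mid - 1
            break
        lo, hi = (mid, hi) if a1 else (lo, mid - 1)
    while lo < hi:                           # phase 2: bisect for the price
        mid = (lo + hi + 1) // 2
        lo, hi = (mid, hi) if loser >= mid else (lo, mid - 1)
    return winner, lo                        # price = lower bid

print(bap(7, 4, 3))   # (1, 4): bidder 1 wins and pays 4
```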

  24. Objective PARs for BAP(k)
  • Theorem: The average-case objective PAR of BAP(k) with respect to the uniform distribution is k/2 + 1.
  • Observation: The worst-case objective PAR of BAP(k) is at least 2^(k/2).
  • Conjecture: The average-case objective PAR of 2nd-Price-Auction(k) is linear in k wrt all distributions.

  25. Proof (1)
  • a_k ≜ number of rectangles in the tiling induced by BAP(k).
  • a_0 = 1 and a_k = 2·a_(k−1) + 2^k ⟹ a_k = (k+1)·2^k.
  [Pictured: the monochromatic tiling induced by the Bisection Auction Protocol for k = 4, on [0, 2^k − 1] × [0, 2^k − 1].]
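
The recurrence and its closed form are easy to sanity-check numerically (a sketch):

```python
a = 1                                 # a_0 = 1
for k in range(1, 11):
    a = 2 * a + 2 ** k                # a_k = 2*a_(k-1) + 2^k
    assert a == (k + 1) * 2 ** k      # closed form: a_k = (k+1)*2^k
print("a_k = (k+1)*2^k holds for k <= 10")
```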

  26. Proof (2)
  • R ≜ {R_1, …, R_(a_k)}, the set of rectangles in the BAP(k) tiling.
  • R^I_s ≜ the rectangle in the ideal partition that contains R_s.
  • j_s ≜ 2^k − |R^I_s|
  • b_k ≜ Σ_s j_s

  27. Proof (3)
  PAR = (1/2^(2k)) · Σ_(x1,x2) |R^I(x1, x2)| / |R^BAP(k)(x1, x2)|    (+)
      = (1/2^(2k)) · Σ_s Σ_((x1,x2) ∈ R_s) |R^I_s| / |R_s|
      = (1/2^(2k)) · Σ_s |R_s| · (|R^I_s| / |R_s|) = (1/2^(2k)) · Σ_s |R^I_s|
  (each of the |R_s| inputs (x1, x2) in R_s contributes |R^I_s| / |R_s| to (+))

  28. Proof (4)
  • b_k = b_(k−1) + (b_(k−1) + a_(k−1)·2^(k−1)) + Σ_(i=0…2^(k−1)−1) i + Σ_(i=1…2^(k−1)) i
  • b_0 = 0 and b_k = 2·b_(k−1) + (k+1)·2^(2(k−1)) ⟹ b_k = k·2^(2k−1).
  [Pictured: the monochromatic tiling induced by the Bisection Auction Protocol for k = 4.]
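
The same numeric sanity check works for b_k (a sketch):

```python
b = 0                                          # b_0 = 0
for k in range(1, 11):
    b = 2 * b + (k + 1) * 2 ** (2 * (k - 1))   # b_k = 2*b_(k-1) + (k+1)*2^(2(k-1))
    assert b == k * 2 ** (2 * k - 1)           # closed form: b_k = k*2^(2k-1)
print("b_k = k*2^(2k-1) holds for k <= 10")
```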

  29. Proof (5)
  (1/2^(2k)) · Σ_s |R^I_s| = (1/2^(2k)) · Σ_s (2^k − j_s)
    = (1/2^(2k)) · (a_k·2^k − b_k)
    = (1/2^(2k)) · ((k+1)·2^(2k) − k·2^(2k−1))
    = k + 1 − k/2 = k/2 + 1    QED
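
The theorem can also be confirmed end-to-end for small k by enumerating all inputs, grouping them by BAP transcript, and averaging the region-size ratios. A sketch (the transcript function mirrors the BAP sketch from slide 23, ties to bidder 1):

```python
from collections import Counter
from fractions import Fraction
from itertools import product

def bap_transcript(b1, b2, k):
    lo, hi, t = 0, 2 ** k - 1, []
    while lo < hi:
        mid = (lo + hi + 1) // 2
        a1, a2 = b1 >= mid, b2 >= mid
        t += [a1, a2]
        if a1 != a2:                          # winner found: bisect for price
            loser, hi = (b2 if a1 else b1), mid - 1
            while lo < hi:
                mid = (lo + hi + 1) // 2
                t.append(loser >= mid)
                lo, hi = (mid, hi) if t[-1] else (lo, mid - 1)
            break
        lo, hi = (mid, hi) if a1 else (lo, mid - 1)
    return tuple(t)

outcome = lambda b1, b2: (1, b2) if b1 >= b2 else (2, b1)

for k in range(1, 7):
    cells = list(product(range(2 ** k), repeat=2))
    ideal = Counter(outcome(*c) for c in cells)
    tiles = Counter(bap_transcript(*c, k) for c in cells)
    avg = sum(Fraction(ideal[outcome(*c)], tiles[bap_transcript(*c, k)])
              for c in cells) / len(cells)
    assert avg == Fraction(k, 2) + 1
print("average-case objective PAR of BAP(k) = k/2 + 1 for k <= 6")
```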

  30. Bounded Bisection Auction Protocol (BBAP) BBAP(r): • Do (at most) r bisection steps. • If the winner is still unknown, run the ascending English auction protocol on the remaining interval. • Ascending auction protocol = BBAP(0); bisection auction protocol = BBAP(k).
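
A sketch of BBAP(r) with the same assumptions as the BAP sketch; finishing with a price-finding bisection when the winner is found within the r steps is also my assumption:

```python
def bbap(b1: int, b2: int, k: int, r: int):
    """Bounded bisection auction: at most r bisection steps, then English."""
    lo, hi = 0, 2 ** k - 1
    for _ in range(r):                        # at most r bisection steps
        if lo == hi:
            return 1, lo                      # equal bids: tie to bidder 1
        mid = (lo + hi + 1) // 2
        a1, a2 = b1 >= mid, b2 >= mid
        if a1 != a2:                          # winner known: bisect for price
            loser, hi = (b2 if a1 else b1), mid - 1
            while lo < hi:
                mid = (lo + hi + 1) // 2
                lo, hi = (mid, hi) if loser >= mid else (lo, mid - 1)
            return (1 if a1 else 2), lo
        lo, hi = (mid, hi) if a1 else (lo, mid - 1)
    for p in range(lo + 1, hi + 1):           # ascending English auction
        if b2 < p:
            return 1, p - 1                   # bidder 2 drops first
        if b1 < p:
            return 2, p - 1
    return 1, hi                              # b1 == b2 == hi

print(bbap(7, 4, 3, 0), bbap(7, 4, 3, 3))    # BBAP(0) vs BBAP(k): both (1, 4)
```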

  31. Average-Case Objective PARs for 2nd-Price Auction Protocols
  [Pictured: a table comparing the average-case objective PARs of the ascending, bounded-bisection, and bisection auction protocols.]

  32. Subjective PARs • Objective privacy = privacy wrt an outside observer • Subjective privacy = privacy wrt the other party • In the millionaires’ problem, we (mainly) care about subjective privacy. • Similar definitions.

  33. Subjective PARs (1)
  • The 1-partition of region R in matrix A(f): { R_x1 = {x1} × {x2 s.t. (x1, x2) ∈ R} } (similarly, 2-partition).
  • The i-induced tiling of protocol P for f is obtained by i-partitioning each rectangle in the tiling induced by P.
  • The i-ideal monochromatic partition of A(f) is obtained by i-partitioning each region in the ideal monochromatic partition of A(f).

  34. Subjective PARs (1)
  The 1-partition of region R in matrix A(f): { R_x1 = {x1} × {x2 s.t. (x1, x2) ∈ R} } (similarly, 2-partition).
  Example (millionaires’ problem, k = 2, the 1-partition of the ideal value-2 region):
  R^I_1(0, 1) = R^I_1(0, 2) = R^I_1(0, 3) and R^I_1(1, 2) = R^I_1(1, 3).
  (R^P_i is defined analogously for protocol P.)
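
A sketch reproducing this example by 1-partitioning the value-2 ideal region of the millionaires' matrix:

```python
def one_partition(region):
    """Split a region R into rows R_x1 = {x1} x {x2 : (x1, x2) in R}."""
    rows = {}
    for x1, x2 in region:
        rows.setdefault(x1, set()).add((x1, x2))
    return rows

region2 = {(x1, x2) for x1 in range(4) for x2 in range(4) if x1 < x2}
for x1, part in sorted(one_partition(region2).items()):
    print(x1, sorted(part))
# 0 [(0, 1), (0, 2), (0, 3)]   -> R^I_1(0,1) = R^I_1(0,2) = R^I_1(0,3)
# 1 [(1, 2), (1, 3)]           -> R^I_1(1,2) = R^I_1(1,3)
# 2 [(2, 3)]
```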

  35. Subjective PARs (2)
  • Worst-case PAR of protocol P for f wrt party i:
    MAX over (x1, x2) of |R^I_i(x1, x2)| / |R^P_i(x1, x2)|
  • Worst-case subjective PAR of P for f: maximize over i ∈ {1, 2}.
  • Worst-case subjective PAR of f: minimize over P.
  • Average-case subjective PAR wrt distribution D: use E_D instead of MAX.
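
These quantities can again be computed by brute force; a sketch, using an illustrative full-revelation protocol in which party 1 simply announces x1:

```python
from collections import Counter
from fractions import Fraction
from itertools import product

def worst_subjective_par(f, transcript_fn, inputs, i):
    """max over (x1, x2) of |R_i^I(x1, x2)| / |R_i^P(x1, x2)|."""
    cells = list(product(inputs, inputs))
    own = (lambda c: c[0]) if i == 1 else (lambda c: c[1])
    ideal = Counter((own(c), f(*c)) for c in cells)              # i-ideal parts
    tiles = Counter((own(c), transcript_fn(*c)) for c in cells)  # i-induced parts
    return max(Fraction(ideal[own(c), f(*c)], tiles[own(c), transcript_fn(*c)])
               for c in cells)

mill = lambda x1, x2: 1 if x1 >= x2 else 2
reveal = lambda x1, x2: (x1, x1 >= x2)      # party 1 reveals x1 outright
print(worst_subjective_par(mill, reveal, range(4), i=1),   # 1: nothing extra
      worst_subjective_par(mill, reveal, range(4), i=2))   # 4: x1 fully exposed
```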

  36. Average-Case PARs for the Millionaires’ Problem
  [Pictured: a table of average-case PARs for millionaires’-problem protocols.]

  37. Other Results • More PARs for these problems. • PARs of other problems • public-good • truthful-public-good [Babaioff-Blumrosen-Naor-Schapira] • set-disjointness • set-intersection • Other notions of privacy: first steps • Semantic definitions (what is better, {1, 8} or {4, 5}?) • Entropy-based definitions

  38. Open Problems • Upper bounds on non-uniform average-case PARs • Prove/refute our conjecture! • Lower bounds on average-case PARs • PARs of other functions of interest • Extension to the n-party case • Other definitions of PAR • We take first steps in this direction. • Relationship between PARs and h-privacy [Bar-Yehuda, Chor, Kushilevitz, and Orlitsky (IEEE-IT ’93)]

  39. Thank You
