
Foundations of Privacy Lecture 7



  1. Foundations of Privacy, Lecture 7. Lecturer: Moni Naor

  2. Recap of last week's lecture • Counting Queries • Hardness Results • Tracing Traitors and hardness results for general (non-synthetic) databases

  3. Can We Output a Synthetic DB Efficiently? Not in general.

                      |C| subpoly      |C| poly
      |U| subpoly     ?                ? (general algorithm)
      |U| poly        ?                Signatures; hard on avg. using PRFs

  4. General output sanitizers. Theorem: Traitor-tracing schemes exist if and only if sanitizing is hard. There is a tight connection between the |U|, |C| that are hard to sanitize and the key and ciphertext sizes in traitor tracing. The separation between efficient and non-efficient sanitizers uses the [BoSaWa] scheme.

  5. Traitor Tracing: The Problem • The center transmits a message to a large group • Some users leak their keys to pirates • The pirates construct a clone: an unauthorized decryption device • Given a pirate box, we want to find who leaked the keys. [Figure: a pirate box built from leaked keys K1, K3, K8 decrypts E(Content) to Content; the traitors' "privacy" is violated!]

  6. Traitor Tracing ⇒ Hard Sanitizing. A (private-key) traitor-tracing scheme consists of algorithms Setup, Encrypt, Decrypt and Trace. Setup: generates a key bk for the broadcaster and N subscriber keys k_1, …, k_N. Encrypt: given a bit b, generates a ciphertext using the broadcaster's key bk. Decrypt: takes a given ciphertext and, using any of the subscriber keys, retrieves the original bit. Trace: gets bk and oracle access to a pirate decryption box, and outputs an index i ∈ {1, …, N} of a key k_i used to create the pirate box. Need semantic security!

  7. Simple Example of Tracing Traitors • Let E_K(m) be a good shared-key encryption scheme • Key generation: generate independent keys for E: bk = k_1, …, k_N • Encrypt: for bit b, generate independent ciphertexts E_{k_1}(b), E_{k_2}(b), …, E_{k_N}(b) • Decrypt: using k_i, decrypt the i-th ciphertext • Tracing algorithm: a hybrid argument. Properties: ciphertext length N, key length 1.
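
A minimal runnable sketch of this construction and its hybrid tracer. For illustration only, E_K is instantiated by masking the bit with one PRF output bit (HMAC-SHA256 at a fresh nonce), and the pirate box built from user 3's leaked key is an assumed example; none of these names come from the lecture.

```python
import hashlib
import hmac
import os

def enc_bit(key: bytes, b: int):
    """Encrypt one bit: mask it with the low bit of HMAC-SHA256 at a fresh nonce."""
    nonce = os.urandom(16)
    mask = hmac.new(key, nonce, hashlib.sha256).digest()[0] & 1
    return nonce, b ^ mask

def dec_bit(key: bytes, ct) -> int:
    nonce, masked = ct
    return masked ^ (hmac.new(key, nonce, hashlib.sha256).digest()[0] & 1)

N = 8
keys = [os.urandom(16) for _ in range(N)]            # bk = (k_1, ..., k_N)

def encrypt(b: int):
    return [enc_bit(k, b) for k in keys]             # one component per subscriber

def pirate_box(cts):                                 # built from user 3's leaked key
    return dec_bit(keys[3], cts[3])

def trace(box, trials: int = 200):
    """Hybrid argument: in hybrid i the first i components encrypt 1, the rest 0.
    The box's decryption probability can jump between consecutive hybrids only
    at a component whose key the box actually uses."""
    prev = 0.0
    for i in range(N + 1):
        hits = sum(box([enc_bit(keys[j], int(j < i)) for j in range(N)])
                   for _ in range(trials))
        frac = hits / trials
        if frac - prev > 0.5:
            return i - 1                             # key i-1 created the box
        prev = frac

print(trace(pirate_box))                             # expected: 3
```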

  8. Equivalence of TT and Hardness of Sanitizing

      Traitor Tracing                 Sanitizing hard for a distribution of DBs
      (collection of) key        ↔    database entry
      (collection of) ciphertext ↔    query
      TT pirate                  ↔    sanitizer

  9. Traitor Tracing ⇒ Hard Sanitizing. Theorem: If there exists a TT scheme with ciphertext length c(n) and key length k(n), one can construct: a query set C of size ≈ 2^{c(n)}, a data universe U of size ≈ 2^{k(n)}, and a distribution D on n-user databases with entries from U such that D is "hard to sanitize": there exists a tracer that can extract an entry of the database from any sanitizer's output, violating its privacy! The separation between efficient and non-efficient sanitizers uses the [BoSaWa06] scheme.

  10. Interactive Model. [Figure: an analyst issues query 1, query 2, … to the sanitizer, which sits in front of the data.] Multiple queries, chosen adaptively.

  11. Counting Queries: answering queries interactively. Database D of size n. Counting queries: C is a set of predicates c: U → {0,1}. Query: how many participants of D satisfy c? Relaxed accuracy: answer each query within α additive error w.h.p. Not so bad: such error is anyway inherent in statistical analysis. Interactive setting: queries are given one by one and should be answered on the fly.

  12. Can we answer queries when they are not given in advance? • We can always answer with independent noise per query, but that limits us to a number of queries smaller than the database size. • We do not know the future, but we do know the past! We can answer based on past answers, as in the baseline sketch below.
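
First, the baseline: a minimal sketch (function and variable names are mine, not the lecture's) of answering k counting queries with independent Laplace noise, where basic composition forces the per-query noise to grow linearly in k.

```python
import numpy as np

rng = np.random.default_rng(0)

def answer_with_independent_noise(db, predicates, eps):
    """Answer k counting queries, each with fresh Laplace noise.
    Basic composition splits the budget: each query runs with eps/k,
    so the noise scale k/eps grows with the number of queries."""
    k = len(predicates)
    answers = []
    for c in predicates:
        true_count = sum(int(c(row)) for row in db)   # sensitivity 1 per query
        answers.append(true_count + rng.laplace(scale=k / eps))
    return answers

db = rng.integers(0, 100, size=500)                   # toy records from U = {0..99}
preds = [lambda u, t=t: u < t for t in (25, 50, 75)]
print(answer_with_independent_noise(db, preds, eps=1.0))
```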

  13. Idea: Maintain a List of Possible Databases • Start with D_0 = the list of all databases of size m; it gradually shrinks • In each round j: if the list D_{j-1} is representative, answer according to the average database in the list; otherwise, prune the list (D_{j-1} → D_j) to maintain consistency.

  14. The algorithm. Initialize D_0 = {all databases of size m over U}; the input is x*. In round j we have D_{j-1} = {x_1, x_2, …}, each x_i of size m. For each query c_1, c_2, …, c_k in turn:
      • Let A_j ← Average over x_i in D_{j-1} of min{ dist_{c_j}(x*, x_i), T }. (Clipping at T keeps the sensitivity low!)
      • If A_j is small: answer according to the median database in D_{j-1}, and set D_j ← D_{j-1}.
      • If A_j is large: give the true answer according to x*, plus noise, and remove all databases that are far away, obtaining D_j.
      The comparison is against a noisy threshold T ≈ n^{1-γ}; in fact two threshold values, around T/2, are needed.
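
A toy, runnable version of this loop over a small universe. All names are mine, and the single crisp T/2 test stands in for the slide's two noisy thresholds; this is a sketch of the mechanism's shape, not a calibrated implementation.

```python
import numpy as np
from itertools import combinations_with_replacement

rng = np.random.default_rng(1)

def interactive_mechanism(x_star, U, m, queries, T, eps):
    """Answer queries while maintaining the list D of all candidate
    size-m databases over the universe U."""
    n = len(x_star)
    frac = lambda x, c: sum(c(u) for u in x) / len(x)     # fractional count
    D = list(combinations_with_replacement(U, m))         # D_0: every size-m multiset
    answers = []
    for c in queries:
        true_frac = frac(x_star, c)
        # Average clipped distance to the true database; changing one element
        # of x* moves this by at most 1/n, hence the low sensitivity.
        A = np.mean([min(abs(frac(x, c) - true_frac), T) for x in D])
        A += rng.laplace(scale=1.0 / (eps * n))
        if A < T / 2:                  # small round: answer from the median candidate
            answers.append(float(np.median([frac(x, c) for x in D])))
        else:                          # large round: true answer + noise, then prune
            noisy = true_frac + rng.laplace(scale=1.0 / (eps * n))
            answers.append(noisy)
            D = [x for x in D if abs(frac(x, c) - noisy) < T]
            assert D, "the list never empties: some x approximates x* on all queries"
    return answers

x_star = [0, 1, 1, 3, 5, 5, 2, 0]                         # database over U = {0..5}
qs = [lambda u, t=t: u <= t for t in (1, 3)]              # two counting queries
print(interactive_mechanism(x_star, U=range(6), m=3, queries=qs, T=0.5, eps=5.0))
```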

  15. Need to show. Accuracy and functionality: • the result is accurate • if A_j is large, many of the x_i ∈ D_{j-1} are removed • D_j is never empty. Privacy: • there are not many large rounds • we can release the identity of the large rounds • we can release noisy answers in the large rounds. How much noise should we add? Magnitude proportional to (# of large rounds)/ε.

  16. The number of large rounds is bounded • If A_j is large, there must be many sets whose value on c_j differs from the real value by close to T • Assuming the released answer (true answer with added noise) is close to the true answer: many sets are far from the released answer • Therefore many sets are pruned: a constant fraction of the list in each large round. Since the size of D_0 is (|U| choose m), the total number of large rounds is at most ≈ m log|U|. [Figure: a number line from 0 to T marking the released answer, the small/large threshold, and the threshold for "far".]
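
The counting argument in one line, under the slide's assumption that each large round prunes at least a constant fraction (say half) of the surviving list:

```latex
|D_0| \le |U|^m, \qquad |D_r| \le 2^{-r}\,|D_0| \text{ after } r \text{ large rounds}
\;\Longrightarrow\; 1 \le |D_r| \le 2^{-r}|U|^m
\;\Longrightarrow\; r \le m \log_2 |U| .
```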

  17. Why is there a good x_i? Database x* of size n; counting queries C, a set of predicates c: U → {0,1}; a query asks how many participants of x* satisfy c. Claim: a sample x of size m approximates x* on all the given queries c_1, c_2, …, c_k. The sample can be taken from x* itself, since we start with all possible sets; the proof is existential, so we need not know c_1, c_2, …, c_k in advance.

  18. Size m is Õ(n^{2/3} log k). For any c_1, c_2, …, c_k there exists a set x of size m = Õ((n/α)² · log k) such that max_j dist_{c_j}(x, x*) ≤ α, where dist_{c_j}(x, x*) = |(1/m)⟨c_j, x⟩ − (1/n)⟨c_j, x*⟩|. For α = Õ(n^{2/3} log k) this gives m = Õ(n^{2/3} log k).
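
A sketch of the standard argument behind this bound (sampling plus Chernoff-Hoeffding and a union bound; constants suppressed). Draw x as m i.i.d. samples from x*; for a fixed query c_j, with the additive error α measured in counts (so α/n as a fraction),

```latex
\Pr\left[\;\Big|\tfrac{1}{m}\langle c_j, x\rangle - \tfrac{1}{n}\langle c_j, x^{*}\rangle\Big| > \tfrac{\alpha}{n}\;\right]
\;\le\; 2\,e^{-2(\alpha/n)^{2} m},
```

and a union bound over the k queries leaves positive success probability as soon as m = Ω((n/α)² log k); some such x therefore exists.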

  19. Why can we release results in the large rounds? • We do not expect more than O(m log|U|) large rounds • Make the threshold noisy. For every pair of neighboring databases x* and x'*: consider the vector of threshold noises, of length k • If a point is far from the threshold: the outcome is the same in both • If it is close to the threshold: we can correct it at a bounded privacy cost, and this cannot occur too frequently.

  20. Privacy of the Identity of the Large Rounds. Protect: the times of the large rounds. For every pair of neighboring databases x* and x'*, pair up the noise-threshold vectors
      η_1 η_2 … η_{k-1} η_k η_{k+1} …   and   η_1 η_2 … η_{k-1} η'_k η_{k+1} …
      For only a few, O(m log|U|), points is x* above the threshold while x'* is below; there the threshold value can be corrected by setting η'_k = η_k + 1, giving a probability ratio ≈ e^ε.

  21. Summary of Algorithms. Three algorithms • BLR [Blum-Ligett-Roth] • DNRRV [Dwork-Naor-Reingold-Rothblum-Vadhan] • RR [Roth-Roughgarden] (with help from HR [Hardt-Rothblum])

  22. What if the data is dynamic? • Want to handle situations where the data keeps changing • Not all data is available at the time of sanitization. [Figure: curator/sanitizer.]

  23. Google Flu Trends “We've found that certain search terms are good indicators of flu activity. Google Flu Trends uses aggregated Google search data to estimate current flu activity around the world in near real-time.”

  24. Example of Utility: Google Flu Trends

  25. What if the data is dynamic? • Want to handle situations where the data keeps changing • Not all data is available at the time of sanitization. Issues • When does the algorithm produce an output? • What does the adversary get to examine? • How do we define the individual we should protect? (D + Me) • What are the efficiency measures of the sanitizer?

  26. Data Streams • Data is a stream of items • The sanitizer sees each item and updates its internal state • It produces an output: either on-the-fly or at the end. [Figure: data stream → sanitizer (state) → output.]

  27. Three new issues/concepts • Continual observation: the adversary gets to examine the output of the sanitizer all the time • Pan privacy: the adversary gets to examine the internal state of the sanitizer. Once? Several times? All the time? • "User" vs. "event" level protection: are the items "singletons", or are they related?

  28. Randomized Response • Randomized Response Technique [Warner 1965], a method for polling on stigmatizing questions • Idea: lie with a known probability • Specific answers are deniable, yet aggregate results are still valid • The data is never stored "in the plain": "trust no-one". Popular in the DB literature [Mishra and Sandler]. [Figure: bits 1, 0, 1, each perturbed by noise before being reported.]
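
A minimal sketch of the technique; the flip probability 3/4 is an illustrative choice (it gives ε = ln 3 local differential privacy), not a parameter from the lecture.

```python
import random

def randomized_response(true_bit: int, p: float = 0.75) -> int:
    """Report the true bit with probability p, lie otherwise.
    p = 3/4 gives eps = ln(p/(1-p)) = ln 3 local differential privacy."""
    return true_bit if random.random() < p else 1 - true_bit

def estimate_fraction(reports, p: float = 0.75) -> float:
    """Debias the aggregate: E[report] = f*p + (1-f)*(1-p)."""
    mean = sum(reports) / len(reports)
    return (mean - (1 - p)) / (2 * p - 1)

truth = [random.random() < 0.3 for _ in range(100_000)]   # 30% "yes" answers
reports = [randomized_response(int(b)) for b in truth]
print(estimate_fraction(reports))                         # close to 0.3
```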

  29. The Dynamic Privacy Zoo. [Diagram: a petting zoo of nested notions: Randomized Response, Differentially Private, Pan Private, Differentially Private under Continual Observation, User-Level Continual Observation Pan Private, User-Level Private.]

  30. Continual Output Observation • Data is a stream of items • The sanitizer sees each item and updates its internal state • It produces an output observable to the adversary. [Figure: stream → sanitizer (state) → output.]

  31. Continual Observation • Alg: an algorithm working on a stream of data, mapping prefixes of the data stream to outputs; at step i it outputs σ_i • Adjacent data streams: one can be obtained from the other by changing a single element, e.g. S = acgtbxcde and S' = acgtbycde • Alg is ε-differentially private against continual observation if for all adjacent data streams S and S', and for every prefix of outputs σ_1 σ_2 … σ_t,
      e^{-ε} ≤ Pr[Alg(S) = σ_1 σ_2 … σ_t] / Pr[Alg(S') = σ_1 σ_2 … σ_t] ≤ e^{ε} ≈ 1 + ε.

  32. The Counter Problem. 0/1 input stream: 011001000100000011000000100101… Goal: a publicly observable counter approximating the total number of 1's so far. Continual output: in each time period, output the total number of 1's. We want to hide individual increments while providing reasonable accuracy.

  33. Counters with Continual Output Observation • Data is a stream of 0/1 values • The sanitizer sees each x_i and updates its internal state • It produces a value observable to the adversary. [Figure: input stream 1 0 0 1 0 0 1 1 0 0 0 1; outputs 1 1 1 2 ….]

  34. Counters with Continual Output Observation • Continual output: in each time period, output the total number of 1's (T = total number of time periods) • Initial idea: in each time period, on input x_i ∈ {0,1}, update the counter by x_i and add independent Laplace noise of magnitude 1/ε • Privacy: each increment is protected by its own Laplace noise, so the output is differentially private whether x_i is 0 or 1 • Accuracy: the noises partly cancel out, giving error Õ(√T). For sparse streams this error is too high. [Figure: the Laplace distribution.]
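
A sketch of this initial idea (the function name is mine):

```python
import numpy as np

rng = np.random.default_rng(2)

def naive_counter(stream, eps):
    """At each step, add the new bit plus fresh Laplace(1/eps) noise to the
    published counter. A single increment x_i is hidden by its own noise term,
    but the published value carries a sum of t independent noises at step t,
    so the error grows like sqrt(T)/eps."""
    noisy_count, outputs = 0.0, []
    for x in stream:
        noisy_count += x + rng.laplace(scale=1.0 / eps)
        outputs.append(noisy_count)
    return outputs

stream = [1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 1]
print(naive_counter(stream, eps=1.0))
```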

  35. Why So Inaccurate? • We operate essentially as in randomized response, with no utilization of the state • Problem: we do the same operations when the stream is sparse as when it is dense; we would like to act differently when the stream is dense • The times at which the counter is updated are themselves potential leakage.

  36. Delayed Updates. Main idea: update the output value only when there is a large gap between the actual count and the current output. We have a good way of outputting the value of the counter once: the actual counter plus noise. Maintain: the actual count A_t (+ noise), the current output out_t (+ noise), and an update threshold D.

  37. Delayed Output Counter. Out_t: the current output; A_t: the count since the last update; D_t: a noisy threshold.
      If A_t − D_t > fresh noise then:
         Out_{t+1} ← Out_t + A_t + fresh noise
         A_{t+1} ← 0
         D_{t+1} ← D + fresh noise
      Noise: independent Laplace noise of magnitude 1/ε.
      Accuracy: for threshold D, w.h.p. we update about N/D times, so the total error is (N/D)^{1/2}·noise + D + noise + noise. Setting D = N^{1/3} gives accuracy ~ N^{1/3}.
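
The same update rule as runnable code (a sketch; names are mine):

```python
import numpy as np

rng = np.random.default_rng(3)

def delayed_counter(stream, eps, D):
    """Delayed-update counter: keep the published value fixed and refresh it
    only when the count since the last update crosses a noisy threshold."""
    lap = lambda: rng.laplace(scale=1.0 / eps)
    out, A = 0.0, 0                 # current output; count since last update
    threshold = D + lap()           # noisy threshold D_t
    outputs = []
    for x in stream:
        A += x
        if A - threshold > lap():   # compare against fresh noise
            out = out + A + lap()   # publish: old output + recent count + noise
            A = 0
            threshold = D + lap()   # re-randomize the threshold
        outputs.append(out)
    return outputs

print(delayed_counter([0] * 50 + [1] * 20, eps=1.0, D=5))
```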

  38. Privacy of the Delayed Output. Recall the update rule: if A_t − D_t > fresh noise, then Out_{t+1} ← Out_t + A_t + fresh noise and D_{t+1} ← D + fresh noise. Protect: both the update times and the update values. For any two adjacent sequences, e.g. 101101110001 and 101101010001, we can pair up noise vectors
      η_1 η_2 … η_{k-1} η_k η_{k+1} …   and   η_1 η_2 … η_{k-1} η'_k η_{k+1} …
      identical in all locations except one: η'_k = η_k + 1 at the first update after the difference occurred (where D_t and D'_t diverge). The probability ratio is ≈ e^ε.

  39. Dynamic from Static. Idea: apply the conversion of static algorithms into dynamic ones [Bentley-Saxe 1980] • Run many accumulators in parallel; each accumulator counts the number of 1's, plus noise, in a fixed segment of time (each accumulator measures only while the stream is inside its time frame, and only finished segments are used) • The value of the output counter at any point in time is the sum of the accumulators of a few segments • Accuracy: depends on the number of segments in the summation and on the accuracy of the accumulators • Privacy: depends on the number of accumulators that a single point x_t influences.

  40. The Segment Construction. Based on the binary representation of t: each point t lies in ⌈log t⌉ segments, and the prefix sum Σ_{i=1}^{t} x_i is the sum of at most log t accumulators. Setting ε' ≈ ε / log T yields the desired privacy. Accuracy: with all but negligible (in T) probability, the error at every step t is at most O((log^{1.5} T)/ε), thanks to the partial canceling of the noises.
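
A self-contained sketch of the segment construction (function and variable names are mine). Each dyadic segment gets one noisy accumulator when it finishes; the count at time t sums the at most log t finished segments given by t's binary digits, and each item influences at most one segment per level.

```python
import numpy as np

rng = np.random.default_rng(4)

def tree_counter(stream, eps):
    """Binary-tree counter: one noisy accumulator per finished dyadic segment.
    Each item falls in <= `levels` segments, so each accumulator runs with
    eps' = eps/levels for eps-DP overall."""
    T = len(stream)
    levels = max(1, T.bit_length())
    eps_prime = eps / levels
    noisy_seg = {}                    # (level, index) -> noisy count of that segment
    partial = [0] * levels            # exact running sum of the open segment per level
    outputs = []
    for t, x in enumerate(stream, start=1):
        for lvl in range(levels):
            partial[lvl] += x
            if t % (1 << lvl) == 0:   # the level-lvl segment ending at t just finished
                noisy_seg[(lvl, t >> lvl)] = partial[lvl] + rng.laplace(scale=1.0 / eps_prime)
                partial[lvl] = 0
        # decompose [1, t] into the finished dyadic segments given by t's binary digits
        est = 0.0
        for lvl in range(levels):
            if (t >> lvl) & 1:
                est += noisy_seg[(lvl, t >> lvl)]
        outputs.append(est)
    return outputs

stream = [1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 1]
print(tree_counter(stream, eps=1.0))   # noisy estimates of the running count 1, 1, 1, 2, ...
```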

  41. Synthetic Counter. The counter can be made synthetic: • it is monotone • in each round the counter goes up by at most 1. The same approach applies to any monotone function.

  42. The Dynamic Privacy Zoo, revisited. [Diagram: the privacy petting zoo: Differentially Private Outputs, Privacy under Continual Observation, Pan Privacy, Continual Pan Privacy, Sketch vs. Stream, User-Level Privacy.]
