
Quantifying Location Privacy



Presentation Transcript


  1. Quantifying Location Privacy. Reza Shokri, George Theodorakopoulos, Jean-Yves Le Boudec, Jean-Pierre Hubaux. May 2011.

  2. A location trace is not only a set of positions on a map. The contextual information attached to a trace tells much about our habits, interests, activities, and relationships.

  3. envisioningdevelopment.net/map

  4. Location-Privacy Protection: distort location information before exposing it to others.

  5. Location-Privacy Protection
  • Anonymization (pseudonymization): replacing the actual username with a random identity
  • Location Obfuscation: hiding the location, adding noise, reducing precision
  A common formal framework is MISSING. How to evaluate/compare various protection mechanisms? Which metric to use?
  [Figure: the same location shown in its original form and in low-accuracy and low-precision versions; pictures from Krumm 2007]

  6. Location Privacy: A Probabilistic Framework

  7. Location-Privacy Preserving Mechanism
  [Diagram: the attacker's knowledge construction (KC) turns past traces (vectors of noisy/missing events) of users u_1 ... u_N into mobility profiles, i.e., Markov-chain transition matrices with entries P_ij between regions r_i and r_j. The LPPM turns the actual traces (vectors of actual events) into observed traces (vectors of observed events) under pseudonyms 1 ... N via obfuscation and anonymization. The attack uses the observed traces and the profiles to produce reconstructed traces. Timeline: 1, 2, 3, 4, ..., T.]

  8. Location-Privacy Preserving Mechanism
  Location-Obfuscation Function: hiding, reducing precision, adding noise, location generalization, ...
  A probabilistic mapping of a location to a set of locations.
  [Figure: Alice's actual location mapped by the LPPM to a set of candidate locations]
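
As a concrete illustration of such a probabilistic mapping, here is a minimal sketch (not the paper's actual LPPM): with some probability the location is hidden entirely, otherwise it is generalized to a set of regions that contains the true one. The function name, the parameters p_hide and k, and the random choice of decoy regions are all illustrative assumptions.

```python
import random

def obfuscate(region, all_regions, p_hide=0.2, k=3):
    """Map an actual region to a set of reported regions (illustrative parameters).

    With probability p_hide the location is hidden entirely; otherwise the
    actual region is generalized to a set of k regions that contains it.
    """
    if random.random() < p_hide:
        return set()  # location hiding: nothing is reported
    # location generalization: report the actual region plus k-1 decoys
    # (decoys are drawn at random here; a real LPPM would typically use nearby cells)
    decoys = random.sample([r for r in all_regions if r != region], k - 1)
    return {region, *decoys}

# Example: the user is in region 6 of a map with regions 1..20
print(obfuscate(6, list(range(1, 21))))  # e.g., {6, 7, 8}
```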

  9. Location-Privacy Preserving Mechanism
  Anonymization Function: replace real usernames with random pseudonyms (e.g., integers 1...N).
  A random permutation of usernames (e.g., Alice → 3, Bob → 1, Charlie → 2).
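
A minimal sketch of this anonymization step, assuming pseudonyms are the integers 1..N; the function name is illustrative.

```python
import random

def anonymize(usernames):
    """Assign each real username a random pseudonym in 1..N (a random permutation)."""
    pseudonyms = list(range(1, len(usernames) + 1))
    random.shuffle(pseudonyms)
    return dict(zip(usernames, pseudonyms))

print(anonymize(["Alice", "Bob", "Charlie"]))  # e.g., {'Alice': 3, 'Bob': 1, 'Charlie': 2}
```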

  10. Location-Privacy Preserving Mechanism
  Spatiotemporal event: <who, when, where>.
  The LPPM applies location obfuscation (per user u) and anonymization: the actual trace of user u (a vector of actual events) becomes the observed trace of user u, published under pseudonym u'.
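
One possible way to represent such events in code, consistent with the obfuscation and anonymization sketches above; the field names are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Union, FrozenSet

@dataclass(frozen=True)
class Event:
    who: Union[str, int]                # real username, or an integer pseudonym
    when: int                           # time instant in 1..T
    where: Union[int, FrozenSet[int]]   # actual region, or a set of regions after obfuscation

# Actual event: Alice was in region 6 at time 3
actual = Event("Alice", 3, 6)
# A possible observed event after the LPPM: pseudonym 2, same time, obfuscated location
observed = Event(2, 3, frozenset({6, 7, 8}))
```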

  11. Adversary Model
  Observation: anonymized and obfuscated traces.
  Knowledge: the users' mobility profiles and the LPPM's anonymization PDF and obfuscation PDF.

  12. Learning Users' Mobility Profiles (adversary knowledge construction)
  From prior knowledge (past traces, i.e., vectors of noisy/missing past events of users u_1 ... u_N), the attacker creates a mobility profile for each user.
  Mobility profile: a Markov chain on the set of locations, with transition probabilities P_ij from region r_i to region r_j.
  Task: estimate the MC transition probabilities P_u.
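
For the simple case where the past traces are complete (no noisy or missing events), the transition probabilities can be estimated by counting transitions and normalizing. A minimal sketch, with illustrative names and a uniform fallback for regions that never appear:

```python
import numpy as np

def estimate_transition_matrix(traces, num_regions):
    """Estimate a user's mobility profile P_u from complete past traces.

    traces: list of past traces, each a list of region indices 0..R-1.
    Returns an R x R matrix with P[i, j] ~ Pr(next region = j | current region = i).
    """
    counts = np.zeros((num_regions, num_regions))
    for trace in traces:
        for r_from, r_to in zip(trace, trace[1:]):
            counts[r_from, r_to] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    # uniform rows for regions that never occur, so the chain stays well defined
    counts[row_sums[:, 0] == 0] = 1.0
    return counts / counts.sum(axis=1, keepdims=True)
```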

  13. Example: Simple Knowledge Construction
  From the prior knowledge (in this example: 100 training traces), the attacker builds Alice's mobility profile.
  How to consider noisy/partial traces? E.g., knowing only the user's location in the morning (her workplace) and her location in the evening (her home).

  14. Learning Users' Mobility Profiles (adversary knowledge construction)
  From prior knowledge, the attacker creates a mobility profile for each user.
  Mobility profile: a Markov chain on the set of locations. Task: estimate the MC transition probabilities P_u.
  Our solution: a Monte-Carlo method, Gibbs sampling, to estimate the probability distribution of the users' mobility profiles.
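
A sketch of the Gibbs-sampling idea for a single user with one partial past trace: alternate between resampling the missing locations given the current chain and resampling each row of the transition matrix from its Dirichlet posterior. This illustrates the general technique under these assumptions; it is not the authors' implementation, and the prior parameter alpha and all names are hypothetical.

```python
import numpy as np

def gibbs_mobility_profile(trace, num_regions, iters=500, alpha=1.0, rng=None):
    """Gibbs sampler (sketch) for one user's transition matrix from a partial trace.

    trace: list of length T with region indices 0..R-1, or None where the event is missing.
    """
    rng = rng or np.random.default_rng()
    R = num_regions
    P = np.full((R, R), 1.0 / R)                       # start from a uniform chain
    filled = [x if x is not None else rng.integers(R) for x in trace]
    samples = []
    for _ in range(iters):
        # (a) resample each missing location given its neighbours in the trace
        for t, x in enumerate(trace):
            if x is not None:
                continue
            w = np.ones(R)
            if t > 0:
                w *= P[filled[t - 1], :]
            if t < len(trace) - 1:
                w *= P[:, filled[t + 1]]
            filled[t] = rng.choice(R, p=w / w.sum())
        # (b) resample each row of P from its Dirichlet posterior given transition counts
        counts = np.zeros((R, R))
        for a, b in zip(filled, filled[1:]):
            counts[a, b] += 1
        P = np.array([rng.dirichlet(alpha + counts[i]) for i in range(R)])
        samples.append(P)
    return np.mean(samples[iters // 2:], axis=0)       # posterior mean after burn-in
```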

  15. Adversary Model: Inference Attack Examples
  (Observation: anonymized and obfuscated traces. Knowledge: users' mobility profiles and the LPPM's anonymization/obfuscation PDFs.)
  • Localization attack: "Where was Alice at 8pm?" What is the probability distribution over locations for user 'Alice' at time '8pm'?
  • Tracking attack: "Where did Alice go yesterday?" What is the most probable trace (trajectory) of user 'Alice' for the period 'yesterday'?
  • Meeting disclosure attack: "How many times did Alice and Bob meet?"
  • Aggregate presence disclosure: "How many users were present at restaurant x at 9pm?"

  16. Inference Attacks
  Computing the attacks directly is computationally infeasible: the anonymization permutation can take N! values.
  Our solution: decoupling de-anonymization from de-obfuscation.

  17. De-anonymization (assigning pseudonyms 1...N to users u_1...u_N)
  1. Compute the likelihood of observing trace 'i' from user 'u', for all 'i' and 'u', using the Hidden Markov Model's Forward-Backward algorithm: O(R^2 N^2 T).
  2. Compute the most likely assignment using a maximum-weight assignment algorithm (e.g., the Hungarian algorithm): O(N^4).
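
A sketch of this two-step de-anonymization, assuming a helper hmm_log_likelihood(trace, profile) that runs the forward pass of the Forward-Backward algorithm (not shown here), and using SciPy's linear_sum_assignment as the maximum-weight assignment step.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def deanonymize(observed_traces, profiles, hmm_log_likelihood):
    """Match pseudonymous traces to users (sketch).

    observed_traces: list of N obfuscated traces, one per pseudonym.
    profiles: list of N user mobility profiles.
    hmm_log_likelihood(trace, profile): log Pr(observed trace | user profile),
        assumed to be computed with the forward (HMM) algorithm.
    Returns sigma_star[pseudonym index] = user index of the most likely joint assignment.
    """
    N = len(observed_traces)
    log_lik = np.array([[hmm_log_likelihood(observed_traces[i], profiles[u])
                         for u in range(N)] for i in range(N)])
    # maximum-weight bipartite matching between pseudonyms and users
    rows, cols = linear_sum_assignment(-log_lik)
    return dict(zip(rows, cols))
```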

  18. De-obfuscation
  Localization attack: given the most likely assignment σ*, the localization probability can be computed with the Hidden Markov Model's Forward-Backward algorithm: O(R^2 T).
  Tracking attack: given the most likely assignment σ*, the most likely trace of each user can be computed with the Viterbi algorithm: O(R^2 T).
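
A sketch of the tracking attack's Viterbi step for one de-anonymized user, assuming the observation likelihoods obs_lik[t, r] = Pr(observed event at time t | user in region r) have already been derived from the LPPM's obfuscation PDF; the names and the log-space formulation are illustrative.

```python
import numpy as np

def viterbi(P, pi, obs_lik):
    """Most likely trajectory for one user (sketch of the tracking attack).

    P: R x R transition matrix (the user's mobility profile).
    pi: length-R initial distribution over regions.
    obs_lik: T x R matrix of observation likelihoods (assumed given).
    """
    T, R = obs_lik.shape
    delta = np.log(pi + 1e-12) + np.log(obs_lik[0] + 1e-12)
    back = np.zeros((T, R), dtype=int)
    for t in range(1, T):
        scores = delta[:, None] + np.log(P + 1e-12)   # scores[i, j]: best path ending with i -> j
        back[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + np.log(obs_lik[t] + 1e-12)
    # backtrack the most likely trajectory
    trace = [int(delta.argmax())]
    for t in range(T - 1, 0, -1):
        trace.append(int(back[t, trace[-1]]))
    return trace[::-1]
```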

  19. Location-Privacy Metric

  20. Assessment of Inference Attacks
  In an inference attack, the adversary estimates the true value of some random variable X (e.g., the location of a user at a given time instant). Let x_c (unknown to the adversary) be the actual value of X.
  Three properties of the estimation's performance:
  • How accurate is the estimate? Confidence level and confidence interval.
  • How focused is the estimate on a single value? The entropy of the estimated random variable.
  • How close is the estimate to the true value (the real outcome)?

  21. Location-Privacy Metric
  The true outcome of a random variable is what users want to hide from the adversary. Hence, the incorrectness of the adversary's inference attack is the metric that defines the privacy of users.
  Location privacy of user u at time t with respect to the localization attack = incorrectness of the adversary (the expected estimation error):
  LP_u(t) = Σ_x Pr(x | observed traces) · d(x, x_c), where x_c is the user's actual location at time t and d(·, ·) is a distance between locations.
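
A minimal sketch of this metric for the localization attack, assuming the attack outputs a posterior distribution over regions and that a distance matrix between regions is given; the names are illustrative.

```python
import numpy as np

def localization_privacy(posterior, true_region, distance):
    """Expected estimation error of the localization attack (sketch).

    posterior: length-R array, the adversary's Pr(region | observed traces) for user u at time t.
    true_region: index of the actual region x_c (unknown to the adversary).
    distance: R x R matrix of distances d(x, x_c) between regions, e.g., Euclidean
              distance between region centres, or the 0/1 (Hamming) distortion.
    """
    return float(np.dot(posterior, distance[:, true_region]))

# Toy example with 3 regions and the 0/1 distortion:
post = np.array([0.6, 0.3, 0.1])
d = 1 - np.eye(3)
print(localization_privacy(post, true_region=0, distance=d))   # 0.4
```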

  22. Location-Privacy Meter A Tool to Quantify Location Privacy http://lca.epfl.ch/projects/quantifyingprivacy

  23. Location-Privacy Meter (LPM)
  You provide the tool with:
  • Some traces to learn the users' mobility profiles
  • The PDF associated with the protection mechanism
  • Some traces to run the tool on
  LPM provides you with:
  • The location privacy of the users with respect to various attacks: localization, tracking, meeting disclosure, aggregate presence disclosure, ...

  24. LPM: An Example
  CRAWDAD dataset: N = 20 users, R = 40 regions, T = 96 time instants.
  Protection mechanism: anonymization plus location obfuscation, by hiding the location or by precision reduction (dropping low-order bits from the x, y coordinates of the location).
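
A sketch of the precision-reduction obfuscation mentioned above, assuming integer grid coordinates; the parameter dropped_bits is illustrative.

```python
def reduce_precision(x, y, dropped_bits=2):
    """Obfuscate a location by dropping the low-order bits of its grid coordinates.

    With dropped_bits = 2, all cells in each 4 x 4 block of the grid map to the same
    reported coordinates, so the observer only learns the coarse block.
    """
    mask = ~((1 << dropped_bits) - 1)
    return x & mask, y & mask

print(reduce_precision(13, 6))   # (12, 4): both coordinates rounded down to a multiple of 4
```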

  25. LPM: Results – Localization Attack
  [Plot: users' location privacy under the localization attack, with the "no obfuscation" case as the baseline.]

  26. Assessment of Other Metrics
  [Plots: assessment of the k-anonymity and entropy metrics.]

  27. Conclusion
  • A unified formal framework to describe and evaluate a variety of location-privacy preserving mechanisms with respect to various inference attacks
  • Modeling LPPM evaluation as an estimation problem: throw attacks at the LPPM; the right metric is the expected estimation error
  • An object-oriented tool (Location-Privacy Meter) to evaluate/compare location-privacy preserving mechanisms
  http://people.epfl.ch/reza.shokri

  28. Hidden Markov Model
  [Diagram: a grid of numbered regions for Alice, annotated with her mobility-profile probabilities, e.g., P_Alice(11), P_Alice(11 → 6), P_Alice(6 → 14), and the obfuscation probability P_LPPM(6 → {6, 7, 8}).]
