
Quantifying Location Privacy

Quantifying Location Privacy. Reza Shokri, George Theodorakopoulos, Jean-Yves Le Boudec and Jean-Pierre Hubaux. About: this paper was written in 2011 and published at IEEE S&P. It constitutes a significant part of Reza Shokri's PhD thesis.


Presentation Transcript


  1. Quantifying Location Privacy Reza Shokri, George Theodorakopoulos, Jean-Yves Le Boudec and Jean-Pierre Hubaux

  2. About • This paper was written in 2011 and published at IEEE S&P. • It constitutes a significant part of Reza Shokri’s PhD thesis. • It was recognized as the runner-up for the PET Award 2012. • Slides on Prezi: http://prezi.com/w5oogou4ypkl/quantifying-and-protecting-location-privacy/

  3. In this paper • A systematic framework to quantify location privacy protection mechanisms (LPPMs). • Modeling LPPMs • Modeling prior information • Modeling attacks • Claiming that correctness is the right metric for a user’s privacy. • Certainty? • Accuracy? • Correctness? • Implementing a tool: the Location-Privacy Meter. • Assessing the appropriateness of the entropy metric and k-anonymity.

  4. What is Location Privacy • Location privacy has been a concern since the rise of location-based services (LBS). • Examples of LBS: • Examples of concerns:

  5. Location Privacy is IMPORTANT • A bunch of points on a map? • A trace? • It reveals your habits, interests, relationships and more!

  6. How to protect • By an LPPM (location privacy protection mechanism). • Anonymization • Location/path obfuscation • Perturbation: 12 → 19 • Dummies: 12 → {12, 19, 4, 27} • Reducing precision: 14 → teens • Location hiding: 12 → (nothing reported)
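
The obfuscation techniques listed above can be sketched as simple functions over integer region IDs. This is an illustrative sketch, not the paper's implementation; the function names and the uniform random choices are assumptions for the example.

```python
import random

def perturb(loc, num_regions):
    """Perturbation: report a different region instead of the true one (12 -> 19).
    Here we simply pick a random region; a real LPPM would use a noise model."""
    return random.randrange(num_regions)

def add_dummies(loc, num_regions, k=3):
    """Dummies: hide the true region among k fake ones (12 -> {12, 19, 4, 27})."""
    fakes = random.sample([r for r in range(num_regions) if r != loc], k)
    return set([loc] + fakes)

def reduce_precision(loc, granularity=10):
    """Reducing precision: report only a coarse bucket (14 -> 'teens', i.e. 10..19)."""
    lo = (loc // granularity) * granularity
    return range(lo, lo + granularity)

def hide(loc):
    """Location hiding: report nothing at all."""
    return None
```

Each function trades service quality for privacy differently: dummies keep the true location in the reported set, while perturbation may drop it entirely.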

  7. How to attack • Where is s/he at this time? → presence/absence disclosure attack • Track him/her! • Who meets whom at a given place/time? → meeting disclosure attack

  8. Big question: How to model? • A framework ⟨U, A, LPPM, O, ADV, METRIC⟩ • U: set of mobile users • A: set of possible actual traces • LPPM: location privacy protection mechanism • O: set of observed traces • ADV: adversary • METRIC: evaluation metric

  9. Elements of Framework

  10. Modeling Users: event and trace • A user u in U moves within a set of regions R over time instants T • U – R – T • An event is defined as a triple ⟨u, r, t⟩ • A trace of user u is a T-size vector of events: a_u = (a_u(1), …, a_u(T)) • A_u: the set of all traces that may belong to user u • A_u1 x A_u2 x … x A_uN: the set of all possible traces of all users
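
The event/trace model above can be written down directly as a data structure. A minimal sketch, assuming string user IDs and integer region IDs (the class and helper names are illustrative, not from the paper):

```python
from dataclasses import dataclass
from typing import List

@dataclass(frozen=True)
class Event:
    """An event is a triple <user, region, time>."""
    user: str
    region: int
    time: int

def make_trace(user: str, regions: List[int]) -> List[Event]:
    """A trace of user u is a T-size vector of events, one per time instant 1..T."""
    return [Event(user, r, t) for t, r in enumerate(regions, start=1)]

# Example: user u1 visits regions 12, 19, 4 at times 1, 2, 3.
trace = make_trace("u1", [12, 19, 4])
```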

  11. Modeling LPPM • An LPPM receives a set of N actual traces, one for each user, and modifies them in two steps: • Obfuscation process: replaces locations with location pseudonyms • An obfuscation mechanism maps each location to an obfuscated one, possibly at random • Anonymization process: replaces user names with user pseudonyms • In this paper: a random permutation over the set of users • LPPM: the composition of the two steps (obfuscation, then anonymization)
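
The two-step pipeline can be sketched as follows. This is a simplified illustration: `obfuscate` stands in for any per-location obfuscation mechanism, and anonymization is the random permutation mentioned on the slide; the integer pseudonyms are an assumption of this sketch.

```python
import random

def lppm(actual_traces, obfuscate):
    """Apply an LPPM to a dict {user_name: trace}:
    1. obfuscation: replace each location with a location pseudonym,
    2. anonymization: replace user names with a random permutation of pseudonyms."""
    users = list(actual_traces)
    pseudonyms = list(range(1, len(users) + 1))
    random.shuffle(pseudonyms)  # random permutation over the set of users
    return {nym: [obfuscate(loc) for loc in actual_traces[u]]
            for u, nym in zip(users, pseudonyms)}

# Usage with the identity obfuscation (no location change, names still hidden):
observed = lppm({"alice": [1, 2], "bob": [3, 4]}, lambda loc: loc)
```

The adversary sees `observed` only: the traces survive (possibly obfuscated), but which pseudonym belongs to which user is hidden by the permutation.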

  12. LPPM Attacker

  13. Modeling adversary • An adversary is characterized by • Knowledge • Attack • Knowledge • Training traces (incomplete and/or noisy) • Public information • Observable traces released by the LPPM • Attack • Presence/absence disclosure attack • Meeting disclosure attack • See details in III.B, III.C

  14. Which metric for user privacy? • The answer to an attack is not deterministic but probabilistic, due to the randomness of the LPPM. • The result of an attack is a distribution, so we can measure: • Uncertainty of the distribution • Accuracy of each element of the distribution • Correctness: distance between the truth and the attacker’s estimate.

  15. Which metric for user privacy? • Uncertainty: • If the distribution is (nearly) uniform over its elements • Uncertainty is high / certainty is low. • Prob. distribution <0.33, 0.33, 0.33>

  16. Which metric for user privacy? • Uncertainty: • If one element stands out • Uncertainty is low / certainty is high. • Prob. distribution <0.1, 0.1, 0.8>

  17. Which metric for user privacy? • Accuracy is quantified with a confidence interval and a confidence level.

  18. Which metric for user privacy? • Correctness is what users really care about. • User: I do not care about your uncertainty. • Even if I stand out in your distribution (high certainty), you may not get me right. (E.g., I am at Starbucks while you say I am at McDonald’s.) • User: I do not care about your accuracy. • Even if you are accurate enough, you may still not get me right. • User: what I care about is whether you get me right, i.e., correctness.
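
The distinction between uncertainty and correctness can be made concrete with a short sketch: uncertainty as the Shannon entropy of the attacker's posterior, and correctness as the expected distance between the attacker's estimate and the truth (the function names and the 0/1 distance are illustrative assumptions, not the paper's code).

```python
import math

def uncertainty(dist):
    """Shannon entropy of the attacker's posterior distribution.
    High for <0.33, 0.33, 0.33>, low for <0.1, 0.1, 0.8>."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def correctness_error(dist, truth, distance):
    """Expected distance between the attacker's estimate and the truth;
    the larger this expected error, the higher the user's privacy."""
    return sum(p * distance(x, truth) for x, p in dist.items())

# The slide's example: the attacker is very certain I am at McDonald's (0.8),
# but the truth is Starbucks — low uncertainty, yet also low correctness.
posterior = {"starbucks": 0.1, "mcd": 0.8, "home": 0.1}
err = correctness_error(posterior, "starbucks", lambda x, y: 0 if x == y else 1)
```

Here `err` is 0.9 under the 0/1 distance: despite high certainty, the attacker is expected to be wrong 90% of the time, which is exactly why correctness, not certainty, measures the user's privacy.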

  19. Uncertainty, accuracy and correctness

  20. So far • Framework for LPPM evaluation • Modeling LPPM • Modeling prior information (not covered yet) • Modeling attacks (not covered yet) • The right metric • Certainty • Accuracy • Correctness • Implementing a tool: the Location-Privacy Meter. • Assessing the entropy metric and k-anonymity.

  21. Implementing Location-Privacy Meter • Modeling prior information • A user’s mobility can be modeled as a Markov chain (MC). • So we need the transition matrix of the MC. • The mobility profile is this transition matrix. • We need to estimate it!
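
As a toy illustration of what a mobility profile is, the transition matrix can be estimated from training traces by counting transitions and normalizing rows. Note this simple counting estimator is an assumption of the sketch; the paper itself uses a Bayesian (Gibbs-sampling) procedure to cope with incomplete and noisy traces.

```python
from collections import defaultdict

def mobility_profile(training_traces, regions):
    """Estimate a user's Markov-chain transition matrix from training traces:
    count observed transitions r -> r', then normalize each row.
    Rows with no observations fall back to a uniform distribution."""
    counts = defaultdict(lambda: defaultdict(int))
    for trace in training_traces:
        for r, r_next in zip(trace, trace[1:]):
            counts[r][r_next] += 1
    profile = {}
    for r in regions:
        total = sum(counts[r].values())
        profile[r] = {r2: (counts[r][r2] / total if total else 1 / len(regions))
                      for r2 in regions}
    return profile
```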

  22. Approaching

  23. Approaching, statistical details • Gibbs sampling to estimate the transition matrix • Iterative procedure • Dirichlet prior for each row of the transition matrix • Iterative procedure for ET

  24. Implementing Location-Privacy Meter • Modeling attacks • Maximum likelihood attack: finds the jointly most likely traces for all users • Distribution tracking attack: computes the distribution of traces for each user

  25. Attack, algorithm details • Maximum Weight Assignment (MWA) • Hungarian algorithm • Viterbi algorithm • Metropolis-Hastings (MH) algorithm
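
As a sketch of the Viterbi step used in the maximum likelihood attack: given an observed (obfuscated) trace, the user's Markov mobility profile, and the LPPM's obfuscation probabilities, it recovers the most likely actual trace. The parameter names (`start`, `trans`, `emit`) are illustrative assumptions, not the paper's notation.

```python
def viterbi(obs, regions, start, trans, emit):
    """Most likely actual trace given an observed trace `obs`.
    start[r]: initial probability of region r
    trans[r][r2]: mobility-profile transition probability r -> r2
    emit[r][o]: probability that the LPPM obfuscates region r into observation o"""
    # best[r] = (probability of the best path ending in r, that path)
    best = {r: (start[r] * emit[r][obs[0]], [r]) for r in regions}
    for o in obs[1:]:
        best = {
            r: max(((p * trans[prev][r] * emit[r][o], path + [r])
                    for prev, (p, path) in best.items()),
                   key=lambda x: x[0])
            for r in regions
        }
    return max(best.values(), key=lambda x: x[0])[1]
```

With a "sticky" mobility profile, Viterbi can overrule a noisy observation: a single stray reading is explained as obfuscation noise rather than as an actual move.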

  26. Assessing Entropy metric and k-anonymity

  27. Assessing Entropy metric and k-anonymity

  28. Questions?
