Presentation Transcript

Quantifying Location Privacy

Reza Shokri, George Theodorakopoulos, Jean-Yves Le Boudec and Jean-Pierre Hubaux

About
  • This paper was published at the 2011 IEEE Symposium on Security and Privacy (S&P).
  • It constitutes a significant part of Reza Shokri’s PhD thesis.
  • It was the runner-up for the 2012 PET Award.
  • Slides on Prezi: http://prezi.com/w5oogou4ypkl/quantifying-and-protecting-location-privacy/
In this paper
  • A systematic framework to quantify location-privacy protection mechanisms (LPPMs).
    • Modeling LPPM
    • Modeling prior information
    • Modeling attacks
  • Arguing that correctness is the right metric for the user’s privacy.
    • Certainty?
    • Accuracy?
    • Correctness?
  • Implementing a tool: Location-Privacy Meter.
  • Assessing the appropriateness of the entropy metric and k-anonymity.
What is Location Privacy?
  • Location privacy has become a concern with the rise of location-based services (LBS).
    • Examples of LBS:
    • Examples of concerns:
Location Privacy is IMPORTANT
  • A bunch of points on a map?
  • A trace?
  • It reveals your habits, interests, relationships, and more!
How to protect?
  • With an LPPM (location-privacy protection mechanism).
    • Anonymization
    • Location/path obfuscation, for example (see the sketch after this list):
    • Perturbation: 12 → 19
    • Dummy locations: 12 → {12, 19, 4, 27}
    • Reducing precision: 14 → Teens
    • Location hiding: 12 → (nothing reported)
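Below is a minimal Python sketch of the obfuscation-style transformations listed above, using hypothetical integer region identifiers. The paper models LPPMs as probabilistic functions over whole traces, so this only illustrates the per-location idea.

```python
import random

# Hypothetical setup: regions are the integers 1..30 and "12" is the true location.
REGIONS = list(range(1, 31))

def perturb(r, radius=8):
    """Perturbation: report a different nearby region instead of the true one."""
    return random.choice([s for s in REGIONS if s != r and abs(s - r) <= radius])

def add_dummies(r, k=3):
    """Dummy locations: hide the true region among k fake ones."""
    return {r, *random.sample([s for s in REGIONS if s != r], k)}

def reduce_precision(r, cell_size=10):
    """Reducing precision: report only a coarse cell containing the region."""
    return f"cell_{r // cell_size}"

def hide(r, p_hide=0.5):
    """Location hiding: with some probability, report nothing at all."""
    return None if random.random() < p_hide else r

print(perturb(12), add_dummies(12), reduce_precision(14), hide(12))
```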
How to attack?
  • Where is s/he at a given time? Track him/her!
    • Presence/absence disclosure attacks
  • Who meets whom at a given place and time?
    • Meeting disclosure attacks
Big question: How to model?
  • A framework with the following components:
    • U: set of mobile users
    • A: set of possible actual traces
    • LPPM: location-privacy protection mechanism
    • O: set of observed traces
    • ADV: adversary
    • METRIC: evaluation metric

Modeling Users: event and trace
  • A user u (from the set of users U) moves within a set of regions R during a time period T.
  • An event is defined as a triple <u, r, t>, with u in U, r in R, t in T.
  • A trace of user u is a T-size vector of events.
  • A_u: the set of all traces that may belong to user u.
  • A = A_u1 × A_u2 × … × A_uN: the set of all possible traces of all users.
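As a concrete reading of these definitions, here is a small Python sketch of events and traces; the field and function names are mine, only the <u, r, t> structure comes from the slide.

```python
from dataclasses import dataclass
from typing import List

@dataclass(frozen=True)
class Event:
    """An event is the triple <u, r, t>: user u is in region r at time t."""
    user: int
    region: int
    time: int

# A trace of user u is a T-size vector of events, one event per time instant.
Trace = List[Event]

def make_trace(user: int, regions: List[int]) -> Trace:
    """Build a trace from the sequence of regions the user visits."""
    return [Event(user, r, t) for t, r in enumerate(regions)]

trace_u1 = make_trace(user=1, regions=[12, 12, 19, 4])  # hypothetical 4-step trace
```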
Modeling LPPM
  • An LPPM receives a set of N actual traces, one for each user, and modifies them in two steps:
    • Obfuscation process: each location is replaced by a location pseudonym.
      • An obfuscation mechanism maps each actual trace to an obfuscated trace, probabilistically.
    • Anonymization process: each user name is replaced by a user pseudonym.
      • In this paper: a random permutation of the users.
  • LPPM: the composition of the two steps, mapping actual traces to observed traces.
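A toy Python sketch of the two steps, assuming traces are lists of (user, region, time) tuples. The anonymization below is the random permutation the slide names; the obfuscation (true region hidden inside a small set of pseudonyms) is just one placeholder choice, not the paper's specific mechanism.

```python
import random

def obfuscate_trace(trace, regions, k=2):
    """Obfuscation: replace each region with a location pseudonym, here a set
    containing the true region plus k random decoys (one possible choice)."""
    obfuscated = []
    for (user, r, t) in trace:
        pseudonym = frozenset([r] + random.sample([s for s in regions if s != r], k))
        obfuscated.append((user, pseudonym, t))
    return obfuscated

def anonymize(traces_by_user):
    """Anonymization: rename users via a random permutation, as in the paper."""
    users = list(traces_by_user)
    pseudonym_of = dict(zip(users, random.sample(users, len(users))))
    return {pseudonym_of[u]: [(pseudonym_of[u], r, t) for (_, r, t) in trace]
            for u, trace in traces_by_user.items()}

# Hypothetical usage: two users, traces as (user, region, time) tuples.
regions = list(range(1, 31))
actual = {1: [(1, 12, 0), (1, 19, 1)], 2: [(2, 4, 0), (2, 4, 1)]}
observed = anonymize({u: obfuscate_trace(tr, regions) for u, tr in actual.items()})
```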
[Slide 12: diagram showing actual traces transformed by the LPPM into observed traces, which reach the attacker.]
Modeling adversary
  • An adversary is characterized by
    • Knowledge
    • Attack
  • Knowledge
    • Training traces (incomplete and/or noisy)
    • Public information
    • Observable traces released by LPPM
  • Attack
    • Presence/absence disclosure attack
    • Meeting disclosure attack
  • See Sections III.B and III.C of the paper for details.


Which metric for user privacy?
  • The answer of an attack is not deterministic but probabilistic, because of the randomness in the LPPM.
  • Since the result of an attack is a distribution, three candidate metrics arise:
    • Uncertainty of the distribution
    • Accuracy of each element of the distribution
    • Correctness: the distance between the truth and the attacker’s estimate
Which metric for user privacy?
  • Uncertainty:
    • If the distribution is close to uniform, every element looks equally likely.
      • Uncertainty is high / certainty is low.
    • Probability distribution: <0.33, 0.33, 0.33>
Which metric for user privacy?
  • Uncertainty:
    • If one element stands out,
      • uncertainty is low / certainty is high.
    • Probability distribution: <0.1, 0.1, 0.8>
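To make the two examples concrete, Shannon entropy is one standard way to quantify this uncertainty (a near-uniform distribution has high entropy, a peaked one low entropy); this is a generic sketch, not necessarily the paper's exact formulation.

```python
import math

def entropy(dist):
    """Shannon entropy (in bits) of a probability distribution."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

print(entropy([1/3, 1/3, 1/3]))  # ~1.58 bits: high uncertainty (uniform)
print(entropy([0.1, 0.1, 0.8]))  # ~0.92 bits: lower uncertainty (one element stands out)
```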
Which metric for user privacy?
  • Accuracy is quantified with a confidence interval and a confidence level.
Which metric for user privacy?
  • Correctness is what users really care about.
    • User: I do not care about your uncertainty.
      • Even if I stand out in your distribution (high certainty), you may not get me right (e.g., I am at Starbucks while you say I am at McDonald’s).
    • User: I do not care about your accuracy.
      • Even if you are accurate enough, you may still not get me right.
    • User: what I care about is whether you get me right, i.e., correctness (see the sketch after this list).
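A minimal sketch of the correctness idea from this slide: the expected distance between the attacker's estimated distribution and the true answer, here with a simple 0/1 distance as a stand-in for a general distance function.

```python
def expected_error(estimate, truth):
    """Expected distance between the attacker's estimate and the truth.
    With a 0/1 distance this is the probability mass on wrong answers,
    i.e., how incorrect the attacker is (higher = more privacy)."""
    return sum(p for answer, p in estimate.items() if answer != truth)

# The attacker is quite certain I am at McDonald's, but I am at Starbucks:
estimate = {"McDonalds": 0.8, "Starbucks": 0.1, "Home": 0.1}
print(expected_error(estimate, truth="Starbucks"))  # 0.9: low uncertainty, yet low correctness
```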
So far
  • Framework for LPPMs
    • Modeling the LPPM (obfuscation + anonymization)
    • Modeling prior information (not covered yet)
    • Modeling attacks (not covered yet)
  • The right metric
    • Certainty
    • Accuracy
    • Correctness
  • Implementing a tool: the Location-Privacy Meter.
  • Assessing the entropy metric and k-anonymity.
Implementing Location-Privacy Meter
  • Modeling prior information
    • A user’s mobility can be modeled as a Markov chain (MC) over the regions.
    • So we need the transition matrix of the MC.
    • This transition matrix is the user’s mobility profile.
    • We need to estimate it for each user (see the sketch after this list)!
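A minimal Python sketch of the simplest way to build such a transition matrix from a training trace: count transitions and normalize rows, with an additive pseudo-count so unseen transitions keep nonzero probability. The paper's actual estimation is the Gibbs-sampling procedure on the next slide; this is only a baseline for intuition.

```python
import numpy as np

def estimate_mobility_profile(trace, num_regions, pseudo_count=1.0):
    """Estimate a transition matrix from a training trace (a list of region
    indices): count region-to-region transitions and normalize each row.
    The additive pseudo_count keeps unseen transitions at nonzero probability."""
    counts = np.full((num_regions, num_regions), pseudo_count)
    for r_from, r_to in zip(trace[:-1], trace[1:]):
        counts[r_from, r_to] += 1
    return counts / counts.sum(axis=1, keepdims=True)

# Hypothetical training trace over 4 regions (indexed 0..3):
P = estimate_mobility_profile([0, 1, 1, 2, 0, 1, 3], num_regions=4)
```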
Approaching the mobility profile: statistics details
  • Gibbs sampling for the transition matrix
  • Iterative procedure
  • Dirichlet prior for each row of the transition matrix (see the sketch after this list)
  • Iterative procedure for ET
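A sketch of one building block of that procedure: with a Dirichlet prior on each row, the posterior of a row given its transition counts is again Dirichlet, so a Gibbs-style step can resample the whole matrix. Resampling the incomplete/noisy training traces, the other half of the iteration, is omitted here.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_transition_matrix(transition_counts, alpha=1.0):
    """With a Dirichlet(alpha) prior on each row, the posterior of a row given
    its transition counts is Dirichlet(alpha + counts), so each row can be
    resampled directly inside a Gibbs iteration."""
    return np.vstack([rng.dirichlet(alpha + row) for row in transition_counts])

# Hypothetical transition counts for 3 regions:
counts = np.array([[5.0, 1.0, 0.0], [2.0, 2.0, 2.0], [0.0, 1.0, 7.0]])
P_sample = sample_transition_matrix(counts)
```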
Implementing Location-Privacy Meter
  • Modeling attacks
    • Maximum likelihood attack
      • Finds the jointly most likely traces for all users.
    • Distribution tracking attack
      • Computes the distribution of traces for each user.
Attack: algorithm details
  • Maximum Weight Assignment (MWA)
  • Hungarian algorithm
  • Viterbi algorithm (see the sketch after this list)
  • Metropolis-Hastings (MH) algorithm
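Of the algorithms listed above, the Viterbi step is the easiest to sketch: given a mobility profile (transition matrix) and per-time-step observation likelihoods for the obfuscated trace, it recovers the most likely sequence of actual regions. This is a generic hidden-Markov-model Viterbi in Python, not the paper's exact implementation.

```python
import numpy as np

def viterbi(obs_loglik, trans, init):
    """Most likely sequence of hidden regions for one user.
    obs_loglik: T x R array of log P(observed pseudonym at t | actual region r)
    trans: R x R mobility profile (strictly positive entries assumed)
    init: length-R initial distribution over regions."""
    T, R = obs_loglik.shape
    log_trans = np.log(trans)
    score = np.log(init) + obs_loglik[0]          # best log-score ending in each region
    back = np.zeros((T, R), dtype=int)            # back-pointers for path recovery
    for t in range(1, T):
        cand = score[:, None] + log_trans         # cand[i, j]: come from region i, go to j
        back[t] = cand.argmax(axis=0)
        score = cand.max(axis=0) + obs_loglik[t]
    path = [int(score.argmax())]
    for t in range(T - 1, 0, -1):                 # follow back-pointers to the start
        path.append(int(back[t, path[-1]]))
    return path[::-1]
```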