
Quantifying Location Privacy

Reza Shokri, George Theodorakopoulos, Jean-Yves Le Boudec and Jean-Pierre Hubaux



  • This paper was written in 2011 and published at IEEE S&P.

  • It forms a significant part of Reza Shokri’s PhD thesis.

  • It was the runner-up for the 2012 PET Award.

  • Slides on Prezi: http://prezi.com/w5oogou4ypkl/quantifying-and-protecting-location-privacy/

In this paper

  • A systematic framework to quantify location-privacy protection mechanisms (LPPMs).

    • Modeling LPPM

    • Modeling prior information

    • Modeling attacks

  • Arguing that correctness is the right metric for the user’s privacy.

    • Certainty?

    • Accuracy?

    • Correctness?

  • Implementing a tool: Location-Privacy Meter.

  • Assessing the appropriateness of the entropy metric and k-anonymity.

What is Location Privacy

  • Location privacy has become a concern with the rise of location-based services (LBS).

    • Examples of LBS:

    • Examples of concerns:

Location Privacy is IMPORTANT

  • A bunch of points on a map?

  • A trace?

  • It reveals your habits, interests, relationships and more!

How to protect

  • By an LPPM (location-privacy protection mechanism).

    • Anonymization

    • Location/path obfuscation

    • Perturbation: 12 → 19

    • Dummy: 12 → {12, 19, 4, 27}

    • Reducing precision: 14 → Teens

    • Location hiding: 12 → (nothing reported)
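The obfuscation mechanisms named above can be sketched as follows. This is an illustrative sketch, not the paper's algorithms: the region IDs, function names, and parameters are all made up for the example.

```python
import random

def perturb(region, all_regions, rng):
    """Perturbation: replace the true region with a different one (12 -> 19)."""
    return rng.choice([r for r in all_regions if r != region])

def add_dummies(region, all_regions, k, rng):
    """Dummy: report the true region hidden among k-1 decoys (12 -> {12, 19, 4, 27})."""
    decoys = rng.sample([r for r in all_regions if r != region], k - 1)
    return set(decoys) | {region}

def hide(region, p_hide, rng):
    """Location hiding: suppress the event with probability p_hide."""
    return None if rng.random() < p_hide else region

rng = random.Random(0)
regions = list(range(1, 31))
print(perturb(12, regions, rng))
print(add_dummies(12, regions, 4, rng))
```

Each mechanism trades utility for privacy differently: perturbation keeps one (wrong) location, dummies keep the truth but dilute it, and hiding removes the event entirely.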

How to attack

  • Where is s/he at this time?

  • Track him/her!

  • Who meets whom at a given place/time?

Presence/absence disclosure attack

Meeting disclosure attack

Big question: How to model?

  • A framework: (U, A, LPPM, O, ADV, METRIC)

    U: set of mobile users

    A: set of possible actual traces

    LPPM: location-privacy protection mechanism

    O: set of observed traces

    ADV: adversary

    METRIC: evaluation metric

Elements of Framework

Modeling Users: event and trace

  • A user u moves within a region r at a time instant t.

  • U, R, T: the sets of users, regions, and time instants.

  • An event is defined as a triple (u, r, t).

  • A trace of user u is a T-size vector of events.

  • A_u: the set of all traces that may belong to user u.

  • A_u1 × A_u2 × … × A_uN: the set of all possible traces of all users.
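A minimal sketch of this event/trace model; the class and field names are my own, not the paper's notation.

```python
from dataclasses import dataclass
from typing import List

@dataclass(frozen=True)
class Event:
    """An event is a triple (user, region, time)."""
    user: str
    region: int
    time: int

def make_trace(user: str, regions: List[int]) -> List[Event]:
    """A trace of a user is a T-size vector of events, one per time instant."""
    return [Event(user, r, t) for t, r in enumerate(regions)]

trace = make_trace("u1", [12, 12, 19, 4])
print(len(trace), trace[2])
```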

Modeling LPPM

  • An LPPM receives a set of N actual traces, one per user, and modifies them in two steps:

    • Obfuscation process: replaces locations with location pseudonyms.

      • An obfuscation mechanism maps each actual trace to an obfuscated trace.

    • Anonymization process: replaces user names with user pseudonyms.

      • In this paper: a random permutation.

  • LPPM: the composition of the obfuscation and anonymization processes.
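The two-step pipeline can be sketched as below. The obfuscation step here is a placeholder identity function (a real LPPM would perturb or hide locations); only the random-permutation anonymization from the paper is actually implemented.

```python
import random

def obfuscate(trace):
    """Placeholder obfuscation: returns the trace unchanged."""
    return list(trace)

def anonymize(traces_by_user, rng):
    """Replace user names with pseudonyms 1..N chosen by a random permutation."""
    users = sorted(traces_by_user)
    nyms = list(range(1, len(users) + 1))
    rng.shuffle(nyms)
    return {nym: obfuscate(traces_by_user[u]) for u, nym in zip(users, nyms)}

rng = random.Random(0)
observed = anonymize({"alice": [12, 19], "bob": [4, 27]}, rng)
print(observed)
```

The adversary sees only the pseudonym-keyed output; linking pseudonyms back to users is exactly the de-anonymization attack discussed later.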


Modeling adversary

  • An adversary is characterized by

    • Knowledge

    • Attack

  • Knowledge

    • Training traces (incomplete and/or noisy)

    • Public information

    • Observable traces released by LPPM

  • Attack

    • Presence/absence disclosure attack

    • Meeting disclosure attack

  • See Sections III.B and III.C of the paper for details.


Which metric for user privacy?

  • The output of an attack is probabilistic rather than deterministic, because the LPPM is randomized.

  • Since the attack yields a probability distribution, we can measure:

    • Uncertainty of the distribution

    • Accuracy of each element of the distribution

    • Correctness: the distance between the truth and the attacker’s estimate.

Which metric for user privacy?

  • Uncertainty:

    • If the distribution is close to uniform, no element stands out.

      • Uncertainty is high / certainty is low.

    • Prob. distribution: <0.33, 0.33, 0.33>

Which metric for user privacy?

  • Uncertainty:

    • If one element stands out,

      • Uncertainty is low / certainty is high.

    • Prob. distribution: <0.1, 0.1, 0.8>
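The uncertainty in the two example distributions above can be quantified with Shannon entropy, a standard choice for this notion of uncertainty:

```python
import math

def entropy(dist):
    """Shannon entropy (bits) of a probability distribution."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

uniform = [1/3, 1/3, 1/3]   # nothing stands out
peaked = [0.1, 0.1, 0.8]    # one element stands out
print(round(entropy(uniform), 3))  # high uncertainty
print(round(entropy(peaked), 3))   # low uncertainty
```

The uniform distribution attains the maximum entropy log2(3) ≈ 1.585 bits, while the peaked one is well below it.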

Which metric for user privacy?

  • Accuracy is quantified by a confidence interval at a given confidence level.
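As an illustration only (the paper's exact construction may differ), a normal-approximation confidence interval for a probability estimated from samples shows how interval width captures accuracy:

```python
import math

def confidence_interval(p_hat, n, z=1.96):
    """Approximate 95% CI for a probability estimated from n samples (z = 1.96)."""
    half = z * math.sqrt(p_hat * (1 - p_hat) / n)
    return max(0.0, p_hat - half), min(1.0, p_hat + half)

lo, hi = confidence_interval(0.8, 100)
print(round(lo, 3), round(hi, 3))
```

More samples shrink the interval at the same confidence level, i.e., the attacker's estimate becomes more accurate.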

Which metric for user privacy?

  • Correctness is what users really care about.

    • User: I do not care about your uncertainty.

      • Even if I stand out in your distribution (high certainty), you may not get me right. (E.g., I am at Starbucks while you say I am at McDonald’s.)

    • User: I do not care about your accuracy.

      • Even if you are accurate enough, you may still not get me right.

    • User: What I care about is whether you get me right, i.e., correctness.
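Correctness can be sketched as the expected distance between the attacker's posterior and the true location. The place names and distances below are a made-up example echoing the Starbucks/McDonald's point above.

```python
def expected_error(posterior, distance_to_truth):
    """Expected distance between the attacker's estimate and the truth.

    posterior: {location: probability}; distance_to_truth: {location: distance}.
    """
    return sum(p * distance_to_truth[loc] for loc, p in posterior.items())

# Attacker is highly certain (0.8 on 'McD') yet wrong: the truth is 'Starbucks'.
posterior = {"Starbucks": 0.1, "Mall": 0.1, "McD": 0.8}
dist = {"Starbucks": 0.0, "Mall": 1.0, "McD": 2.0}
print(expected_error(posterior, dist))  # high error despite low uncertainty
```

High certainty and high error can coexist, which is exactly why correctness, not certainty, measures the user's privacy loss.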

Uncertainty, accuracy and correctness

So far

  • Framework for LPPM

    • Modeling LPPM

    • Modeling prior information (not covered here)

    • Modeling attacks (not covered here)

  • Right Metric

    • Certainty

    • Accuracy

    • Correctness

  • Implementing a tool: Location-Privacy Meter.

  • Assessing the entropy metric and k-anonymity.

Implementing Location-Privacy Meter

  • Modeling prior information

    • A user’s mobility can be modeled as a Markov chain (MC).

    • So we need the transition matrix of the MC.

    • The mobility profile is this transition matrix.

    • We need to estimate it from the adversary’s training traces!
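A simple sketch of estimating the transition matrix from training traces: count observed transitions and normalize, with add-one smoothing standing in for the Dirichlet prior used in the paper.

```python
from collections import defaultdict

def transition_matrix(traces, regions):
    """Estimate a Markov-chain mobility profile from training traces."""
    counts = {r: defaultdict(int) for r in regions}
    for trace in traces:
        for cur, nxt in zip(trace, trace[1:]):
            counts[cur][nxt] += 1
    matrix = {}
    for r in regions:
        total = sum(counts[r][s] + 1 for s in regions)  # add-one smoothing
        matrix[r] = {s: (counts[r][s] + 1) / total for s in regions}
    return matrix

P = transition_matrix([[1, 2, 2, 1], [1, 2, 1]], regions=[1, 2])
print(P[1])  # each row sums to 1
```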



Approaching the mobility profile: statistical details

  • Gibbs sampling for the transition matrix

  • Iterative procedure

  • Dirichlet prior for each row of the transition matrix

  • Iterative procedure for ET
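The Dirichlet step can be sketched by sampling one transition-matrix row from a Dirichlet posterior (prior pseudo-counts plus observed transition counts), using the standard gamma-normalization construction; a full Gibbs sampler would alternate this with resampling the missing parts of the traces.

```python
import random

def sample_dirichlet(alphas, rng):
    """Sample from Dirichlet(alphas) by normalizing independent gamma draws."""
    draws = [rng.gammavariate(a, 1.0) for a in alphas]
    total = sum(draws)
    return [d / total for d in draws]

rng = random.Random(0)
prior = [1.0, 1.0, 1.0]   # uniform Dirichlet prior over 3 regions
counts = [5, 1, 0]        # observed transitions out of one region
row = sample_dirichlet([a + c for a, c in zip(prior, counts)], rng)
print([round(p, 3) for p in row])  # a valid probability row (sums to 1)
```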

Implementing Location-Privacy Meter

  • Modeling attack

    • Maximum likelihood attack

      • to find the jointly most likely traces for all users

    • Distribution tracking attack

      • Computes the distribution of traces for each user

Attack, algorithm details

  • Maximum Weight Assignment (MWA), solved with the Hungarian algorithm, for de-anonymization

  • Viterbi algorithm for finding the most likely trace

  • Metropolis–Hastings (MH) algorithm for sampling from the trace distribution
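The de-anonymization step can be sketched as a search for the user-to-pseudonym assignment that maximizes total log-likelihood. Brute force over permutations is shown here for clarity; the Hungarian algorithm solves the same Maximum Weight Assignment problem in O(n^3). The log-likelihood matrix is a hypothetical toy.

```python
from itertools import permutations

def best_assignment(likelihood):
    """likelihood[u][n]: log-likelihood that pseudonym n belongs to user u."""
    n = len(likelihood)
    best, best_score = None, float("-inf")
    for perm in permutations(range(n)):
        score = sum(likelihood[u][perm[u]] for u in range(n))
        if score > best_score:
            best, best_score = perm, score
    return best, best_score

L = [[-1.0, -5.0],   # user 0 most likely matches pseudonym 0
     [-4.0, -2.0]]   # user 1 most likely matches pseudonym 1
print(best_assignment(L))
```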

Assessing Entropy metric and k-anonymity
