
Location Privacy in Pervasive Computing





Presentation Transcript


  1. Location Privacy in Pervasive Computing Alastair R. Beresford Frank Stajano University of Cambridge Presented by Arcadiy Kantor — CS4440 September 13, 2007

  2. About Me • Fifth-year CS major • Originally from Moscow, Russia, more recently from Alpharetta, GA • CS2200 Teaching Assistant • Opinions Editor, Technique • Highly involved in AIESEC

  3. Historical Perspective • The Fourth Amendment to the U.S. Constitution has been interpreted as protecting a right to privacy. • 1948: Universal Declaration of Human Rights • “Everyone has a right to privacy at home, with family, and in correspondence.” • Privacy on the internet, and privacy in the face of new technologies generally, remains an ongoing issue. • One such issue created by new technology is location privacy.

  4. Location Privacy • The ability to prevent other parties from learning one’s current or past location. • The need is a recent development. • Pervasive computing applications may require certain location information.

  5. The Objective • To protect the privacy of our location information while taking advantage of location-aware services.

  6. Striking the Balance • Location-based applications fall into three categories: • Applications that cannot work without the user’s identity. • Applications that can function completely anonymously. • Applications that cannot be accessed anonymously, but do not require the user’s true identity to function.

  7. Anonymizing Identity: Model • While you trust the service provider and middleware, you do not trust any of the applications. • Therefore, you use the middleware to provide frequently-changing pseudonyms to the applications. • Purpose: Not to establish reputation, but to provide a “return address.”
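The middleware role described above can be sketched in a few lines of Python. This is a toy illustration of the idea, not the authors' implementation; all class and method names are made up:

```python
import uuid

class PseudonymMiddleware:
    """Trusted layer that hides a user's real identity from applications.

    Each (user, application) pair gets a short-lived pseudonym that still
    works as a "return address" for location-aware replies.
    """

    def __init__(self):
        self._pseudonyms = {}   # (real_id, app) -> current pseudonym
        self._reverse = {}      # pseudonym -> real_id, for routing replies

    def pseudonym_for(self, real_id, app):
        """Return the current pseudonym, minting one on first use."""
        key = (real_id, app)
        if key not in self._pseudonyms:
            self.rotate(real_id, app)
        return self._pseudonyms[key]

    def rotate(self, real_id, app):
        """Issue a fresh pseudonym, e.g. when the user enters a mix zone."""
        old = self._pseudonyms.get((real_id, app))
        if old is not None:
            del self._reverse[old]
        fresh = uuid.uuid4().hex
        self._pseudonyms[(real_id, app)] = fresh
        self._reverse[fresh] = real_id
        return fresh

    def deliver(self, pseudonym, message):
        """Route an application's reply back to the real user."""
        return self._reverse[pseudonym], message
```

Because only the middleware holds the pseudonym-to-identity mapping, an application sees a usable return address but never a stable identity.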

  8. The Problem with Pseudonyms • Systems with high resolution • Spatial • Temporal • Can link old and new pseudonyms to one another.

  9. Mix Zones • Mix network • Store-and-forward network used to anonymize communication. • Hostile observers who can monitor all the links in the network cannot match up the sender and the receiver of a message. • Mix zones apply this concept to locations.

  10. A sample mix zone with three application zones. As you enter a mix zone, you are assigned a new pseudonym. The application no longer knows which user is which until you leave the mix zone with a new pseudonym.

  11. Potential Problems • A mix zone’s security strongly depends on the number of users in it. • If you are the only person in the mix zone, it provides zero anonymity. • Users moving in a direction are much more likely to continue moving the same way. • If two application zones are closer to one another than a third, the time of travel through the mix zone can reveal a user’s identity.

  12. Measuring Effectiveness • Two measures • Anonymity set (instant and average values) • Entropy

  13. Anonymity Sets • The group of people visiting a given mix zone at the same time as the user. • A rough determination of the level of privacy. • i.e. a user may not wish to provide location updates to an application unless the anonymity set size is >= 20 people. • Average anonymity set size for current and neighboring mix zones can be used to estimate overall level of location privacy.
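The size check described above can be sketched as follows. This is a toy illustration under assumed inputs (not the Active Bat implementation; function names and the tuple format are made up):

```python
from collections import defaultdict

def anonymity_sets(sightings):
    """Group users seen in the same mix zone during the same time period.

    `sightings` is a list of (user, zone, period) tuples; a user's
    anonymity set is everyone who shared their zone and period.
    """
    sets = defaultdict(set)
    for user, zone, period in sightings:
        sets[(zone, period)].add(user)
    return sets

def may_release_update(zone, period, sightings, k=20):
    """Only reveal location updates if the anonymity set has >= k members."""
    return len(anonymity_sets(sightings)[(zone, period)]) >= k
```

A user agent could run this check locally and simply withhold updates whenever the set is too small.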

  14. Experimental Data • Used an installation of the Active Bat system at AT&T Labs Cambridge. • Each user carries a small “bat” device that provides location updates. • The system can locate bats with less than 3 cm of error up to 95 percent of the time. • Typical update rate: 1-10 times per second. • Approximately 3.4 million samples collected over two weeks were used for the analysis.

  15. Three mix zones defined in a laboratory. Z1: first-floor hallway Z2: first-floor hallway and main corridor Z3: hallway, main corridor, stairwell on all floors.

  16. Anonymity set size for mix zone Z1 Needed an 8-minute update period to provide an anonymity set size of 2.

  17. Anonymity set size for mix zone Z3 Needed only a 15-second update period to reach an anonymity set size of 2. Much better, but still problematic.

  18. Reviewing the Data • Level of privacy provided in the experiment is rather low. • High resolution of the tracking system • Low user population • Mix zones may be significantly more effective for coarser tracking systems, such as those that locate cell phones via the towers they use.

  19. Entropy in User Movement • The anonymity set’s size is only a good measure of anonymity when all members of the set are equally likely to be the one of interest to an observer • i.e., when an observer cannot narrow down the set of users by identifying patterns and trends. • This ideal case corresponds to maximum entropy.

  20. Maximum Entropy: Not Possible • A user moving in a given direction is likely to keep moving in the same direction. • Suppose you define p as the user’s preceding location (location at time t-1) and s as the subsequent location (location at time t+1). • Can create a movement matrix to calculate the probabilities of movement from one zone to another.

  21. The movement matrix M Each element represents the frequency of movements from the preceding zone, p, at time t-1, to the subsequent zone, s, at time t+1.

  22. Calculating Probabilities • Conditional probability of coming out through zone s, given that the user went in through zone p, estimated from the movement counts: P(s|p) = M(p,s) / Σs' M(p,s') • The entropy can then be calculated with Shannon’s formula: h = −Σs P(s|p) log2 P(s|p)
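Both quantities can be computed directly from a movement matrix. A minimal sketch, assuming M is stored as a dict of raw movement counts (the zone names and counts below are hypothetical):

```python
import math

def conditional_probs(M, p):
    """P(s | p): probability of exiting via zone s after entering via p,
    estimated from the raw movement counts in row p of M."""
    total = sum(M[p].values())
    return {s: count / total for s, count in M[p].items()}

def exit_entropy(M, p):
    """Shannon entropy (in bits) of the exit zone, given entry via p."""
    probs = conditional_probs(M, p)
    return -sum(q * math.log2(q) for q in probs.values() if q > 0)

# Hypothetical counts: users entering via zone "a" almost always exit via "b",
# so an observer learns a lot and the entropy is low.
M = {"a": {"a": 1, "b": 97, "c": 2}}
```

Skewed rows like this one yield entropy far below the log2(3) ≈ 1.585 bits that three equally likely exits would give.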

  23. Practical Usefulness • Using the same set of results and the aforementioned formulas, one can calculate the probability of a person’s actions when they enter a zone.

  24. An Example • Suppose two people move into a zone, coming from opposite directions. • Possible outcomes: • Each continues moving in the same direction. • Each turns around. • One turns around; the other keeps moving the same way. • Using the statistics in M, one can calculate that the probability of both users doing a U-turn is 0.1 percent, while the probability of both going straight is 99.9 percent.
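This two-outcome case can be checked numerically. A sketch using the rounded probabilities from the slide (0.001 vs. 0.999), which gives about 0.011 bits, in line with the roughly 0.012 bits the conclusion quotes:

```python
import math

def binary_entropy(p):
    """Entropy in bits of a two-outcome event with probabilities p and 1 - p."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

# Slide's figures: P(both U-turn) = 0.001, P(both continue) = 0.999.
h = binary_entropy(0.001)  # roughly 0.011 bits, far below the 1-bit maximum
```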

  25. Conclusion • The entropy in the preceding example is 0.012 bits. • The maximum possible entropy here is 1 bit. • When a hostile observer can study user behavior over time, the anonymity granted by mix zones and other anonymization methods decreases greatly.

  26. Why Does It Matter? • Half the battle is knowing how private and secure your information is. • Better methods of measuring location privacy allow users to make sound decisions about private data sharing.

  27. Directions for Future Research • Managing application use of pseudonyms. • Reacting to insufficient anonymity. • Improving the models. • Dummy users. • Granularity. • Scalability.

  28. Questions? Note: the link to this paper on the reading list is broken. Rather, you may download the full paper here: http://www.cl.cam.ac.uk/~fms27/papers/2003-BeresfordSta-location.pdf
