
Anonymizing User Location and Profile Information for Privacy-aware Mobile Services



  1. Anonymizing User Location and Profile Information for Privacy-aware Mobile Services Masanori Mano, Yoshiharu Ishikawa Nagoya University

  2. Outline • Background & Motivation • Related Work • System Framework • Matching Degree • Algorithm • Experimental Evaluation • Conclusions and Future work

  3. Background & Motivation

  4. Location-Based Services (LBSs) Where is the nearest café?

  5. Profile-Based LBSs • LBSs typically utilize user locations and map information • Finding nearby restaurants • Presenting a map around the user • Computing the best route to the destination • Use of user profiles (the user's properties) can improve the quality of service • Property- and location-based services • Application areas • Mobile shopping • Mobile advertisements

  6. Example: Mobile Advertisements • Provides local ads to mobile users • Example: Announcements of time-limited sales at nearby shops • Use of user profiles • Properties: age, sex, address, marital status, etc. • Sends selected ads to the appropriate person • Example: {sex: F, age: 28, has_kids: yes} • Cosmetics for women: good • Computers: maybe • Cosmetics for men: bad • Toys for kids: good

  7. Example: Mobile Advertisements Alice came to a shopping mall Mobile Ads Provider Shopping Mall Alice

  8. Example: Mobile Advertisements Alice wanted ads Mobile Ads Provider Shopping Mall Alice

  9. Example: Mobile Advertisements The anonymizer constructs a cloaked region and sends the user's properties Mobile Ads Provider Request with (sex: F, age: 28, …) Cloaked Region

  10. Example: Mobile Advertisements Ads provider returns selected ads for Alice Mobile Ads Provider Alice

  11. Example: Mobile Advertisements Security Camera But, Alice is the only female within the region Mobile Ads Provider Cloaked Region

  12. Example: Mobile Advertisements Security Camera If an adversary obtains this information, he can identify the target user Mobile Ads Provider Get information Identify Adversary

  13. Example Security Camera With this anonymization, the adversary can't identify the user Mobile Ads Provider Can't Identify Adversary

  14. Related Work

  15. Related Work (1) • Techniques for location anonymity are classified into two extreme types [Ling Liu, 2009] • Anonymous location services: Only consider user locations • Identity-driven location services: Also consider user identities • Our method lies between the two extremes, but considers user properties • Another dimension

  16. Related Work (2) • k-anonymity is the most popular approach among proposals for location anonymity • The user's location is indistinguishable from the locations of at least k - 1 other users • Our approach is also based on the concept of k-anonymity • Extended by considering user properties
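The location side of the k-anonymity condition can be sketched as follows; the function name and rectangle encoding are illustrative, not notation from the paper.

```python
def is_k_anonymous(region, target, others, k):
    """True when the axis-aligned rectangle `region` = (x_min, y_min,
    x_max, y_max) covers the target's location plus at least k - 1
    other users' locations; points are (x, y) pairs."""
    def inside(point):
        x, y = point
        x_min, y_min, x_max, y_max = region
        return x_min <= x <= x_max and y_min <= y <= y_max

    return inside(target) and sum(inside(p) for p in others) >= k - 1
```

With k = 3, the region must cover the target and at least two other users, so an observer cannot tell which of the three issued the request.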

  17. Related Work (3) • Various approaches to anonymous location services • Casper [Mokbel+06]: The anonymizer utilizes a grid-based pyramid data structure similar to a quadtree • PrivacyGrid [Bamba+08]: Computes cloaked regions by dynamic cell expansion • XStar [Wang+09]: Targets automobiles on road networks

  18. System Framework

  19. System Architecture (1) • There is a service called the Matchmaker between users and ads providers • Roles of the Matchmaker • Maintains user & ad profiles • Matchmaking: Recommends good ads for a given ad request • Anonymization of locations and user properties

  20. System Architecture (2) • The Matchmaker is a trusted third-party server • Given an ad request, the Matchmaker sends an anonymized request to ads providers • Uses the user's profile/location and ad profiles • Even if some providers are untrusted, the user's privacy is protected

  21. User Profile • Represents the user's properties • k: minimum population • A cloaked region should contain at least k users • l: minimum length • Minimum length of each side of a cloaked region (square) • s: distance threshold • The user wants ads within this distance • Additional attributes (e.g., age and sex) • Value ranges are specified
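A user profile with the parameters above might be represented like this minimal sketch; the class and field names are illustrative, and attribute value ranges are modeled as (low, high) pairs, which is an assumption about the encoding.

```python
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    k: int          # minimum population of the cloaked region
    l: float        # minimum side length of the square cloaked region
    s: float        # distance threshold: the user wants ads within s
    attributes: dict = field(default_factory=dict)  # name -> (low, high) range

# Example: an "Alice"-style profile with an assumed age range.
alice = UserProfile(k=3, l=100.0, s=500.0, attributes={"age": (25, 30)})
```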

  22. Advertisement Profile • Represents the properties of each advertisement • An advertisement that satisfies the following conditions should be sent • The ad area overlaps with the user's requesting area • Other properties (age and sex) match (overlap) the user's properties
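The two conditions above can be sketched as a predicate; the rectangle and range encodings below are assumptions for illustration, not the paper's notation.

```python
def ad_should_be_sent(ad_area, ad_attrs, request_area, user_attrs):
    """True when the ad's area overlaps the user's requesting area and
    every attribute shared by the two profiles has overlapping value
    ranges. Rectangles are (x_min, y_min, x_max, y_max); attribute
    ranges are (low, high) pairs."""
    def rects_overlap(a, b):
        return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

    def ranges_overlap(a, b):
        return a[0] <= b[1] and b[0] <= a[1]

    return rects_overlap(ad_area, request_area) and all(
        ranges_overlap(ad_attrs[name], user_attrs[name])
        for name in ad_attrs.keys() & user_attrs.keys())
```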

  23. Matching Degree

  24. Motivation: Bad Anonymization • The cloaked region contains aged/young and male/female users • The properties of the region are vague • The ads provider has a cosmetics ad for women • The ads provider may have a question: Is it worthwhile to send the ad? Ads provider Age: young to aged, Sex: * (all) ?

  25. Motivating Example: Good Anonymization • Good anonymization means that the users in the cloaked region have properties similar to the target user's • Matching degree is introduced as a similarity measure Bad Anonymization Good Anonymization different sex different age similar sex and age

  26. Matching Degree • A matching degree is computed as the overlapped area of attribute value ranges • Range: [0, 1] • Treated as if it were a probability value (Figure: overlapped area between the target user's and another user's attribute values; matching degrees are defined for spatial attributes and for interval attributes)
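For an interval attribute, one way to realize the "overlapped area in [0, 1]" idea is to normalize the overlap length by the length of the target user's range. This normalization is an assumption: the slides only state that the degree lies in [0, 1] and is treated like a probability, but it makes the degree asymmetric in target and compared user, consistent with the examples on the next slide. Multiplying per-attribute degrees is likewise a modeling assumption.

```python
def interval_match(target_range, other_range):
    """Matching degree of one interval attribute: overlap length divided
    by the length of the target user's range (assumed normalization)."""
    t_lo, t_hi = target_range
    o_lo, o_hi = other_range
    overlap = max(0.0, min(t_hi, o_hi) - max(t_lo, o_lo))
    if t_hi == t_lo:  # degenerate point value: match iff contained
        return 1.0 if o_lo <= t_lo <= o_hi else 0.0
    return overlap / (t_hi - t_lo)

def matching_degree(target_attrs, other_attrs):
    """Overall degree: per-attribute degrees multiplied together,
    treating them as independent probabilities."""
    degree = 1.0
    for name, t_range in target_attrs.items():
        degree *= interval_match(t_range, other_attrs.get(name, t_range))
    return degree
```

Note the asymmetry: if the other user's range strictly contains the target's, the degree is 1.0 in one direction but below 1.0 in the other.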

  27. Matching Degree Examples (relative to the target user's attribute range) • Target user Dave, compared user Alice: match = 0.0 • Target user Bob, compared user Alice: match = 1.0 • Target user Alice, compared user Bob: match = 0.5

  28. Anonymization Algorithm

  29. Anonymity Conditions • The cloaked region contains the target user • The region contains at least k - 1 other users • The length of each side of the region is longer than l • The matching degrees between the target user and the k - 1 users are above a certain threshold value
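The four conditions above can be checked together as in this sketch; all names are illustrative, and matching degrees are assumed to be precomputed per user.

```python
def satisfies_anonymity(region, target_pos, others, k, l, threshold):
    """region = (x_min, y_min, x_max, y_max); target_pos = (x, y);
    others = list of ((x, y), degree) pairs, where degree is the
    matching degree between the target user and that user."""
    x_min, y_min, x_max, y_max = region

    def inside(pos):
        return x_min <= pos[0] <= x_max and y_min <= pos[1] <= y_max

    if x_max - x_min < l or y_max - y_min < l:   # each side at least l
        return False
    if not inside(target_pos):                   # region covers the target
        return False
    similar = sum(1 for pos, degree in others    # similar users in region
                  if inside(pos) and degree >= threshold)
    return similar >= k - 1                      # at least k - 1 of them
```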

  30. Anonymization Process • Consider a rectangular region centered at the target user • Randomly select one user as a seed from the users within the region • Compute a rectangle around the seed • If the rectangle contains at least k users with good matching degrees, anonymization is complete
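The randomized process above might look like the following sketch (assumed simplifications: the seed's rectangle has the same fixed side length as the initial one, matching degrees are precomputed, and a bounded number of seeds is tried).

```python
import random

def anonymize(target_pos, users, k, side, threshold, max_tries=20):
    """users: list of ((x, y), degree) pairs. Repeatedly pick a random
    seed from the square of side `side` centered at the target, build
    the same-sized square around the seed, and return it once it covers
    the target plus enough users with degree >= threshold."""
    def rect_around(center):
        cx, cy = center
        h = side / 2.0
        return (cx - h, cy - h, cx + h, cy + h)

    def inside(rect, pos):
        return rect[0] <= pos[0] <= rect[2] and rect[1] <= pos[1] <= rect[3]

    initial = rect_around(target_pos)
    candidates = [pos for pos, _ in users if inside(initial, pos)]
    for _ in range(max_tries):
        if not candidates:
            break
        seed = random.choice(candidates)
        region = rect_around(seed)
        similar = sum(1 for pos, deg in users
                      if inside(region, pos) and deg >= threshold)
        if inside(region, target_pos) and similar + 1 >= k:  # +1: the target
            return region
    return None  # anonymization failed
```

Using the seed's rectangle rather than one centered on the target keeps the target's exact position from being inferable as the region's center.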

  31. Anonymization Example • Alice requests ads • k = 3 • Threshold for matching degree = 0.5 Alice Joe Mike Kent Dave Mary

  32. Anonymization Example • Alice is a young woman • match = 1.0 • Mary is also a young woman • match = 1.0 • Kent is a young man • match = 0.5 • Joe is an aged man • match = 0.0 • Dave and Mike are middle-aged men • match = 0.2

  33. Anonymization Example • A region centered at Alice contains Kent and Mike • We assume that Kent is selected as the seed user

  34. Anonymization Example • Compute a region around Kent • Check whether the anonymization is appropriate

  35. Anonymization Example • The cloaked region contains three users with good matching degrees • The adversary can't detect the target user • Alice, Kent, and Mary are all young people • It is a good anonymization: the adversary learns only that the target user is a young person

  36. Experimental Evaluation

  37. Experimental Evaluation • Experimental settings: CPU 2.8 GHz, RAM 512 MB, Linux • Evaluation on synthetic data

  38. Threshold Values and Success Rates • The Matchmaker specifies a threshold value for the matching degree • Goal: find an appropriate threshold • An anonymization is successful when the region contains at least k users with good matching degrees (i.e., ≥ threshold) • The success rate is sensitive to the population • The threshold needs to be changed flexibly

  39. Computation Time • We compare the computation times of two approaches • One computes matching degrees • The other does not compute matching degrees and only considers the number of users • Computing matching degrees takes more than twice as long • We will try to improve the algorithms for computing matching degrees

  40. Conclusions & Future Work

  41. Conclusions and Future Work Conclusions • Proposed an approach to anonymization for LBSs • Utilizes user profiles to specify users' properties and anonymization preferences • Property-aware anonymization using matching degrees Future work • More experimental evaluation • Improving the algorithm

  42. Thank you
