  1. Privacy Preserving in LBS: Evaluating Privacy of LBS Algorithms in Dynamic Context

  2. Outline • Introduction • Design Model & Workflow System • Design Specification: General Approach • Build Privacy Case-Based Database • Conclusion & Future Work

  3. Introduction (1) • What is Context? [1] • Any information that can be used to characterize the situation of an entity, where an entity is a person, place, or object considered relevant to the interaction between: • the User • the Application • (including the user and the application themselves)

  4. Introduction (2) • The problem of privacy preservation in a dynamic context: • different services require different privacy algorithms • even within a single service, the appropriate algorithm can change with the context • How can we evaluate privacy algorithms in a dynamic context?

  5. Outline • Introduction • Design Model & Workflow System • Design Model System • Workflow System • Design Specification: General Approach • Build Privacy Case-Based Database • Conclusion & Future Work

  6. Design Model System (1)

  7. Evaluation Module [2][3]

  8. Workflow System

  9. Outline • Introduction • Design Model & Workflow System • Design Specification: General Approach • Build Privacy Case-Based Database • Conclusion & Future Work

  10. Design Specification: General Approach • Introduction to Privacy Attack Models • Location Distribution Attack • Maximum Movement Boundary Attack • Query Tracking Attack • Message attributes summary

  11. Introduction to Privacy Attack Models • Privacy attacks are categorized by privacy attack model (adversary model). • Attack models differ in: • the target information they collect • the attacker's ability to capture messages during service provisioning • the attacker's background knowledge

  12. Privacy Attack Models: Introduction (cont.)

  13. Privacy Attack Models: Content • Introduction • Location Distribution Attack • Maximum Movement Boundary Attack • Query Tracking Attack • Message attributes summary

  14. Privacy Attack Models: Location Distribution Attack • A location distribution attack takes place when: • user locations are known • some users have outlier locations • the employed spatial cloaking algorithm tends to generate minimum areas • Example (figure with users A-F): given a cloaked spatial region covering a sparse area (user A) and a partially dense area (users B, C, and D), an adversary can easily figure out that the query issuer is the outlier.

  15. Solution to Location Distribution Attack: k-Sharing Property • k-sharing region property: a cloaked spatial region not only contains at least k users, it is also shared by at least k of those users. • Because the same cloaked spatial region is produced for all k users, an adversary cannot link the region to an outlier (see the sketch below). • Results in an overall more privacy-aware environment. • Techniques that are free from this attack include CliqueCloak.
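  A minimal sketch of the distinction the k-sharing property draws: it is not enough that k users happen to lie inside the region, the very same region must have been assigned to at least k of them. The names Region, assignments, and satisfies_k_sharing are illustrative and not part of CliqueCloak itself.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Region:
    """An axis-aligned cloaking rectangle."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float

def satisfies_k_sharing(region: Region, assignments: dict, k: int) -> bool:
    """assignments maps a user id to the Region that user was given.

    The k-sharing property holds when at least k users were assigned exactly
    this region, not merely located somewhere inside it.
    """
    sharers = sum(1 for r in assignments.values() if r == region)
    return sharers >= k

# Example: users A-F all received the same region, so it is (at least) 3-shared.
shared = Region(0, 0, 10, 10)
assert satisfies_k_sharing(shared, {u: shared for u in "ABCDEF"}, k=3)
```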

  16. Solution to Location Distribution Attack: CliqueCloak Algorithm • Each user requests: • a level of k-anonymity • a constraint area • Build an undirected constraint graph: two nodes are linked if their constraint areas contain each other's locations (the figure shows nodes A-H and a new node m with individual requirements, e.g. A (k=3), B (k=4), C (k=2), m (k=3)). • For a new user m, add m to the graph, then find the set of nodes that are linked to m in the graph and whose anonymity level does not exceed m.k. • The cloaked region is the MBR (cloaking box) that includes the user and the neighboring nodes. All users within an MBR use that MBR as their cloaked region.

  17. Solution to Location Distribution Attack: CliqueCloak Pseudo-code • Steps: build the constraint graph G; find a subset M of S such that m is in M, m.k = |M|, every n in M has n.k ≤ |M|, and M forms a clique in G; then build transformed messages from all messages in M (a Python sketch of the whole procedure follows after the next slide). • Pseudo-code:
  while TRUE do
    pick a message m from S
    N ← all messages in range B(m)
    for each n in N do
      if P(m) is in B(n) then add the edge (m, n) into G
    M ← local_k_search(m.k, m, G)
    if M ≠ Ø then
      Bcl(M) ← the minimal area that contains M
      for each n in M do
        remove n from S
        remove n from G
        nT ← < n.uid, n.rno, Bcl(M), n.C >
        output transformed message nT
    remove expired messages from S

  18. Solution to Location Distribution Attack: CliqueCloak Pseudo-code (cont.) • Steps: find the group U of m's neighbors in G whose anonymity values do not exceed k; repeatedly remove members of U with fewer than k-2 neighbors in U, since they cannot take part in a (k-1)-clique; finally look for a (k-1)-subset of U that forms a k-clique together with m (see the Python sketch below). • Pseudo-code:
  local_k_search(k, m, G)
    U ← { n | (m, n) is an edge in G and n.k ≤ k }
    if |U| < k-1 then return Ø
    l ← 0
    while l ≠ |U| do
      l ← |U|
      for each u in U do
        if |{G neighbors of u in U}| < k-2 then U ← U \ {u}
    find any subset M of U s.t. |M| = k-1 and M ∪ {m} forms a clique
    return M ∪ {m}
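  Below is a condensed, runnable Python sketch of the two routines above. It is an illustration under simplifying assumptions rather than the authors' implementation: rectangles stand in for constraint areas, expired-message cleanup is omitted, and the (k-1)-clique search is a brute-force scan over combinations, which is only practical for the small k values CliqueCloak targets.

```python
from dataclasses import dataclass
from itertools import combinations
from typing import Dict, Set, Tuple

Rect = Tuple[float, float, float, float]   # (x_min, y_min, x_max, y_max)
Point = Tuple[float, float]

@dataclass
class Message:
    uid: int          # m.uid: unique sender id
    rno: int          # m.rno: reference number
    p: Point          # P(m): the user's location
    b: Rect           # B(m): spatial constraint area
    k: int            # m.k: requested anonymity level
    content: str      # m.C

def contains(rect: Rect, p: Point) -> bool:
    x0, y0, x1, y1 = rect
    return x0 <= p[0] <= x1 and y0 <= p[1] <= y1

def mbr(msgs) -> Rect:
    xs = [m.p[0] for m in msgs]
    ys = [m.p[1] for m in msgs]
    return (min(xs), min(ys), max(xs), max(ys))

def local_k_search(k: int, m: Message, graph: Dict[int, Set[int]],
                   msgs: Dict[int, Message]) -> Set[int]:
    # U: neighbours of m whose anonymity requirement does not exceed k
    U = {n for n in graph[m.uid] if msgs[n].k <= k}
    if len(U) < k - 1:
        return set()
    # prune nodes that cannot be part of a (k-1)-clique
    changed = True
    while changed:
        changed = False
        for u in list(U):
            if len(graph[u] & U) < k - 2:
                U.discard(u)
                changed = True
    # brute-force search for a (k-1)-subset that forms a clique with m
    for cand in combinations(sorted(U), k - 1):
        if all(b in graph[a] for a, b in combinations(cand, 2)):
            return set(cand) | {m.uid}
    return set()

def clique_cloak_step(m: Message, S: Dict[int, Message],
                      graph: Dict[int, Set[int]]):
    """Insert message m and return transformed messages if m can be cloaked."""
    S[m.uid] = m
    graph.setdefault(m.uid, set())
    for n in S.values():
        # edge (m, n) if the two constraint areas contain each other's points
        if n.uid != m.uid and contains(m.b, n.p) and contains(n.b, m.p):
            graph[m.uid].add(n.uid)
            graph[n.uid].add(m.uid)
    M = local_k_search(m.k, m, graph, S)
    if not M:
        return []                        # m stays queued in S
    box = mbr([S[u] for u in M])         # Bcl(M): the shared cloaking box
    transformed = []
    for u in M:
        n = S.pop(u)
        for v in graph.pop(u, set()):
            if v in graph:
                graph[v].discard(u)
        transformed.append((n.uid, n.rno, box, n.content))
    return transformed
```

  In use, each incoming message would be passed to clique_cloak_step with the shared S and graph dictionaries; the transformed messages carry the common cloaking box in place of the exact location, which is what gives the group its k-sharing guarantee.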

  19. Solution to Location Distribution Attack: CliqueCloak Message Specification • A plain message (from client to server) m consists of: • m.uid = unique identifier of the sender • m.rno = message's reference number • P(m) = message's spatial point (e.g. the client's current location) • B(m) = message's spatial constraint area • m.t = message's temporal constraint (expiration time) • m.C = message's content • m.k = message's anonymity level

  20. Solution to Location Distribution Attack: CliqueCloak Message Specification (cont.) • A transformed message (from server to database) mT consists of: • m.uid, m.rno • Bcl(m) = message's spatial cloaking box • m.C

  21. Solution to Location Distribution Attack: Evaluation of CliqueCloak • Pros: • free from the location distribution attack (query sampling attack) • Cons: • high computational cost; it supports k-anonymity only up to about k = 10 • searching for a clique in a graph is expensive • requests that cannot be anonymized are dropped when their lifetimes expire

  22. Privacy Attack Models: Content • Introduction • Location Distribution Attack • Maximum Movement Boundary Attack • Query Tracking Attack • Message attributes summary

  23. Privacy Attack Models: Maximum Movement Boundary Attack • The maximum movement boundary attack takes place when: • continuous location updates or continuous queries are considered • the same pseudonym is used for two consecutive updates • the maximum possible speed is known • The maximum speed is used to derive a maximum movement boundary (MMB) for the previous region Ri. • The user must lie in the intersection of that MMB with the new cloaked region Ri+1; as the figure puts it, "I know you are here!"

  24. Solution to Maximum Movement Boundary Attack: Safe Update Property [4] • Two consecutive cloaked regions Ri and Ri+1 from the same user are free from the maximum movement boundary attack if one of these three conditions holds (illustrated in the figure; a checking sketch follows below): • the MMB of Ri totally covers Ri+1 • the overlapping area satisfies the user's requirements • Ri totally covers Ri+1
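  A small sketch of how the safe update property could be checked, assuming rectangular cloaked regions and an MMB obtained by expanding Ri by v_max * dt in every direction; Rect, mmb, and is_safe_update are illustrative names, and the overlap test uses a caller-supplied minimum area as a stand-in for "satisfies user requirements".

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Rect:
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def area(self) -> float:
        return max(0.0, self.x_max - self.x_min) * max(0.0, self.y_max - self.y_min)

    def covers(self, other: "Rect") -> bool:
        return (self.x_min <= other.x_min and self.y_min <= other.y_min and
                self.x_max >= other.x_max and self.y_max >= other.y_max)

    def intersect(self, other: "Rect") -> "Rect":
        return Rect(max(self.x_min, other.x_min), max(self.y_min, other.y_min),
                    min(self.x_max, other.x_max), min(self.y_max, other.y_max))

def mmb(r: Rect, v_max: float, dt: float) -> Rect:
    """Expand Ri by the maximum distance the user can travel within dt."""
    d = v_max * dt
    return Rect(r.x_min - d, r.y_min - d, r.x_max + d, r.y_max + d)

def is_safe_update(r_i: Rect, r_next: Rect, v_max: float, dt: float,
                   min_overlap_area: float) -> bool:
    """True if one of the three safe-update conditions on the slide holds."""
    boundary = mmb(r_i, v_max, dt)
    if boundary.covers(r_next):                                 # condition 1
        return True
    if boundary.intersect(r_next).area() >= min_overlap_area:   # condition 2
        return True
    return r_i.covers(r_next)                                   # condition 3
```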

  25. Solution to Maximum Movement Boundary Attack: Patching and Delaying [4][9] • Patching: combine the current cloaked spatial region with the previous one. • Delaying: postpone the update until the MMB covers the current cloaked spatial region. • (Both are illustrated in the figure with Ri and Ri+1; a sketch of both follows below.)
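  A compact sketch of the two fixes, with rectangles as plain (x_min, y_min, x_max, y_max) tuples; the step-based delay loop is an assumption about how "postpone until the MMB covers the region" might be approximated.

```python
# Rectangles are (x_min, y_min, x_max, y_max) tuples.
def patch(r_prev, r_cur):
    """Patching: return the MBR of the previous and current cloaked regions."""
    return (min(r_prev[0], r_cur[0]), min(r_prev[1], r_cur[1]),
            max(r_prev[2], r_cur[2]), max(r_prev[3], r_cur[3]))

def delay_until_safe(r_prev, r_cur, v_max, dt, max_wait):
    """Delaying: wait until the MMB of the previous region covers the new one.

    Returns the number of time steps to postpone the update, or None if the
    MMB never grows enough within max_wait steps.
    """
    for steps in range(1, max_wait + 1):
        d = v_max * dt * steps
        mmb = (r_prev[0] - d, r_prev[1] - d, r_prev[2] + d, r_prev[3] + d)
        if (mmb[0] <= r_cur[0] and mmb[1] <= r_cur[1] and
                mmb[2] >= r_cur[2] and mmb[3] >= r_cur[3]):
            return steps
    return None
```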

  26. Solution to Maximum Movement Boundary Attack: Using the ICliqueCloak Algorithm [10] • Main idea: incrementally maintain maximal cliques for location cloaking in an undirected graph, taking into account the effect of continuous location updates. • Use a graph model to formulate the problem: • each mobile user is represented by a node in the graph • an edge exists between two nodes/users only if they are within the MMB of each other and can potentially be cloaked together

  27. Solution to Maximum Movement Boundary Attack: Using the ICliqueCloak Algorithm [10] • Graph modeling • Let G(V, E) be an undirected graph where V is the set of nodes/users who submitted location-based query requests and E is the set of edges. • There exists an edge evw between two nodes/users v and w if and only if each lies within the other's maximum movement boundary, so the two can potentially be cloaked together (the formal condition appeared only as a figure; a checking sketch follows below).
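  A minimal sketch of the edge condition as described on the previous slide (mutual containment in each other's maximum movement boundary). The circular MMB derived from a point location and a per-user maximum speed is my simplification; the paper derives the MMB from the previously reported cloaked region.

```python
import math

def within_mmb(p_from, p_to, v_max, dt):
    """True if p_to is reachable from p_from within dt at speed <= v_max,
    i.e. p_to lies inside the (circular) maximum movement boundary of p_from."""
    return math.dist(p_from, p_to) <= v_max * dt

def edge_exists(v_pos, w_pos, v_max_v, v_max_w, dt):
    """Edge e_vw exists only if v and w lie within each other's MMB."""
    return (within_mmb(v_pos, w_pos, v_max_v, dt) and
            within_mmb(w_pos, v_pos, v_max_w, dt))
```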

  28. Solution to Maximum Movement Boundary Attack: Using the ICliqueCloak Algorithm [10] • Algorithm • A maximal clique is a clique that is not contained in any other clique. • Start with a graph without any edges; all nodes by themselves constitute a set of 1-node cliques. • Then add the edges to the graph one by one and incrementally update the set of maximal cliques (see the sketch below). • The cliques involving the user of the new request are possible candidate cloaking sets, classified into three classes: • positive candidates • negative candidates • non-candidates
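  A rough sketch of the incremental step using networkx (assumed available): when a new node v arrives with its edges, every maximal clique containing v is a maximal clique of the subgraph induced by v's neighbors, plus v itself. The classification of these cliques into positive/negative candidates is not reproduced here.

```python
import networkx as nx

def cliques_containing_new_node(G: nx.Graph, v, neighbors) -> list:
    """Insert node v with edges to `neighbors` and return the maximal cliques
    of G that contain v. Every such clique is a maximal clique of the subgraph
    induced by v's neighborhood, extended with v itself."""
    G.add_node(v)
    G.add_edges_from((v, n) for n in neighbors if n in G)
    neighborhood = G.subgraph(G[v])      # subgraph induced by v's neighbors
    if neighborhood.number_of_nodes() == 0:
        return [{v}]                     # isolated node: the 1-node clique {v}
    return [set(c) | {v} for c in nx.find_cliques(neighborhood)]
```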

  29. Solution to Maximum Movement Boundary Attack: Using the ICliqueCloak Algorithm [10] • Performance

  30. Maximum Movement Boundary Attack: Attributes • Location • Time • Maximum velocity • Privacy level k • User-tolerated maximum area Amax

  31. Privacy Attack Models: Content • Introduction • Location Distribution Attack • Maximum Movement Boundary Attack • Query Tracking Attack • Message attributes summary

  32. Query Attacks • k-anonymity techniques: • Interval Cloak, CliqueCloak, Uncertainty Cloaking, … • Query attacks: • query sampling attacks • query homogeneity attacks • query tracking attacks

  33. Query Homogeneity Attacks [12]

  34. Query Tracking Attacks [4] • This attack takes place when: • continuous location updates or continuous queries are considered • the same pseudonym is used for several consecutive updates • user locations are known • Once a query is issued, all users in the query region are candidates to be the query issuer. • If the query is reported again, the intersection of the candidate sets between the query instances reduces the user's privacy (see the sketch below). • Example (figure with users A-K): at time ti the candidates are {A, B, C, D, E}; at time ti+1 they are {A, B, F, G, H}; at time ti+2 they are {A, F, G, H, I}.
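  A tiny sketch of the attack logic using the candidate sets from the example above: intersecting the candidate sets observed at consecutive query instances narrows the issuer down to a single user.

```python
def track_query_issuer(candidate_sets):
    """Intersect the candidate sets observed at consecutive query instances."""
    suspects = set(candidate_sets[0])
    for s in candidate_sets[1:]:
        suspects &= set(s)
    return suspects

# Candidate sets from the slide's example:
snapshots = [{"A", "B", "C", "D", "E"},
             {"A", "B", "F", "G", "H"},
             {"A", "F", "G", "H", "I"}]
print(track_query_issuer(snapshots))   # {'A'}: the issuer is re-identified
```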

  35. Solutions • Memorizing • m-Invariance • Historical k-Anonymity • …

  36. Memorizing [4] • Remember the set of users S contained in the cloaked spatial region when the query is initially registered with the database server (illustrated in the figure with users A-K; a sketch follows below). • Adjust the subsequent cloaked spatial regions to contain at least k of these users. • If a user s is not contained in a subsequent cloaked spatial region, that user is immediately removed from S. • This may result in a very large cloaked spatial region; at some point, the server may decide to disconnect the query and restart it with a new identity.
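  A simplified sketch of the bookkeeping described above; the construction of the enlarged cloaked region is abstracted away, and the names are illustrative.

```python
def memorizing_update(S, users_in_region, k):
    """One update step of the memorizing approach.

    S: set of user ids remembered when the query was first registered.
    users_in_region: user ids inside the newly proposed cloaked region.
    Returns (new_S, action): 'ok' if at least k remembered users remain in
    the region, otherwise 'restart' (disconnect the query and restart it
    under a new identity).
    """
    new_S = S & users_in_region          # drop remembered users that left
    if len(new_S) >= k:
        return new_S, "ok"
    return set(), "restart"

# Continuing the slide's example with k = 3:
S = {"A", "B", "C", "D", "E"}
S, action = memorizing_update(S, {"A", "B", "F", "G", "H"}, k=3)
print(S, action)   # {'A', 'B'} has fewer than 3 users -> restart
```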

  37. Query m-Invariance [11][13] • Query ℓ-diversity: ensures that a user cannot be linked to fewer than ℓ distinct service attribute values.

  38. Query m-Invariance (cont.) • Satisfying location k-anonymity. • Satisfying query ℓ-diversity. • (The figure shows a cloaking set that is query 3-diverse and location 3-anonymous; a checking sketch follows below.)
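  A small sketch of the two checks, assuming a cloaking set is given as (user_id, service_attribute_value) pairs; the names and the example values are illustrative.

```python
def is_k_anonymous_and_l_diverse(cloaking_set, k, l):
    """cloaking_set: list of (user_id, service_attribute_value) pairs.

    Location k-anonymity: at least k distinct users in the cloaked region.
    Query l-diversity:    at least l distinct service attribute values, so the
                          issuer cannot be linked to fewer than l values.
    """
    users = {uid for uid, _ in cloaking_set}
    values = {val for _, val in cloaking_set}
    return len(users) >= k and len(values) >= l

# Illustrative example of a set that is location 3-anonymous and query 3-diverse.
example = [("u1", "hospital"), ("u2", "bar"), ("u3", "gas station")]
assert is_k_anonymous_and_l_diverse(example, k=3, l=3)
```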

  39. Query m-Invariance (cont.) • Query m-invariance: the number of possible query association attacks increases if a user can be associated with a larger number of service attribute values.

  40. Attributes • A plain message sent from the user includes: • Id: unique identifier of the sender. • Ref: message's reference number. • P: message's spatial point (e.g. the user's current location). • C: message's content. • k: message's anonymity level. • ℓ: message's diversity level. • m: message's invariance level.

  41. Privacy Attack Models: Content • Introduction • Location Distribution Attack • Maximum Movement Boundary Attack • Query Tracking Attack • Message attributes summary

  42. Attack Model Privacy: Message Attributes Summary • A plain message sent from the user must consist of 11 attributes (see the sketch below): • Id: unique identifier of the sender. • Ref: message's reference number. • P: message's spatial point (e.g. the user's current location). • B: message's spatial constraint area. • t: message's temporal constraint (expiration time). • v: velocity / maximum speed. • QoS: quality of service. • C: message's content. • k: message's anonymity level. • ℓ: message's diversity level. • m: message's invariance level.
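  One way the 11-attribute plain message could be represented, shown as a plain Python dataclass; the field types are assumptions, not part of the specification above.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class PlainMessage:
    """The 11 attributes of a plain client message (see the summary above)."""
    id: str                               # unique identifier of the sender
    ref: int                              # reference number of the message
    p: Tuple[float, float]                # spatial point, e.g. current location
    b: Tuple[float, float, float, float]  # spatial constraint area (rectangle)
    t: float                              # temporal constraint (expiration time)
    v: float                              # velocity / maximum speed
    qos: float                            # quality-of-service requirement
    c: str                                # message content
    k: int                                # anonymity level
    l: int                                # diversity level (ℓ)
    m: int                                # invariance level
```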

  43. Outline • Introduction • Design Model & Workflow System • Design Specification: General Approach • Build Privacy Case-Based Database • Conclusion & Future Work

  44. Build Privacy Case-Based Database • From the attack models and attributes we found, a case will include: • the input attributes • a graph • the algorithm used to protect privacy • Specification: • define an interval for each attribute • define the properties that the input must satisfy • Notes: • to reduce computation, we calculate only on the subgraph related to the query issuer • the database deletes queries whose lifetimes have expired • (A sketch of such a case record follows below.)
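  A sketch of what a case record and its retrieval might look like, based on the fields listed above (attribute intervals plus the algorithm recorded for the case); the structure and the first-match rule are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Dict, Tuple

@dataclass
class PrivacyCase:
    """One case in the privacy case-based database."""
    # Interval constraints on the input attributes, e.g. {"k": (2, 10)}.
    attribute_intervals: Dict[str, Tuple[float, float]]
    # The privacy algorithm recorded for this case, e.g. "CliqueCloak".
    algorithm: str

    def matches(self, attributes: Dict[str, float]) -> bool:
        """True if every constrained attribute is present and within its interval."""
        return all(name in attributes and lo <= attributes[name] <= hi
                   for name, (lo, hi) in self.attribute_intervals.items())

def select_algorithm(cases, attributes):
    """Return the algorithm of the first case whose intervals match the query."""
    for case in cases:
        if case.matches(attributes):
            return case.algorithm
    return None

# Hypothetical usage: a continuous query with k = 5 and a known maximum speed.
cases = [PrivacyCase({"k": (2, 10), "v": (0, 30)}, "ICliqueCloak")]
print(select_algorithm(cases, {"k": 5, "v": 12.0}))   # ICliqueCloak
```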

  45. Outline • Introduction • Design Model & Workflow System • Design Specification: General Approach • Build Privacy Case-Based Database • Conclusion & Future Work

  46. Conclusion • Evaluating privacy algorithms in a dynamic context needs a flexible technique: • case-based calculation • an ontology reasoner • Attack models are the core component of the case-based calculation.

  47. Future Work • Continue the case-base specification: • research other attack models • study the User, CP, and SP • Select an appropriate structure for the case-base data: • tree structure: a parent node represents a more general case • Specify the ontology reasoner.

  48. References • [1] Anind K. Dey and Gregory D. Abowd. Towards a Better Understanding of Context and Context-Awareness. Graphics, Visualization and Usability Center and College of Computing, Georgia Tech, Atlanta, GA, USA, 2000. • [2] Yonnim Lee and Ohbyung Kwon. An Index-Based Privacy Preserving Service Trigger in Context-Aware Computing Environments. Expert Systems with Applications 37, pp. 5192-5200, 2010. • [3] Claudio Bettini, Linda Pareschi, and Daniele Riboni. Efficient Profile Aggregation and Policy Evaluation in a Middleware for Adaptive Mobile Applications. Pervasive and Mobile Computing, ISSN 1574-1192, pp. 697-718, October 2008. • [4] Mohamed F. Mokbel. Privacy in Location-Based Services: State-of-the-Art and Research Directions. 2007 International Conference on Mobile Data Management (MDM 2007). • [5] B. Gedik and L. Liu. A Customizable k-Anonymity Model for Protecting Location Privacy. Proc. IEEE Int'l Conf. Distributed Computing Systems (ICDCS '05), pp. 620-629, 2005. • [6] B. Gedik and L. Liu. Location Privacy in Mobile Systems: A Personalized Anonymization Model. In ICDCS, 2005.

  49. References • [7] Z. Xiao, X. Meng, and J. Xu. Quality Aware Privacy Protection for Location-Based Services. Proc. 12th Int'l Conf. on Database Systems for Advanced Applications (DASFAA '07), Bangkok, Thailand, April 2007. • [8] Chi-Yin Chow and Mohamed F. Mokbel. Enabling Private Continuous Queries for Revealed User Locations. Proc. Int'l Symp. on Spatial and Temporal Databases (SSTD), 2007. • [9] Reynold Cheng, Yu Zhang, Elisa Bertino, and Sunil Prabhakar. Preserving User Location Privacy in Mobile Data Management Infrastructures. In Proceedings of the Privacy Enhancing Technologies Workshop (PET), 2006. • [10] X. Pan, J. Xu, and X. Meng. Protecting Location Privacy against Location-Dependent Attack in Mobile Services. In Proceedings of the 17th ACM Conference on Information and Knowledge Management (CIKM 2008), Napa Valley, California, USA, October 26-30, 2008. • [11] Rinku Dewri, Indrakshi Ray, Indrajit Ray, and Darrell Whitley. Query m-Invariance: Preventing Query Disclosures in Continuous Location-Based Services. 11th International Conference on Mobile Data Management (MDM 2010), Kansas City, Missouri, USA, May 23-26, 2010.

  50. References • [12] Fuyu Liu, Kien A. Hua, and Ying Cai. Query l-Diversity in Location-Based Services. In Proceedings of the 10th International Conference on Mobile Data Management: Systems, Services and Middleware, 2009, pp. 436-442. • [13] Panos Kalnis, Gabriel Ghinita, Kyriakos Mouratidis, and Dimitris Papadias. Preventing Location-Based Identity Inference in Anonymous Spatial Queries. IEEE Transactions on Knowledge and Data Engineering, vol. 19, no. 12, pp. 1719-1733, 2007.
