
Privacy Preserving In LBS








  1. Privacy Preserving in LBS: Evaluating the Privacy of LBS Algorithms in a Dynamic Context

  2. Outline • Introduction • Design Model & Workflow System • Design Specification: General Approach • Build Privacy Case-Based Database • Conclusion & Future Work

  3. Design Specification: General Approach • Privacy Attack Models • User … • Service Provider … • Context Provider …

  4. Privacy Attack Models: Content • Introduction • Location Distribution Attack • Maximum Movement Boundary Attack • Query Tracking Attack

  5. Privacy Attack Models: Introduction • Privacy attacks are categorized by their privacy attack model (adversary model). • Attack models differ in: • The target information that is collected. • The attacker's ability to capture messages during service provisioning. • The attacker's background knowledge.

  6. Privacy Attack Models: Introduction (cont.)

  7. Privacy Attack Models: Location Distribution Attack • A location distribution attack takes place when: • User locations are known to the adversary. • Some users have outlier locations. • The employed spatial cloaking algorithm tends to generate minimum-area regions. [Figure: users A-F, with outlier A far from the dense cluster B, C, D] • Given a cloaked spatial region covering a sparse area (user A) and a partially dense area (users B, C, and D), an adversary can easily figure out that the query issuer is the outlier, as the sketch below makes concrete.
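The inference can be made concrete with a minimal Python sketch (not from the slides; the helper names min_area_cloak and identify_issuer are hypothetical). The adversary knows every user's location and the cloaking algorithm, so it can re-run a minimum-area cloak for each candidate and map the observed region back to a single issuer:

```python
# Hypothetical sketch: minimum-area cloaking leaks the outlier when the
# adversary knows all user locations and the cloaking algorithm.
def mbr(points):
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))

def min_area_cloak(user, users, k):
    # Cloak `user` with its k-1 nearest neighbors (a minimum-area heuristic).
    def dist(u):
        return (u[0] - user[0]) ** 2 + (u[1] - user[1]) ** 2
    return mbr(sorted(users, key=dist)[:k])

def identify_issuer(region, users, k):
    # Adversary: re-run the cloak for every user and keep the matches.
    return [u for u in users if min_area_cloak(u, users, k) == region]

users = [(0, 0), (9, 9), (9, 10), (10, 9), (10, 10), (11, 10)]  # A is the outlier
region = min_area_cloak(users[0], users, 3)   # outlier A issues a query with k = 3
print(identify_issuer(region, users, 3))      # -> [(0, 0)]: only A fits, A is exposed
```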

  8. Solution to the Location Distribution Attack: k-Sharing Property • k-Sharing region property: a cloaked spatial region not only contains at least k users, it is also shared by at least k of these users. • The same cloaked spatial region is produced for all k of those users, so an adversary cannot link the region to an outlier (see the sketch below). [Figure: users A-F grouped so that outlier A shares one region with other users] • The result is an overall more privacy-aware environment. • Techniques that are free from this attack include CliqueCloak.
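Continuing the sketch above (reusing mbr() and the same users list; the simple x-order grouping is an illustrative assumption, not CliqueCloak itself): once at least k users report the identical cloaked region, re-running the algorithm returns at least k candidates, so the outlier can no longer be singled out.

```python
# Hypothetical k-sharing cloak: every member of a group of >= k users
# reports the group's MBR, so any observed region maps back to >= k users.
def k_sharing_cloak(users, k):
    ordered = sorted(users)
    regions = {}
    for i in range(0, len(ordered), k):
        group = ordered[i:i + k]
        if len(group) < k:            # fold a short tail into the previous group
            group = ordered[i - k:]
        box = mbr(group)
        for u in group:
            regions[u] = box
    return regions

regions = k_sharing_cloak(users, 3)
observed = regions[(0, 0)]                            # the outlier's region
print([u for u in users if regions[u] == observed])   # -> 3 candidates, not 1
```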

  9. Solution to the Location Distribution Attack: the CliqueCloak Algorithm • Each user requests: • An anonymity level k. • A constraint area. • Build an undirected constraint graph: two nodes are linked if their constraint areas contain each other's positions (sketched below). [Figure: constraint graph with nodes A (k=3), B (k=4), C (k=2), D (k=4), E (k=3), F (k=5), H (k=4) and a new message m (k=3)] • For a new user m, add m to the graph, then find the set of nodes that are linked to m in the graph and have an anonymity level of at most m.k. • The cloaked region is the MBR (cloaking box) that covers the user and those neighboring nodes. All users within an MBR use that MBR as their cloaked region.
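A minimal sketch of the constraint-graph step, assuming a simple message record (the Msg fields and helper names are illustrative, not the paper's API): two messages are linked exactly when each one's position lies inside the other's constraint area, so one cloaking box can serve both.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Msg:
    uid: str
    x: float; y: float                               # P(m): current position
    bx1: float; by1: float; bx2: float; by2: float   # B(m): constraint area
    k: int                                           # requested anonymity level

def contains(m, x, y):
    # Is the point (x, y) inside m's constraint area B(m)?
    return m.bx1 <= x <= m.bx2 and m.by1 <= y <= m.by2

def add_to_graph(G, m, S):
    # Link m to every pending message n when the constraint areas of m and n
    # contain each other's positions.
    G[m] = set()
    for n in S:
        if contains(m, n.x, n.y) and contains(n, m.x, m.y):
            G[m].add(n)
            G[n].add(m)
```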

  10. CliqueCloak pseudo-code • Steps: build the constraint graph G; find a subset M of S such that m ∈ M, m.k = |M|, n.k ≤ |M| for each n ∈ M, and M forms a clique in G; build transformed messages from all messages in M.

while TRUE do
    pick a message m from S
    N ← all messages in range B(m)
    for each n in N do
        if P(m) is in B(n) then add the edge (m, n) into G
    M ← local_k_search(m.k, m, G)
    if M ≠ Ø then
        Bcl(M) ← the minimal area that contains M
        for each n in M do
            remove n from S
            remove n from G
            nT ← <n.uid, n.rno, Bcl(M), n.C>
            output transformed message nT
    remove expired messages from S
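Below is one way to render the main loop as runnable Python, reusing Msg and add_to_graph from the previous sketch; local_k_search is sketched after the next slide. The names clique_cloak_step and out, and the simplified output tuple, are assumptions.

```python
def clique_cloak_step(m, S, G, out):
    # Add edges between m and every compatible pending message, then search
    # for a clique of size m.k around m.
    add_to_graph(G, m, S)
    S.add(m)
    M = local_k_search(m.k, m, G)
    if M:                                       # a clique of size m.k was found
        xs = [n.x for n in M]
        ys = [n.y for n in M]
        Bcl = (min(xs), min(ys), max(xs), max(ys))  # minimal area covering M
        for n in M:
            S.discard(n)                        # remove n from S ...
            for nb in G.pop(n, set()):          # ... and from the graph G
                if nb in G:
                    G[nb].discard(n)
            out.append((n.uid, Bcl))  # simplified transformed message (full layout: slide 13)
    # (expired messages would also be purged from S here, as in the last line)
```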

  11. CliqueCloak pseudo-code: local_k_search • Steps: find the group U of m's neighbors in G whose anonymity value does not exceed k; repeatedly remove members of U with fewer than k-2 neighbors in U, since they cannot take part in a (k-1)-clique; finally look for a (k-1)-clique inside U, which forms a k-clique together with m.

local_k_search(k, m, G)
    U ← { n | (m, n) is an edge in G and n.k ≤ k }
    if |U| < k-1 then
        return Ø
    l ← 0
    while l ≠ |U| do
        l ← |U|
        for each u in U do
            if |{neighbors of u in U}| < k-2 then U ← U \ {u}
    find any subset M ⊆ U such that |M| = k-1 and M ∪ {m} forms a clique
    return M ∪ {m}
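A direct Python translation of local_k_search under the same assumptions (G maps each message to its neighbor set). The brute-force clique search at the end makes the cost explicit, matching the evaluation on slide 14.

```python
from itertools import combinations

def local_k_search(k, m, G):
    # Neighbors of m whose own anonymity demand a clique of size k can satisfy.
    U = {n for n in G[m] if n.k <= k}
    if len(U) < k - 1:
        return set()
    # Iteratively drop members with fewer than k-2 neighbors inside U:
    # they cannot take part in a (k-1)-clique.
    size = -1
    while size != len(U):
        size = len(U)
        U = {u for u in U if len(G[u] & U) >= k - 2}
    # Look for k-1 members of U that are pairwise adjacent; every member of U
    # is already adjacent to m, so together with m they form the k-clique.
    for M in combinations(U, k - 1):
        if all(b in G[a] for a, b in combinations(M, 2)):
            return set(M) | {m}
    return set()
```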

  12. CliqueCloak: Message specification • A plain message (from client to server) m consists of: • m.uid = unique identifier of the sender • m.rno = message's reference number • P(m) = message's spatial point (e.g. the client's current location) • B(m) = message's spatial constraint area • m.t = message's temporal constraint (expiration time) • m.C = message's content • m.k = message's anonymity level
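The attribute list maps naturally onto a record type; a sketch with assumed Python types follows (field names mirror the slide).

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class PlainMessage:
    uid: str                                  # m.uid: unique sender identifier
    rno: int                                  # m.rno: reference number
    p: Tuple[float, float]                    # P(m): spatial point (current location)
    b: Tuple[float, float, float, float]      # B(m): spatial constraint area
    t: float                                  # m.t: temporal constraint (expiration)
    content: str                              # m.C: message content
    k: int                                    # m.k: anonymity level
```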

  13. CliqueCloak: Message specification • A transformed message (from server to database) mT consists of: • m.uid, m.rno • Bcl(m) = message's spatial cloaking box • m.C
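Continuing the previous sketch, the transformed message keeps only four of those attributes; the transform helper is illustrative.

```python
@dataclass
class TransformedMessage:
    uid: str                                  # m.uid
    rno: int                                  # m.rno
    bcl: Tuple[float, float, float, float]    # Bcl(m): spatial cloaking box
    content: str                              # m.C

def transform(m: PlainMessage, bcl) -> TransformedMessage:
    # The exact position, constraint area, deadline and k never leave the server.
    return TransformedMessage(m.uid, m.rno, bcl, m.content)
```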

  14. CliqueCloak: Evaluation • Pros: • Free from the location distribution attack (query sampling attack). • Cons: • High computational cost: in practice it supports k-anonymity only up to about k = 10. • Searching for a clique in a graph is expensive. • Requests that cannot be anonymized are dropped once their lifetimes expire.

  15. Attack Model Privacy: Message attributes summary • A plain message sent from the user consists of 9 attributes: • Id: unique identifier of the sender. • Ref: message's reference number. • P: message's spatial point (e.g. the user's current location). • B: message's spatial constraint area. • t: message's temporal constraint (expiration time). • v: velocity / maximum speed. • QoS: quality of service. • C: message's content. • k: message's anonymity level.
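Rendered as an extension of the earlier PlainMessage sketch, the two extra attributes look as follows (the defaults are assumptions).

```python
@dataclass
class FullMessage(PlainMessage):
    v: float = 0.0                            # maximum speed / velocity
    qos: float = 1.0                          # requested quality of service
```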

  16. Conclusion & Future Work

