
Topics in Computer Security: Introduction: PETs and TETs

Presentation Transcript


  1. Topics in Computer Security: Introduction: PETs and TETs Simone Fischer-Hübner

  2. Overview • Introduction to Privacy • Introduction to PETs • Transparency Enhancing Tools • Anonymous Communication Technologies & TOR • Private Information Retrieval

  3. I. Introduction to Privacy: Privacy Dimensions • Informational self-determination • Spatial privacy

  4. Basic Privacy principles (implemented in EU Directive 95/46/EC) • Legitimisation by law, informed consent (Art. 7 EU Directive) • Data minimisation and avoidance (Art. 6 I c, Art. 7) • Purpose specification and purpose binding (Art. 6 I b) • ”Non-sensitive” data do not exist!

  5. Example for Purpose Misuse • Lidl Video Monitoring Scandal

  6. Basic privacy principles (II) • Transparency, rights of data subjects • Supervision (Art. 28) and Sanctions (Art. 24) • Requirement of security mechanisms (Art. 17)

  7. EU Directive 2002/58/EC on privacy and electronic communications • Location data other than Traffic data (Art. 9): • May only be processed when made anonymous, or with the informed consent of the user/subscriber • Where consent has been obtained, the user/subscriber must still have the possibility of temporarily refusing the processing of location data

  8. Privacy Challenges of Emerging Technologies... • Global networks, cookies, web bugs, spyware, ... • Location-based services (LBS) • Ambient Intelligence, RFID • Biometrics...

  9. Privacy Risks of Social Networks - Facebook • Intimate personal details about social contacts, personal life, etc. • Not only accessible by ”friends” • The Internet never forgets completely....

  10. Privacy Risks of Social Networks – Facebook (II)

  11. Privacy Risks of Social Networks – Facebook Beacons

  12. II. Introduction to PETs: Need for Privacy-Enhancing Technologies • Law alone is not sufficient for protecting privacy in our Network Society • PETs needed for implementing the law • PETs for empowering users to exercise their rights

  13. Classifications of PETs 1. PETs for minimizing/ avoiding personal data (-> Art. 6 I c., e. EU Directive 95/46/EC) (providing Anonymity, Pseudonymity, Unobservability, Unlinkability) • At communication level: • Mix nets • Onion Routing, TOR • DC nets • Crowds • At application level: • Anonymous Ecash • Private Information Retrieval • Anonymous Credentials 2. PETs for the safeguarding of lawful processing (-> Art. 17 EU Directive 95/46/EC) • P3P • Privacy policy languages • Transparency Enhancing Tools (TETs) 3. Combination of 1 & 2 • Privacy-enhanced Identity Management

  14. III. Transparency Enhancing Tools: Directive 95/46/EC - Transparency • Art. 6: personal data must be processed fairly and lawfully • Recital No. 38: data subject must be given accurate and full information • Art. 10/11: Controller must provide information on • the identity of the controller / representative • the purposes of the processing • any further information (recipients, whether replies are obligatory or voluntary, consequences of failure to reply, existence of the right of access / to rectify the data) • Art. 12 (a): Right of access (get information from the controller about e.g. data processing, purpose, recipients, etc.) • Art. 12 (b): Right to rectification, blocking, deletion • Art. 14: ensure that data subjects are aware of the existence of the right to object, e.g. to data processing for direct marketing

  15. Flash Eurobarometer 2003 Survey • 37% of companies said they systematically provide data subjects with the identity of the data controller • 46% said they always informed data subjects of the purpose for which the data would be used • 42% of EU citizens are aware that those collecting personal information are obliged to provide individuals with certain information (such as at least their identity and the purpose of the data collection)

  16. Transparency Enhancing Tools: Example: “Data Track” in PRIME • Transparency: ”Data Track” providing: • User-side DB with user-friendly (advanced) search function for transaction records (incl. data, pseudonyms, credentials, timestamp, policy) • Online functions for exercising rights

  17. Online Functions for Exercising Rights • Problem: Users do not know their privacy rights and do not exercise them • Can online functions help to overcome this threshold and raise trust?

  18. Issues to be addressed at the user side • Authentication for digital identity – not straightforward if pseudonyms were used • An access request should not reveal more than what is already known by the service provider

  19. Issues to be addressed by the service side • Service-side automated response support needed • Laws might need to be updated to allow online requests (e.g. in Sweden the PUL only provides the right to access data once a year) • Service-side transparency and accountability tools need to be privacy-enhanced

  20. Example: E-Government Transparency Service: MyPage/MinSide • Provides full transparency, but could also be used as a perfect profiling tool

  21. Accountability vs. Privacy • For transparency of data use/accountability: ”Policy-aware” transaction logs needed, which however contain personal data about users and data subjects • Appropriate protection schemes for logs needed (access control, pseudonymisation, ...)

  22. IV. Anonymous Communication Technologies: Definitions - Anonymity • Anonymity: The state of being not identifiable within a set of subjects (e.g. set of senders or recipients), the anonymity set Source: Pfitzmann/Hansen

  23. Definitions - Unobservability • Unobservability ensures that a user may use a resource or service without others being able to observe that the resource or service is being used Source: Pfitzmann/Hansen

  24. Definitions - Unlinkability • Unlinkability of two or more items (e.g., subjects, messages, events): • Within the system, from the attacker’s perspective, these items are no more or less related after the attacker’s observation than they were before • Unlinkability of sender and recipient (relationship anonymity): • It is untraceable who is communicating with whom

  25. Definitions - Pseudonymity • Pseudonymity is the use of pseudonyms as IDs • Pseudonymity allows providing both privacy protection and accountability • Pseudonym types, ordered from highest to lowest linkability: person pseudonym, role pseudonym, relationship pseudonym, role-relationship pseudonym, transaction pseudonym Source: Pfitzmann/Hansen

  26. Definitions - Pseudonymity (cont.) Source: Pfitzmann/Hansen

  27. Mix-nets (Chaum, 1981) • Alice sends to Mix 1 the nested-encrypted message K1{A2, r1, K2{A3, r2, K3{Bob, r3, msg}}} • Mix 1 forwards K2{A3, r2, K3{Bob, r3, msg}} to Mix 2, Mix 2 forwards K3{Bob, r3, msg} to Mix 3, and Mix 3 delivers msg to Bob • Ki: public key of Mixi, ri: random number, Ai: address of Mixi

  28. Functionality of a Mix Server (Mixi) • Input message Mi • Collect messages in batch or pool (sufficient messages from many senders?) • Discard repeated messages • Change outlook *) • Reorder • Output message Mi+1 to Mixi+1 • *) decrypts Mi = EKi[Ai+1, ri, Mi+1] with the private key of Mixi, ignores random number ri, obtains address Ai+1 and encrypted Mi+1
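A minimal sketch of one such mix round, with public-key decryption modelled as unwrapping a tagged tuple rather than real cryptography; the helper names (decrypt, mix_round, seen) are illustrative and not from the slides:

```python
import random

def decrypt(private_key_id, msg):
    """Toy stand-in for decrypting Mi = EKi[Ai+1, ri, Mi+1] with the mix's private key."""
    tag, key_id, payload = msg
    assert tag == "enc" and key_id == private_key_id
    return payload                       # (next_addr, random_number, inner_message)

def mix_round(private_key_id, batch, seen):
    """One flushing round: discard repeats, strip one encryption layer, ignore the
    random number, reorder, and return (next_addr, inner_message) pairs."""
    outputs = []
    for msg in batch:
        fingerprint = repr(msg)
        if fingerprint in seen:          # discard repeated messages (replay protection)
            continue
        seen.add(fingerprint)
        next_addr, _r, inner = decrypt(private_key_id, msg)
        outputs.append((next_addr, inner))
    random.shuffle(outputs)              # reorder so input/output order is unlinkable
    return outputs
```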

  29. Why are random numbers needed? • If no random number ri is used, an attacker who observes an output (M, Ai+1) can re-encrypt it with the public key Ki of Mixi and compare the result EKi(M, Ai+1) with the observed input messages, thereby linking input and output of the mix
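The linking attack that the random numbers prevent can be illustrated with a deterministic toy "encryption"; all names here are illustrative:

```python
def enc(public_key_id, data):
    """Deterministic toy 'encryption' with no random padding."""
    return ("enc", public_key_id, data)

# What the attacker observes going into Mix_i and coming out of it:
observed_input = enc("K_i", ("A_i+1", "M"))
observed_output = ("A_i+1", "M")

# Without a random number r_i the attacker can re-encrypt the observed output
# under the mix's *public* key K_i and compare it with the observed input:
assert enc("K_i", observed_output) == observed_input   # input and output are linked
```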

  30. Sender Anonymity with Mix-nets • Sender (Alice) chooses Mix-Sequence Mix1, ….., Mixn, Mixn+1 • Mixn+1 = recipient • Ai (i =1..n+1): address of Mixi • Ki (i=1..n+1): public key of Mixi • zi: random bit strings • M: message for recipient • Mi: message that Mixi will receive • Sender prepares her message: • Mn+1 = EKn+1 (M) • Mi = EKi (zi, Ai+1, Mi+1) for i=1…n • and sends M1 to Mix1
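A sketch of this message preparation, assuming a toy enc() stand-in for public-key encryption; prepare() and the example addresses/keys are illustrative:

```python
import os

def enc(public_key_id, data):
    return ("enc", public_key_id, data)        # stand-in for E_K(...)

def prepare(message, addresses, public_keys):
    """addresses = [A_1, ..., A_n, A_n+1], public_keys = [K_1, ..., K_n, K_n+1];
    the last entry in each list belongs to the recipient. Returns M_1."""
    n = len(addresses) - 1
    m = enc(public_keys[n], message)                       # M_n+1 = E_K_n+1(M)
    for i in range(n - 1, -1, -1):                         # i = n, ..., 1 (0-based)
        z = os.urandom(8)                                  # random bit string z_i
        m = enc(public_keys[i], (z, addresses[i + 1], m))  # M_i = E_K_i(z_i, A_i+1, M_i+1)
    return m

m1 = prepare("hello Bob", ["A1", "A2", "A3", "Bob"], ["K1", "K2", "K3", "K_Bob"])
```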

  31. Sender Anonymity with Mix-nets (cont.) • Alice sends M1 = EK1(z1, A2, M2) to Mix1; only the innermost layer EKn+1(M) reaches the recipient (Bob) • Each Mixi decrypts EKi(zi, Ai+1, Mi+1) and obtains: Ai+1: address of the next Mix; Mi+1 = EKi+1(zi+1, Ai+2, Mi+2): encoded message for Mixi+1; zi: random string, to be discarded • Mixi then forwards Mi+1 to Mixi+1

  32. Recipient Anonymity with Mix-nets • Recipient Bob chooses Mix-Sequence Mix1, ….., Mixm and creates anonymous return address RA: • Rm+1 = e • Rj = Ekj(cj, Aj+1, Rj+1) for j=1..m • RA = (c0, A1, R1) • e: label of the return address • cj: symmetric key, used by Mixj to encode the message on the return path • Aj (j=1..m): address of Mixj • kj (j=1..m): public key of Mixj • Bob sends RA anonymously to Sender Alice, e.g. as Ekm(zm, Am-1, Ekm-1(… EK1(z1, A0, RA) …)) with random bit strings zj
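A hedged sketch of the return-address construction above, again with a toy enc() in place of real public-key encryption; make_return_address and the example values are illustrative, and the last mix is assumed to deliver the reply to Bob's address:

```python
import os

def enc(public_key_id, data):
    return ("enc", public_key_id, data)        # stand-in for E_k(...)

def make_return_address(label_e, bob_address, mix_addresses, mix_public_keys):
    """mix_addresses = [A_1, ..., A_m], mix_public_keys = [k_1, ..., k_m].
    Returns (RA, [c_0, ..., c_m]); Bob stores the c_j under label e."""
    m = len(mix_addresses)
    c = [os.urandom(16) for _ in range(m + 1)]                    # symmetric keys c_0 .. c_m
    r = label_e                                                   # R_m+1 = e
    for j in range(m, 0, -1):                                     # j = m, ..., 1
        next_addr = mix_addresses[j] if j < m else bob_address    # A_j+1 (last hop: Bob)
        r = enc(mix_public_keys[j - 1], (c[j], next_addr, r))     # R_j = E_k_j(c_j, A_j+1, R_j+1)
    return (c[0], mix_addresses[0], r), c                         # RA = (c_0, A_1, R_1)

ra, reply_keys = make_return_address("label-e", "Bob", ["A1", "A2"], ["k1", "k2"])
```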

  33. Recipient anonymity with Mix-nets (cont.) • Sender Alice replies by sending c0(M), R1 to Mix1 • Each Mixj receives cj-1(… c0(M) …), Rj, decrypts Rj = Ekj(cj, Aj+1, Rj+1) -> (cj, Aj+1, Rj+1), and forwards cj(cj-1(… c0(M) …)), Rj+1 to Mixj+1 • Bob finally receives cm(cm-1(… c0(M) …)), e • The label e indicates to Bob which keys c0, ..., cm he has to use to decrypt M
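One mix's step on this reply path could look roughly as follows, with toy stand-ins for the public-key and symmetric operations; all names are illustrative:

```python
def pk_dec(private_key_id, blob):
    """Toy private-key decryption of R_j."""
    tag, key_id, payload = blob
    assert tag == "enc" and key_id == private_key_id
    return payload                                   # (c_j, A_j+1, R_j+1)

def sym_enc(sym_key, data):
    return ("sym", sym_key, data)                    # stand-in for c_j(...)

def mix_reply_step(private_key_id, reply_part, return_part):
    """Mix_j receives (c_j-1(...c_0(M)...), R_j): it decrypts R_j, adds one more
    symmetric layer with c_j, and returns what it forwards to A_j+1."""
    c_j, next_addr, r_next = pk_dec(private_key_id, return_part)
    return next_addr, sym_enc(c_j, reply_part), r_next
```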

  34. Two-Way Anonymous Conversation

  35. Existing Mix-based systems for HTTP (real-time) • Simple Proxies • Anonymizer.com • ProxyMate.com • Mix-based Systems considering traffic analysis: • Onion Routing (Naval Research Laboratory) • TOR (Free Haven project) • JAP (TU Dresden, ”Mix Cascade”)

  36. Onion Routing • Onion = Object with layers of public key encryption to produce an anonymous bi-directional virtual circuit between communication partners and to distribute symmetric keys • Initiator's proxy constructs “forward onion” which encapsulates a route to the responder • (Faster) symmetric encryption for data communication via the circuit (Figure: nested onion layers for a route from initiator U through nodes X, Y, Z)

  37. Forward Onion for route W-X-Y-Z: {exp-timeX, Y, FfX, KfX, FbX, KbX, {exp-timeY, Z, FfY, KfY, FbY, KbY, {exp-timeZ, NULL, FfZ, KfZ, FbZ, KbZ, PADDING}PKZ}PKY}PKX • Each node N receives (PKN = public key of node N): {exp-time, next-hop, Ff, Kf, Fb, Kb, payload}PKN • exp-time: expiration time • next-hop: next routing node • (Ff, Kf): function / key pair for symmetric encryption of data moving forward in the virtual circuit • (Fb, Kb): function / key pair for symmetric encryption of data moving backwards in the virtual circuit • payload: another onion (or NULL for the responder's proxy)
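A sketch of how the initiator's proxy W might assemble such an onion, with pk_enc() as a stand-in for encryption under PK_N and "Ff"/"Fb" as placeholder function identifiers; build_onion and the route list are illustrative:

```python
import os, time

def pk_enc(node_public_key, layer):
    return ("enc", node_public_key, layer)      # stand-in for encryption under PK_N

def build_onion(route, lifetime_seconds=3600):
    """route = [("X", "PK_X"), ("Y", "PK_Y"), ("Z", "PK_Z")]; W is the initiator's
    proxy. Returns (onion, per_hop_keys), where per_hop_keys holds the (Kf, Kb)
    pairs W keeps for the layered symmetric encryption over the circuit."""
    exp_time = int(time.time()) + lifetime_seconds
    onion, per_hop_keys = "PADDING", []
    for i in range(len(route) - 1, -1, -1):                          # innermost layer is Z's
        name, public_key = route[i]
        next_hop = route[i + 1][0] if i + 1 < len(route) else None   # NULL for the last node
        kf, kb = os.urandom(16), os.urandom(16)                      # forward / backward keys
        per_hop_keys.insert(0, (name, kf, kb))
        onion = pk_enc(public_key, (exp_time, next_hop, "Ff", kf, "Fb", kb, onion))
    return onion, per_hop_keys

onion, circuit_keys = build_onion([("X", "PK_X"), ("Y", "PK_Y"), ("Z", "PK_Z")])
```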

  38. Onion Routing - Building up a virtual circuit • Create command accompanied by the onion: • When a node receives an onion, it peels off one layer, keeps the forward/backward encryption keys, chooses a virtual circuit (vc) identifier and sends the create command + vc identifier + (rest of the) onion to the next hop • It stores the vc identifier it received and the one it sent out as a pair • Until the circuit is destroyed: whenever it receives data on one connection, it sends it off to the other • Forward encryption is applied to data moving in the forward direction, backward encryption is applied in the backward direction
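A rough sketch of how a node could process the create command under the same toy onion format as above; the node dictionary and handle_create are illustrative, not the actual Onion Routing implementation:

```python
import itertools

_vc_ids = itertools.count(1)

def handle_create(node, incoming_vc, onion):
    """Peel one onion layer, keep the forward/backward keys, choose an outgoing vc
    identifier, store the pair of vc identifiers, and pass the rest of the onion on."""
    tag, public_key, layer = onion
    assert tag == "enc" and public_key == node["public_key"]   # "decrypt" with own key
    exp_time, next_hop, ff, kf, fb, kb, inner_onion = layer    # exp_time: for replay detection
    outgoing_vc = next(_vc_ids)                      # choose a vc identifier
    node["keys"][incoming_vc] = (ff, kf, fb, kb)     # keep the symmetric encryption material
    node["vc_pairs"][incoming_vc] = (next_hop, outgoing_vc)    # store the vc identifier pair
    if next_hop is None:                             # this node is the last hop
        return None
    return ("create", outgoing_vc, inner_onion)      # create command + vc id + rest of onion

node_x = {"public_key": "PK_X", "keys": {}, "vc_pairs": {}}
# e.g. handle_create(node_x, incoming_vc=7, onion=onion) with the onion built above
```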

  39. Example: Virtual Circuit with Onion Routing • Data is sent using the send command • Data sent by the initiator is ”pre-encrypted” repeatedly by its proxy W • When W receives data sent back by the last node Z, it applies the inverse of the backward cryptographic operations (outermost first)
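The layered "pre-encryption" can be illustrated with XOR under a repeating key as a toy, self-inverse stand-in for the per-hop forward ciphers (Ff, Kf); all names are illustrative:

```python
def xor_crypt(key: bytes, data: bytes) -> bytes:
    """XOR with a repeating key: a toy, self-inverse stand-in for the Kf/Kb ciphers."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

forward_keys = {"X": b"kx", "Y": b"ky", "Z": b"kz"}   # per-hop forward keys held by W's proxy
data = b"GET /"

cell = data
for hop in ("Z", "Y", "X"):                 # W pre-encrypts repeatedly: innermost layer for Z
    cell = xor_crypt(forward_keys[hop], cell)

for hop in ("X", "Y", "Z"):                 # each hop strips exactly one layer in order
    cell = xor_crypt(forward_keys[hop], cell)

assert cell == data                         # Z recovers the data for the responder
```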

  40. Onion Routing - Review • Functionality: • Hiding of routing information in connection oriented communication relations • Nested public key encryption for building up virtual circuit • Expiration_time field reduces costs of replay detection • Dummy traffic between Mixes (Onion Routers) • Limitations: • First/Last-Hop Attacks by • Timing correlations • Message length (No. of cells sent over circuit)

  41. TOR (2nd Generation Onion Router)

  42. First Step • TOR client obtains a list of TOR nodes from a directory server • Directory servers maintain a list of which onion routers are up, their locations, current keys, exit policies, etc.

  43. TOR circuit setup • Client proxy establishes key + circuit with Onion Router 1

  44. TOR circuit setup • Client proxy establishes key + circuit with Onion Router 1 • Proxy tunnels through that circuit to extend to Onion Router 2

  45. TOR circuit setup • Client proxy establishes key + circuit with Onion Router 1 • Proxy tunnels through that circuit to extend to Onion Router 2 • Etc.

  46. TOR circuit setup • Client proxy establishes key + circuit with Onion Router 1 • Proxy tunnels through that circuit to extend to Onion Router 2 • Etc. • Client applications connect and communicate over the TOR circuit
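A purely conceptual sketch of this telescoping setup; negotiate_key and the enc[...] notation are illustrative and do not correspond to the real TOR cell types or handshake:

```python
def negotiate_key(router_name, handshake_message):
    """Toy stand-in for the key exchange with one onion router."""
    return f"K({router_name})"

def build_circuit(route):
    """route = ["OR1", "OR2", "OR3"]; returns one session key per hop."""
    keys = []                                   # keys already shared with earlier hops
    for router in route:
        message = "client handshake"
        for key in reversed(keys):              # tunnel the handshake through the
            message = f"enc[{key}]({message})"  # partial circuit that already exists
        keys.append(negotiate_key(router, message))
    return keys

circuit_keys = build_circuit(["OR1", "OR2", "OR3"])   # ['K(OR1)', 'K(OR2)', 'K(OR3)']
```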

