Privacy-Enhancing Technologies (PETs)

Presentation Transcript


  1. Privacy-Enhancing Technologies (PETs) Simone Fischer-Hübner Note: The “OPTIONAL” tags (for the CS 6910 students) and page numbers were added by L. Lilien

  2. Overview • Introduction to PETs • Anonymous Communication Technologies • Anonymous eCash • P3P (Platform for Privacy Preferences) • Privacy-enhanced Identity Management

  3. I. Introduction to PETs: Need for Privacy-Enhancing Technologies • Law alone is not sufficient for protecting privacy in our network society • PETs are needed to implement the law • PETs empower users to exercise their rights

  4. Classifications of PETs
(1) PETs for minimizing/avoiding personal data (-> Art. 6 I c., e. EU Directive 95/46/EC), providing anonymity, pseudonymity, unobservability, unlinkability:
• At communication level: Mix nets, Onion Routing, DC nets, Crowds
• At application level: Anonymous Ecash, Anonymous Credentials
(2) PETs for the safeguarding of lawful processing (-> Art. 17 EU Directive 95/46/EC): P3P, privacy policy languages, encryption
(3) Combination of (1) & (2): Privacy-enhanced Identity Management

  5. Definitions - Anonymity • Anonymity: The state of being not identifiable within a set of subjects (e.g. set of senders or recipients), the anonymity set Source: Pfitzmann/Hansen

  6. Perfect sender/receiver anonymity • Perfect sender (receiver) anonymity: An attacker cannot distinguish the situations in which a potential sender (receiver) actually sent (received) a message or not.

  7. Definitions - Unobservability • Unobservability ensures that a user may use a resource or service without others being able to observe that the resource or service is being used Source: Pfitzmann/Hansen

  8. Definitions - Unlinkability • Unlinkability of two or more items (e.g., subjects, messages, events): • Within the system, from the attacker’s perspective, these items are no more or less related after the attacker’s observation than they were before • Unlinkability of sender and recipient (relationship anonymity): • It is untraceable who is communicating with whom

  9. Definitions - Pseudonymity • Pseudonymity is the use of pseudonyms as IDs • Pseudonymity makes it possible to provide both privacy protection and accountability • Pseudonym types, ordered from highest to lowest linkability: person pseudonym, role pseudonym, relationship pseudonym, role-relationship pseudonym, transaction pseudonym. Source: Pfitzmann/Hansen

  10. Definitions - Pseudonymity (cont.) Source: Pfitzmann/Hansen

  11. II. Anonymous Communication Technologies: Mix-nets (Chaum, 1981)
Route Alice -> Mix1 -> Mix2 -> Mix3 -> Bob, where Ki is the public key of Mixi, ri a random number, and Ai the address of Mixi:
• Alice -> Mix1: K1(A2, r1, K2(A3, r2, K3(Bob, r3, msg)))
• Mix1 -> Mix2: K2(A3, r2, K3(Bob, r3, msg))
• Mix2 -> Mix3: K3(Bob, r3, msg)
• Mix3 -> Bob: msg
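The nested construction above can be sketched in a few lines of Python. This is a toy illustration, not Chaum's actual cryptography: pk_encrypt/pk_decrypt are stand-ins for real public-key operations, and all names are invented for the example.

```python
import os

# Toy stand-in for public-key encryption; a real mix-net would use RSA/ElGamal.
def pk_encrypt(key, plaintext):
    return ("enc", key, plaintext)

def pk_decrypt(key, ciphertext):
    tag, k, plaintext = ciphertext
    assert tag == "enc" and k == key, "wrong key"
    return plaintext

def build_mix_message(msg, recipient, mixes):
    """Wrap msg in one encryption layer per mix, innermost layer first.
    mixes: list of (address Ai, public key Ki) in routing order Mix1..Mixn."""
    next_hop, onion = recipient, msg
    for address, key in reversed(mixes):
        ri = os.urandom(8)                      # fresh random number ri (see slide 13)
        onion = pk_encrypt(key, (next_hop, ri, onion))
        next_hop = address
    return next_hop, onion                      # hand the onion to the first mix

mixes = [("A1", "K1"), ("A2", "K2"), ("A3", "K3")]
hop, layer = build_mix_message("msg", "Bob", mixes)
for address, key in mixes:                      # each mix peels exactly one layer
    assert hop == address                       # message arrived at this mix
    hop, _ri, layer = pk_decrypt(key, layer)
print(hop, layer)                               # -> Bob msg
```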

  12. OPTIONAL Functionality of a Mix Server (Mixi)
• Collect incoming messages Mi in a batch or pool (sufficient messages from many senders?)
• Discard repeated messages (checked against a message DB)
• Change outlook*: strip one encryption layer
• Reorder
• Output message Mi+1 to Mixi+1
*) Mixi decrypts Mi = ci[Ai+1, ri, Mi+1] with its private key ci, ignores the random number ri, and obtains the address Ai+1 and the encrypted Mi+1 (a sketch of this processing loop follows below)
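A sketch of one flush of a threshold mix, under the same assumptions as the previous example (pk_decrypt stands in for decryption with the mix's private key ci; the batch size and function shape are illustrative):

```python
import random

def mix_round(incoming, private_key, seen, pk_decrypt, batch_size=100):
    """One flush of a threshold mix: dedupe, decrypt, reorder, forward."""
    if len(incoming) < batch_size:          # wait for messages from many senders
        return []
    batch = []
    for ciphertext in incoming:
        if ciphertext in seen:              # discard repeated messages (message DB)
            continue
        seen.add(ciphertext)
        # change outlook: strip one layer, ignore the random number ri
        next_address, _ri, inner = pk_decrypt(private_key, ciphertext)
        batch.append((next_address, inner))
    random.shuffle(batch)                   # reorder: arrival order leaks nothing
    return batch                            # each (Ai+1, Mi+1) is sent to Mixi+1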

  13. OPTIONAL Why are random numbers needed? If no random number ri is used, the mix's transformation is deterministic: an observer who sees the message M leaving Mixi for address Ai+1 can recompute the encryption ci(M, Ai+1) with Mixi's public key and compare it against the observed input, linking input and output (see the sketch below).
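A sketch of that linking attack with the same toy deterministic encryption as before (all names are illustrative):

```python
# Toy deterministic "public-key encryption" with no random ri mixed in.
def pk_encrypt(key, plaintext):
    return ("enc", key, plaintext)

observed_input = pk_encrypt("Ki", ("M", "A_next"))   # seen entering Mix_i
observed_output, next_addr = "M", "A_next"           # seen leaving Mix_i

# The attacker re-encrypts the output under Mix_i's PUBLIC key and compares:
guess = pk_encrypt("Ki", (observed_output, next_addr))
assert guess == observed_input   # match -> input and output are linked;
                                 # a fresh random ri would break this equality
```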

  14. OPTIONAL Sender Anonymity with Mix nets

  15. OPTIONAL Sender Anonymity with Mix-nets (cont.)

  16. OPTIONAL Recipient Anonymity with Mix-nets

  17. OPTIONAL Recipient anonymity with Mix-nets (cont.)

  18. Two-Way Anonymous Conversation

  19. Protection properties & Attacker Model for Mix nets • Protection properties: sender anonymity against recipients; unlinkability of sender and recipient • Attacker may: observe all communication lines, send own messages, delay messages, operate Mix servers (all but one...) • Attacker cannot: break cryptographic operations, attack the user's personal machine

  20. Attacks & Countermeasures: Passive attacks
• Correlation by content -> all messages to/from a Mix should be encrypted and have to include a random string
• Correlation by message length -> uniform message length (through padding)
• Time correlation -> output batch (accumulate N messages, forward them in random order); pool (when the (N+1)th message arrives, forward one message from the pool; see the sketch below); combination of batch + pool; interval batching (fill the batch/pool with dummy messages at the end of each time interval T); random delay
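The pool rule above can be sketched as follows (the pool size N and the class shape are illustrative assumptions):

```python
import random

class PoolMix:
    """Keeps a pool of N messages; each new arrival triggers one random output."""
    def __init__(self, n):
        self.n = n
        self.pool = []

    def receive(self, message):
        self.pool.append(message)
        if len(self.pool) > self.n:            # the (N+1)th message arrived
            victim = random.randrange(len(self.pool))
            return self.pool.pop(victim)       # forward one message at random
        return None                            # still filling the pool
```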

  21. OPTIONAL Attacks & Countermeasures: Active attacks
• Isolate & identify ((n-1)-attack) -> dummy messages; check sender IDs
• Message replay attacks -> discard replays; intermix detours; charge Ecash
• Intersection/partitioning attacks -> Mix cascades (always use the same sequence of Mixes)

  22. Mix Applications: Anonymous remailers • Sender anonymity against recipients • Servers that strip identifying information from emails and forward them to the receiver • Simple remailers (one "Mix") are single points of trust with no protection against time/content correlation attacks • Some use encryption, can be chained, and work like mixes (e.g., Mixmaster) Sender -> Remailer -> Recipient

  23. OPTIONAL Existing Mix-based systems for HTTP (real-time) • Simple proxies: Anonymizer.com, ProxyMate.com • Mix-based systems considering traffic analysis: Onion Routing (Naval Research Laboratory), Tor (Free Haven project), JAP (TU Dresden)

  24. OPTIONAL Anonymising Proxies - Anonymizer.com Functionality: • Web proxy (a single "Mix") that forwards requests on the user's behalf • Does not forward the end user's IP address • Eliminates information about the user's machine (e.g., previously visited sites) • Filters out cookies, JavaScript, active content Limitations: • Single point of trust • The connection itself is not anonymised

  25. Onion Routing • Onion = object with layers of public-key encryption, used to build an anonymous bi-directional virtual circuit between communication partners and to distribute symmetric keys • The initiator's proxy constructs a "forward onion" which encapsulates a route to the responder • (Faster) symmetric encryption is used for data communication via the circuit [Diagram: the onion shrinks along route U -> X -> Y -> Z as each router peels off its layer]

  26. OPTIONAL (see 6030 slides) Forward onion for route W-X-Y-Z. Each node N receives {exp-time, next-hop, Ff, Kf, Fb, Kb, payload} encrypted under PKN (the public key of node N):
X: {exp-timex, Y, Ffx, Kfx, Fbx, Kbx, payload}
Y: {exp-timey, Z, Ffy, Kfy, Fby, Kby, payload}
Z: {exp-timez, NULL, Ffz, Kfz, Fbz, Kbz, PADDING}
• exp-time: expiration time
• next-hop: next routing node
• (Ff, Kf): function/key pair for symmetric encryption of data moving forward in the virtual circuit
• (Fb, Kb): function/key pair for symmetric encryption of data moving backward in the virtual circuit
• payload: another onion (or NULL for the responder's proxy)
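A structural sketch of such an onion for an arbitrary route. The per-layer public-key encryption and the function identifiers Ff/Fb are elided; field names follow the slide, everything else is an illustrative assumption:

```python
from dataclasses import dataclass
from typing import Optional
import os, time

@dataclass
class OnionLayer:
    exp_time: float                   # expiration time (bounds replay-detection state)
    next_hop: Optional[str]           # next routing node; None at the responder's proxy
    kf: bytes                         # key for data moving forward (Kf; Ff elided)
    kb: bytes                         # key for data moving backward (Kb; Fb elided)
    payload: Optional["OnionLayer"]   # inner onion; really encrypted for next_hop's PK

def build_forward_onion(route):
    """route: node names in order, e.g. ["X", "Y", "Z"] for route W-X-Y-Z."""
    onion = None
    next_hop = None                          # innermost layer has next-hop NULL
    for node in reversed(route):
        onion = OnionLayer(
            exp_time=time.time() + 3600,
            next_hop=next_hop,
            kf=os.urandom(16),
            kb=os.urandom(16),
            payload=onion,
        )
        next_hop = node                      # this node is the next hop one layer out
    return onion                             # outermost layer is handed to X

onion = build_forward_onion(["X", "Y", "Z"])
print(onion.next_hop, onion.payload.next_hop)   # -> Y Z
```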

  27. OPTIONAL Example: Virtual Circuit with Onion Routing

  28. OPTIONAL Onion Routing - Review • Functionality: • Hiding of routing information in connection-oriented communication relations • Nested public-key encryption for building up the virtual circuit • Expiration-time field reduces the cost of replay detection • Dummy traffic between Mixes (Onion Routers) • Limitations: • First/last-hop attacks by: • Timing correlations • Message length (number of cells sent over a circuit)

  29. Crowds for anonymous Web transactions • A user first joins a "crowd" of other users, where he is represented by a "jondo" process on his local machine • The user configures his browser to employ the local jondo as a proxy for all new services • The user's request is passed by the jondo to a random member of the crowd • That member either submits the request directly to the web server or forwards it to another randomly chosen member (forwarding with probability pf > 1/2) -> the request is eventually submitted by a random member (see the sketch below) [jondo, derived from "John Doe", an epitome for an anonymous person]
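A simulation sketch of that forwarding rule (pf = 0.75 and the crowd size are illustrative choices):

```python
import random

PF = 0.75   # forwarding probability pf; Crowds requires pf > 1/2

def route_request(crowd_size):
    """Simulate one Crowds path: the initiator's jondo hands the request to a
    random member; each member forwards with probability pf, else submits."""
    path = [random.randrange(crowd_size)]          # first random member
    while random.random() < PF:
        path.append(random.randrange(crowd_size))  # forward to a random jondo
    return path                                    # last jondo talks to the server

print(route_request(10))   # e.g. [3, 7, 7, 1]: jondo 1 submits the request
```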

  30. OPTIONAL Communications with Crowds [Diagram: requests hop between numbered jondos before reaching the web servers] Communication between jondos is encrypted with keys shared between jondos

  31. Anonymity degrees in Crowds

  32. OPTIONAL Anonymity Properties in Crowds

  33. Crowds - Review • Sender anonymity against: end web servers, other crowd members, eavesdroppers • Limitations: • No protection against "global" attackers or timing/message-length correlation attacks • The web server's log may record the submitting jondo's IP address as the request originator's address • Request contents are exposed to the jondos on the path • The anonymising service can be circumvented by Java applets and ActiveX controls • Performance overhead (increased retrieval time, network traffic, and load on jondo machines) • No defense against DoS attacks by malicious crowd members

  34. OPTIONAL DC (Dining Cryptographers) nets [Chaum 1988]

  35. OPTIONAL DC-nets: Perfect sender anonymity through Binary superposed sending and broadcast

  36. OPTIONAL Anonymity preserving multi-access protocols

  37. OPTIONAL Anonymity preserving multi-access protocols (cont.)

  38. OPTIONAL Implementation-Example: Local-Area Ring Networks

  39. OPTIONAL DC nets - Review • Protection properties: • Perfect sender anonymity through superposed sending (message bits are hidden by one-time pad encryption) • Message secrecy through encryption • Recipient anonymity through broadcast and implicit addresses (addressee is user who can successfully decrypt message) • Problems: • Denial of Service attacks by DC-net participants (Defense: trap protocols) • Random key string distribution
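The superposed-sending idea can be shown in a few lines: a toy three-party round over 8-bit messages, with the pairwise one-time keys and the broadcast simulated locally (all values illustrative).

```python
import secrets

# Pairwise one-time keys: each pair of participants shares a random bit string.
k_ab = secrets.randbits(8)
k_bc = secrets.randbits(8)
k_ca = secrets.randbits(8)

message = 0b10110010          # Alice is the (anonymous) sender this round

# Each participant broadcasts the XOR of its shared keys;
# the sender additionally XORs in the message bits.
out_a = k_ab ^ k_ca ^ message
out_b = k_ab ^ k_bc
out_c = k_bc ^ k_ca

# Every key appears exactly twice, so XOR-ing all broadcasts cancels the keys
# and reveals only the message, not who sent it.
assert out_a ^ out_b ^ out_c == message
```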

  40. III. Anonymous Ecash based on Blind Signatures: Protocol Overview

  41. Protocol steps for creating and spending untraceable Ecash
Customer (Alice): • generates a note number (a 100-digit number) at random • in essence multiplies it by a blinding (random) factor • signs the blinded number with a private key and sends it to the bank
Bank: • verifies and removes Alice's signature • debits Alice's account (by $1) • signs the blinded note with a digital signature indicating its $1 value and sends it to Alice
Customer (Alice): • divides out the blinding factor • uses the bank note (transfers it to the shop)
Merchant (Bob): • verifies the bank's digital signature • transmits the note to the bank
Bank: • verifies its signature • checks the note against a list of those already spent • credits Bob's account • sends a signed "deposit slip" to Bob
Merchant (Bob): • hands the merchandise to Alice together with his own signed receipt

  42. OPTIONAL Mathematical protocol for issuing and spending untraceable money [Chaum 1987]
(e, n): bank's public key; (d, n): bank's private key
1. Alice chooses x and r at random and supplies the bank with B = r^e * f(x) (mod n), where x is the serial number of the bank note, r the blinding factor, and f a one-way function
2. The bank returns B^d (mod n) = (r^e * f(x))^d (mod n) = r * f(x)^d (mod n) and withdraws one dollar from her account
3. Alice extracts C = B^d / r (mod n) = f(x)^d (mod n) from B
4. To pay Bob one dollar, Alice gives him the pair (x, f(x)^d (mod n))
5. Bob immediately calls the bank, verifying that this note has not already been deposited
The bank and the shop do not know the blinding factor, i.e., they cannot relate the bank note to Alice -> Alice can shop anonymously (a toy numeric run follows below)
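A toy numeric run of steps 1-3 and the verification. The deliberately tiny RSA parameters and the stand-in f are illustrative only; real Ecash needs full-size RSA and a cryptographic one-way function.

```python
# Toy RSA parameters (illustrative only; far too small to be secure).
p, q = 61, 53
n = p * q                # 3233
e, d = 17, 2753          # e*d = 1 mod (p-1)(q-1)

def f(x):                # stand-in for the one-way function on serial numbers
    return pow(x, 3, n)  # assumption: any collision-resistant f would do

# 1. Alice blinds the note's serial number x with random factor r
x, r = 1234, 71
B = (pow(r, e, n) * f(x)) % n          # B = r^e * f(x) mod n, sent to the bank

# 2. The bank signs the blinded note (and debits Alice's account)
signed_blind = pow(B, d, n)            # B^d = r * f(x)^d mod n

# 3. Alice divides out the blinding factor
r_inv = pow(r, -1, n)                  # modular inverse of r (Python 3.8+)
C = (signed_blind * r_inv) % n         # C = f(x)^d mod n: the bank's signature

# 4./5. Bob (and the bank) can verify the signature without learning r
assert pow(C, e, n) == f(x)            # C^e = f(x) mod n
```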

  43. OPTIONAL Why is the one-way function f needed? • Suppose (x, x^d mod n) were electronic money • Such money could be forged: choose y, exhibit (y^e mod n, y) • To forge money of the form (x, f(x)^d mod n), one would have to produce (f^(-1)(y^e) mod n, y)

  44. OPTIONAL Blind Signatures and the Perfect Crime (e.g., blackmail) [von Solms et al. 1992]: • Open a bank account, create blinded notes, send mail with a threat announcement and the blinded notes • Let the bank first sign the blinded notes and then publish them (e.g., in a newspaper) • Divide out the blinding factors to create digital money (only the blackmailer knows the blinding factors) • Note: conditions are worse than in usual kidnapping cases: • Police cannot register the serial numbers of the bank notes • No physical contact is needed (to transfer the blackmailed money)

  45. IV. Platform for Privacy Preferences Project (P3P) - Overview
• Developed by the World Wide Web Consortium (W3C); final P3P 1.0 Recommendation issued 16 April 2002
• Allows web sites to communicate about their privacy policies in a standard computer-readable format; does not require web sites to change their server software
• Enables the development of tools (built into browsers or separate applications) that summarize privacy policies, compare privacy policies with user preferences, and alert and advise users
• P3P helps users understand privacy policies; it increases transparency, but it does not set baseline standards or enforce policies
• P3P user agent software: Microsoft Internet Explorer 6, Netscape Navigator 7, AT&T Privacy Bird (http://privacybird.com/)
• For more information: http://www.w3.org/P3P/, http://p3ptoolbox.org/, "Web Privacy with P3P" by Lorrie Faith Cranor (http://p3pbook.com/)
Source: Lorrie Cranor, lorrie.cranor.org

  46. Basic components • P3P provides a standard XML format in which web sites encode their privacy policies • Sites also provide XML "policy reference files" to indicate which policy applies to which part of the site (usually at the "well-known location" /w3c/p3p.xml) • Sites can optionally provide a "compact policy" by configuring their servers to issue a special P3P header when cookies are set • "P3P user agents" fetch and read P3P policies and can inform users about the site's P3P privacy practices, and/or compare P3P policies with privacy preferences (in XML) set by users and take appropriate actions Source: Lorrie Cranor, lorrie.cranor.org

  47. A simple HTTP transaction
Browser -> Web Server: GET /index.html HTTP/1.1; Host: www.att.com; ... (request web page)
Web Server -> Browser: HTTP/1.1 200 OK; Content-Type: text/html; ... (send web page)
Source: Lorrie Cranor, lorrie.cranor.org

  48. ... with P3P 1.0 added
Browser -> Web Server: GET /w3c/p3p.xml HTTP/1.1; Host: www.att.com (request policy reference file)
Web Server -> Browser: send policy reference file
Browser -> Web Server: request P3P policy
Web Server -> Browser: send P3P policy
Browser -> Web Server: GET /index.html HTTP/1.1; Host: www.att.com; ... (request web page)
Web Server -> Browser: HTTP/1.1 200 OK; Content-Type: text/html; ... (send web page)
Source: Lorrie Cranor, lorrie.cranor.org
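The user-agent side of this exchange might look like the following sketch. The host is taken from the slide's example; XML parsing and preference matching are elided, and error handling is minimal.

```python
import urllib.request
from urllib.parse import urljoin

def fetch_p3p_reference(site):
    """Fetch the policy reference file from the well-known location."""
    ref_url = urljoin(site, "/w3c/p3p.xml")          # well-known location
    with urllib.request.urlopen(ref_url) as resp:
        return resp.read()                           # XML policy reference file

# A real user agent would parse the reference file, find the <POLICY-REF>
# matching the requested page, fetch that P3P policy, and compare it with the
# user's preferences before requesting the page itself.
# fetch_p3p_reference("http://www.att.com/")   # as in the slide's example
```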

  49. P3P increases transparency • P3P clients can check a privacy policy each time it changes • P3P clients can check privacy policies on all objects in a web page, including ads and invisible images (e.g., http://www.att.com/accessatt/ embedding the ad http://adforce.imgis.com/?adlink|2|68523|1|146|ADFORCE) Source: Lorrie Cranor, lorrie.cranor.org

  50. P3P in [MS] IE6 • Automatic processing of compact policies only; third-party cookies without compact policies are blocked by default • A privacy icon on the status bar indicates that a cookie has been blocked; a pop-up appears the first time the privacy icon appears Source: Lorrie Cranor, lorrie.cranor.org
