Anonymity - Background

  • Fall 2009

  • TUR 2303

  • M 5 (11:45-12:35), R 5-6 (11:45-1:40)

  • Prof. Newman, instructor

  • CSE E346, 352-392-1488

  • Office Hours (tentative): MTW 2-3pm

  • [email protected] - subject: Anon ...


Outline

  • Course Outline – what is this subject?

  • Projects and papers

  • Policies


Exercise

  • Take 2 minutes to think about anonymity.

  • Answer these questions in writing:

    • What is anonymity?

    • How is it related to privacy?

    • Give examples of need for anonymity (aiming at volume here)

  • Get into groups of 2-3 and share your answers

    • Try to arrive at a joint definition or agree to disagree

    • Add to your list of examples

  • Share your responses with the class


What is Anonymity

  • Literally, lacking a name (Greek an- + onyma)

  • Unidentifiability

  • Inability to attribute artifact or actions

  • Related to privacy - how?


What is Privacy?

  • Ability of an entity to control its own space

    • Physical space

    • Bodily space

    • Data space

    • Communication space

    • What else?


Exercise

  • What are examples of privacy in these spaces?

    • Physical space

    • Bodily space

    • Data space

    • Communication space

  • What other spaces can you think of?


Privacy Spaces

  • Physical space: invasion, paparazzi, location

  • Bodily space: medical consent, battery

  • Data space: identity, activity, status, records

  • Communication space: email, Internet privacy, correspondents, phone #, address

  • Overlap in spaces (e.g., location)


Need for Privacy/Anonymity

  • Planning/execution in competition

  • Fundamental right – voting, celebrities

  • Philosophical necessity (free will)

  • Restarting when past can cripple

  • Statutory requirements

  • Liability issues – data release

  • Freedom/survival in repressive environments

  • Increasing pressure from technologies


Privacy/Anonymity Threats

  • Available surveillance technology

  • Identification technology

  • Increasing use of databases

  • Data mining

  • Identity theft

  • Increasing requirements for identification and authentication (I&A)

  • Increasing governmental desire for surveillance


Surveillance

  • 1.5 million CCTV cameras installed in the UK post-9/11 – a Londoner is on camera ~300 times a day (http://epic.org/privacy/surveillance/)

  • Face recognition software used in Tampa for the Super Bowl

  • 5000 public surveillance cameras known in DC

  • Home and work ZIP codes give identity in 5% of cases in the US (http://33bits.org/tag/anonymity/)


Data Reidentification

  • Even "scrubbed" data can be re-identified

  • Characteristics within the data (e.g., word usage in documents)

  • Intersection attacks on k-anonymized database set releases

  • Use of known outside data in combination with released data

  • Data mining – higher dimensional space gives greater specificity!
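
The linkage idea in the bullets above can be sketched concretely: a "scrubbed" release still carries quasi-identifiers that join cleanly against outside data. All tables, names, and the quasi-identifier triple below are invented for illustration:

```python
# Toy linkage attack: a "scrubbed" medical release keeps quasi-identifiers
# (ZIP, birth year, sex) that can be joined against a public roster to
# re-identify rows. All data here is invented.

scrubbed_release = [
    {"zip": "32601", "birth_year": 1971, "sex": "F", "diagnosis": "asthma"},
    {"zip": "32603", "birth_year": 1980, "sex": "M", "diagnosis": "flu"},
]

public_roster = [
    {"name": "Alice", "zip": "32601", "birth_year": 1971, "sex": "F"},
    {"name": "Bob",   "zip": "32608", "birth_year": 1975, "sex": "M"},
]

def reidentify(release, roster):
    """Join the two tables on the quasi-identifier triple."""
    hits = []
    for rec in release:
        matches = [p for p in roster
                   if (p["zip"], p["birth_year"], p["sex"]) ==
                      (rec["zip"], rec["birth_year"], rec["sex"])]
        if len(matches) == 1:          # unique match => re-identified
            hits.append((matches[0]["name"], rec["diagnosis"]))
    return hits

print(reidentify(scrubbed_release, public_roster))  # [('Alice', 'asthma')]
```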


Limitations on Anonymity

  • Accountability

  • Legal/criminal issues

  • Social expectations

  • Competing need for trust

  • Others?


Forms of Anonymity

  • Traffic Analysis Prevention

  • Sender, Recipient, Message Anonymity

  • Voter Anonymity

  • Pseudonymity

  • Revocable anonymity

  • Data anonymity


Anonymity Mechanisms

  • Cryptography

  • Steganography

  • Traffic Analysis Prevention (TAP)

  • Mixes, crowds

  • Data sanitization/scrubbing

  • k-anonymity
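
A minimal sketch of the k-anonymity idea from the list above, assuming a single generalization rule (truncating ZIP digits) and invented toy data; real schemes generalize over hierarchies of several attributes:

```python
# Generalize the ZIP quasi-identifier until every (zip, sex) equivalence
# class contains at least k records. Toy data and a single generalization
# rule -- illustration only, not a production anonymizer.
from collections import Counter

rows = [
    {"zip": "32601", "sex": "F"}, {"zip": "32603", "sex": "F"},
    {"zip": "32608", "sex": "M"}, {"zip": "32609", "sex": "M"},
]

def generalize(rows, k):
    digits = 5
    while digits >= 0:
        classes = Counter((r["zip"][:digits], r["sex"]) for r in rows)
        if all(count >= k for count in classes.values()):
            return [{"zip": r["zip"][:digits] + "*" * (5 - digits),
                     "sex": r["sex"]} for r in rows]
        digits -= 1                    # coarsen until every class has >= k
    return None

print(generalize(rows, 2))             # ZIPs generalized to "3260*"
```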


Adversaries

  • Global vs. Restricted

    • All links vs. some links

    • All network nodes vs. some or no nodes

  • Passive vs. Active

    • Passive – listen only

    • Active – remove, modify, replay, or inject new messages

  • Cryptography Assumptions

    • All unencrypted contents are observable

    • All encrypted contents are not, without key


Public Key Cryptography

  • Two keys, K and K-1, associated with entity A

  • K is public key, K-1 is private key

  • Keys are inverses: {{M}K}K-1 = {{M}K-1}K = M

  • For message M, ciphertext C = {M}K

    • Anyone can send A ciphertext using K

    • Only A has K-1 so only A can decrypt C

  • For message M, signature S = {M}K-1

    • Anyone can verify M,S using K

    • Only A can sign with K-1
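
The slide's notation can be made concrete with textbook RSA. The tiny primes below are insecure and purely illustrative:

```python
# Toy RSA illustrating the slide's notation: C = {M}K decrypted with K-1,
# and S = {M}K-1 verified with K. Textbook RSA with tiny primes --
# insecure, for illustration only.

p, q = 61, 53
n = p * q                      # modulus, part of both keys
phi = (p - 1) * (q - 1)
e = 17                         # public exponent: K = (e, n)
d = pow(e, -1, phi)            # private exponent: K-1 = (d, n); Python 3.8+

def encrypt(m, key):           # {M}K (or {M}K-1 when key = d)
    return pow(m, key, n)

M = 42
C = encrypt(M, e)              # anyone can produce C with the public key
assert encrypt(C, d) == M      # only the holder of d recovers M

S = encrypt(M, d)              # signature: only the holder of d can make it
assert encrypt(S, e) == M      # anyone can verify with the public key
```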


Details we omit

  • Limit on size of M, based on size of K

  • Need to format M to avoid attacks on PKC

  • Use a confounder to foil guessed-plaintext attacks

  • Typical use of one-way hash H to distill large M to reasonable size for signing

  • Typical use of PKC to distribute symmetric key for actual encryption/decryption of larger messages

  • See http://www.rsa.com/rsalabs/ for standards
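
The hash-then-sign point above can be sketched with the same kind of toy RSA parameters (again insecure, for illustration only):

```python
# Hash-then-sign: distill a large M to a fixed-size digest with a one-way
# hash H, and sign the digest instead of M. Tiny RSA modulus -- insecure,
# illustration only.
import hashlib

p, q = 61, 53
n, phi = p * q, (p - 1) * (q - 1)
e = 17
d = pow(e, -1, phi)            # Python 3.8+ modular inverse

def H(msg: bytes) -> int:
    # truncate the SHA-256 digest so it fits below the tiny modulus n
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

M = b"a message far larger than the RSA modulus"
S = pow(H(M), d, n)            # sign the digest, not M itself
assert pow(S, e, n) == H(M)    # verifier recomputes H(M) and checks
```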


Chaum – Untraceable Mail

  • Wish to receive email anonymously, but

    • Be able to link new messages with past ones

    • Respond to the sender

  • Do not trust a single authority (e.g., PayPal)

  • Underlying message delivery system is untrusted

    • Global active adversary


Chaum Mix 1

  • Mix is like a special type of router/gateway

  • It has its own public key pair, K1 and K1-1

  • Recipient A also has public key pair, Ka and Ka-1

  • Sender B prepends random confounder Ra to message M, encrypts for A: Ca = {Ra|M}Ka

  • B then prepends confounder for mix to C and encrypts for mix: C1 = {R1|A|Ca}K1

  • B sends C1 to the mix, which later sends Ca to A
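
The layering on this slide can be sketched with a toy `seal` function standing in for public-key encryption; it merely tags data with a key name and does no real cryptography:

```python
# Structure of Chaum's single-mix message. seal/unseal are stand-ins for
# encryption/decryption -- no real cryptography. Key names Ka/K1 follow
# the slide.
import os

def seal(key, *fields):          # stand-in for {field|...}key
    return {"key": key, "fields": fields}

def unseal(key, box):            # stand-in for decryption with key's inverse
    assert box["key"] == key
    return box["fields"]

M = "hello A"
Ra = os.urandom(4)               # confounder for A
R1 = os.urandom(4)               # confounder for the mix
Ca = seal("Ka", Ra, M)           # Ca = {Ra|M}Ka
C1 = seal("K1", R1, "A", Ca)     # C1 = {R1|A|Ca}K1

# The mix strips its layer and learns only the next hop, not M:
_, addr, inner = unseal("K1", C1)
assert addr == "A" and unseal("Ka", inner)[1] == M
```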


Chaum Mix 2

  • Mix simply decrypts and strips confounder from message to A

  • Incoming message and outgoing message do not appear related

  • Use padding to ensure same length (some technical details here)

  • Gather a batch of messages from different sources before sending them out in permuted order
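
A mix round along these lines might look like the following sketch, where `decrypt_layer` is a toy stand-in for stripping the mix's encryption layer and the batch size is an invented parameter:

```python
# A mix round: collect a batch of uniform-length messages, strip the mix
# layer, and flush them in random order so input and output positions
# cannot be correlated. Toy sketch only.
import random

BATCH, LENGTH = 4, 15

def decrypt_layer(c):            # stand-in for stripping {R1|A|...}K1
    return c[len("mix:"):]

def mix_round(incoming):
    assert all(len(c) == LENGTH for c in incoming)   # uniform length
    batch = [decrypt_layer(c) for c in incoming[:BATCH]]
    random.shuffle(batch)        # permute before flushing
    return batch

inputs = [f"mix:msg{i:08d}" for i in range(4)]
out = mix_round(inputs)          # same messages, unlinkable order
assert sorted(out) == sorted(f"msg{i:08d}" for i in range(4))
```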


Chaum Mix

  • As long as messages are not repeated, adversary can't link an incoming message with an outgoing one (anonymous within the batch)

    • Mix can discard duplicate messages

    • B can insert different confounder in repeats

    • B can use timestamps – repeats look different

  • Mix signs message batches, sends receipts to senders

    • This lets B prove that the mix received a message it failed to forward


Cascading Mixes 1

  • If one mix is good, lots of mixes are better!

  • B prepares M for A by selecting sequence of mixes, 1, 2, 3, … , n.

    • Message for A is prepared for Mix 1

    • Message for Mix 1 is prepared for Mix 2

    • … Message for Mix n-1 is prepared for Mix n

    • Layered message is sent to Mix n

  • Each mix removes its confounder, obtains the address of the next mix (or A), and forwards the message when its batch is sent out in permuted order
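
The wrap-then-peel process can be sketched with toy `seal`/`unseal` stand-ins for the encryptions (no real cryptography; the three-mix route is an invented example):

```python
# Building the layered message for a cascade: wrap for Mix 1 first, then
# Mix 2, ..., so Mix n (the entry point) peels first and each layer names
# the previous mix (or A) as the next hop.
import os

def seal(key, *fields):          # stand-in for {field|...}key
    return {"key": key, "fields": fields}

def unseal(key, box):
    assert box["key"] == key
    return box["fields"]

route = ["K1", "K2", "K3"]       # public keys of mixes 1..n

Ca = seal("Ka", os.urandom(4), "hello A")
onion, next_hop = Ca, "A"
for i, Ki in enumerate(route, start=1):   # wrap for Mix 1, then Mix 2, ...
    onion = seal(Ki, os.urandom(4), next_hop, onion)
    next_hop = f"Mix{i}"

# Delivery: Mix n peels first, forwarding toward Mix 1 and finally A.
for Ki in reversed(route):
    _, addr, onion = unseal(Ki, onion)
assert addr == "A" and unseal("Ka", onion)[1] == "hello A"
```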


Cascading Mixes 2

  • Mix in cascade that fails to forward a message can be detected as before (the preceding mix gets the signed receipt)

  • Any mix in cascade that is not compromised can provide unlinkability

  • This gets us anonymous message delivery, but does not allow return messages


Return Addresses 1

  • B generates a public key Kb for the message

  • B seals its true address and another key K using the mix's key K1: RetAddr = {K,B}K1, Kb

  • A sends reply M to mix along with return address: Reply = {K,B}K1, {R0|M}Kb

  • Mix decrypts the address and key, uses key K to re-encrypt the reply: {{R0|M}Kb}K, and sends it to B
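
The return-address flow can be sketched with toy `seal`/`unseal` stand-ins for the encryptions (no real cryptography; key names follow the slide):

```python
# Untraceable return address: B hands A the pair ({K,B}K1, Kb); A replies
# with ({K,B}K1, {R0|M}Kb); the mix unseals K and B's address, re-encrypts
# the reply under K, and forwards it to B.
import os

def seal(key, *fields):          # stand-in for {field|...}key
    return {"key": key, "fields": fields}

def unseal(key, box):
    assert box["key"] == key
    return box["fields"]

# B builds the return address with fresh per-message keys K and Kb
K, Kb = "K-fresh", "Kb-fresh"
ret_addr = (seal("K1", K, "B"), Kb)

# A replies using the return address
R0, M = os.urandom(4), "reply for B"
reply = (ret_addr[0], seal(ret_addr[1], R0, M))

# The mix processes the reply
k, b_addr = unseal("K1", reply[0])       # recover K and B's address
out = seal(k, reply[1])                  # {{R0|M}Kb}K, sent to b_addr

# B peels both layers: first K, then Kb
assert b_addr == "B" and unseal(Kb, unseal(k, out)[0])[1] == M
```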


Return Addresses 2

  • B must generate a new return address for each message (K and Kb) so there are no duplicates

  • Mix must remove duplicates if found

  • Symmetric cryptography may be used for both K and Kb here (but not for mix key!)

  • Can cascade return messages by building the return address in reverse order, then peeling off layers as the reply is forwarded (and encrypted) along the return path

