Security and Privacy in Cloud Computing

Ragib Hasan, Johns Hopkins University, en.600.412 Spring 2011

Lecture 8

04/04/2011


Enforcing Data Privacy in the Cloud

Goal: Examine techniques for ensuring data privacy in computations outsourced to a cloud

Review Assignment #7: (Due 4/11)

Roy et al., Airavat: Security and Privacy for MapReduce, NSDI 2010



Recap: Cloud Forensics (Bread & Butter paper from ASIACCS 2010)

Strengths?

Weaknesses?

Ideas?



What does privacy mean?

Information Privacy is the interest an individual has in controlling, or at least significantly influencing, the handling of data about themselves.

Confidentiality is the legal duty of individuals who come into the possession of information about others, especially in the course of particular kinds of relationships with them.



Problem of making large datasets public

Model:

  • One party owns the dataset

  • Another party wants to run some computations on it

  • A third party may take data from the first party, run functions (from the second party) on the data, and provide the results to the second party

Problem:

  • How can the data provider ensure the confidentiality and privacy of their sensitive data?



Problem of making large datasets public

  • Massachusetts Insurance Database

    • DB was anonymized, with only birthdate, sex, and zip code made available to the public

    • Latanya Sweeney of CMU combined the DB with voter records and pinpointed the MA Governor’s record

  • Netflix Prize Database

    • DB was anonymized, with user names replaced with random IDs

    • Narayanan et al. used the Netflix DB and IMDb data to de-anonymize users



Differential Privacy schemes can ensure privacy of statistical queries

Differential privacy aims to maximize the accuracy of queries on statistical databases while minimizing the chances of identifying individual records.

Informally, given the output of a computation or a query, an attacker cannot tell whether any particular value was in the input data set.
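Not on the slide, but for reference, the standard formal statement: a randomized mechanism M is ε-differentially private if, for every pair of datasets D and D′ that differ in a single record and every set S of possible outputs,

```latex
\Pr[M(D) \in S] \;\le\; e^{\varepsilon}\,\Pr[M(D') \in S]
```

A small ε means no single record can noticeably shift the output distribution, which is the formal counterpart of the informal statement above.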



Securing MapReduce for Privacy and Confidentiality

  • Paper:

    • Roy et al., Airavat: Security and Privacy for MapReduce

    • Goal: Secure MapReduce to provide confidentiality and privacy assurances for sensitive data



System Model

Data providers: own data sets

Computation provider: provides MapReduce code

Airavat Framework: Cloud provider where the MapReduce code is run on uploaded data



Threat Model

  • Assets: Sensitive data or outputs

  • Attacker model:

    • Cloud provider (where Airavat is run) is trustworthy

    • Computation provider (user who queries, provides Mapper and Reducer functions) can be malicious

      • Functions provided by the Computation provider can be malicious.

      • Cloud provider does not perform code analysis on user-generated functions

    • Data provider is trustworthy



MapReduce

MapReduce is a widely used and deployed distributed computation model

Input data is divided into chunks

Mapper nodes run a mapping function on a chunk and output a set of <key, value> pairs

Reducer nodes combine values related to a particular key based on a function, and output to a file
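To make the mapper/reducer contract above concrete, here is a minimal in-memory word-count sketch in Python; the function names and the toy shuffle step are illustrative, not Hadoop's actual API.

```python
from collections import defaultdict

def mapper(chunk):
    """Map step: emit a <key, value> pair for every word in one input chunk."""
    for word in chunk.split():
        yield word, 1

def reducer(key, values):
    """Reduce step: combine all values seen for a single key (here, sum the counts)."""
    return key, sum(values)

def run_mapreduce(chunks):
    # "Shuffle" phase: group mapper outputs by key (kept in memory for this toy).
    groups = defaultdict(list)
    for chunk in chunks:
        for key, value in mapper(chunk):
            groups[key].append(value)
    # Reduce phase: one reducer call per key; results would normally go to a file.
    return dict(reducer(k, vs) for k, vs in groups.items())

print(run_mapreduce(["to be or not to be", "to see or not to see"]))
# {'to': 4, 'be': 2, 'or': 2, 'not': 2, 'see': 2}
```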



Key design concepts

Goal: Ensure privacy of source data

Concept used: Differential privacy – ensure that no sensitive data is leaked.

Method used: Adds random Laplacian noise to outputs
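A minimal sketch of the Laplace mechanism this refers to, assuming a numeric query result, a known sensitivity, and a privacy parameter epsilon; it illustrates the general technique, not Airavat's actual code.

```python
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) noise as the difference of two exponential draws."""
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def private_release(true_value, sensitivity, epsilon):
    """Release a query result with Laplace noise of scale sensitivity / epsilon."""
    return true_value + laplace_noise(sensitivity / epsilon)

# Example: a count query (sensitivity 1) released under epsilon = 0.5.
print(private_release(true_value=1024, sensitivity=1.0, epsilon=0.5))
```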



Key design concepts

  • Goal: Prevent malicious users from crafting functions that leak sensitive data.

  • Concept used: Functional sensitivity – how much the output changes when a single element is included in or removed from the inputs

    • More sensitivity: more information is leaked

  • How is it used?

    • Airavat requires CPs to give the range of possible output values.

    • This is used to determine the sensitivity of CP-written mapper functions (see the sketch below).
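A simplified reading of how a declared output range can bound sensitivity, shown for a SUM-style reducer; the exact accounting Airavat performs may differ, so treat the helper names and formulas as illustrative.

```python
def sum_sensitivity(declared_min, declared_max):
    """Worst-case change in a SUM reducer's output when one record is added or
    removed, assuming each per-record contribution is clamped to the declared range."""
    return max(abs(declared_min), abs(declared_max))

def laplace_scale(declared_min, declared_max, epsilon):
    """Noise scale needed for epsilon-differential privacy on that SUM."""
    return sum_sensitivity(declared_min, declared_max) / epsilon

# A mapper declared to emit values in [0, 10], answered under epsilon = 1.0:
print(laplace_scale(0, 10, epsilon=1.0))   # 10.0
```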



Key design concepts

  • Goal: Prevent users from running many brute-force queries in an attempt to reveal the input data.

  • Concept used: Privacy budget (defined by data provider)

  • How it is used:

    • Data sources set a privacy budget for their data.

    • Each time a query is run, the budget is decreased, and

    • Once the budget is used up, the user cannot run more queries (see the sketch below).
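A toy sketch of the budget bookkeeping described above; the class and method names (PrivacyBudget, spend) are illustrative, not Airavat's API.

```python
class PrivacyBudget:
    """Track the total epsilon a data provider is willing to spend on its data."""

    def __init__(self, total_epsilon):
        self.remaining = total_epsilon

    def spend(self, epsilon):
        """Charge one query against the budget; refuse it once the budget runs out."""
        if epsilon > self.remaining:
            raise PermissionError("privacy budget exhausted; query refused")
        self.remaining -= epsilon

budget = PrivacyBudget(total_epsilon=1.0)
budget.spend(0.4)                      # first query allowed
budget.spend(0.4)                      # second query allowed
try:
    budget.spend(0.4)                  # only ~0.2 left, so this query is refused
except PermissionError as err:
    print(err)
```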



Airavat system design

  • Mappers are provided by the computation provider, and hence are not trusted

  • Reducers are provided by Airavat. They are trusted

    • Airavat only supports a small set of reducers.

  • Keys must be pre-declared by CP (why?)

  • Airavat generates enough noise to assure differential privacy of values

  • Range enforcers ensure that output values from mappers lie within the declared range (see the sketch below)
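A minimal illustration of the range-enforcer idea: mapper outputs are clamped into the declared range before they reach the reducer, so a malicious mapper cannot inflate sensitivity with out-of-range values. This is a sketch of the concept, not Airavat's implementation.

```python
def enforce_range(value, declared_min, declared_max):
    """Clamp one mapper output into the range the computation provider declared."""
    return min(max(value, declared_min), declared_max)

# A malicious mapper tries to smuggle information out through a huge value:
print(enforce_range(10_000_000, declared_min=0, declared_max=10))   # prints 10
```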



Security via Mandatory Access Control

In MAC, the operating system enforces access control on every access

Access control rights cannot be overridden by users

Airavat uses SELinux – a Linux kernel security module and policy framework, developed by the NSA, that adds MAC to Linux
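A toy illustration of the MAC idea (the operating system checks labels on every access, and users cannot override the decision). This is a two-level Python example for intuition, not SELinux policy syntax.

```python
# Toy two-level label lattice; real SELinux policy is far richer than this.
LEVEL = {"public": 0, "sensitive": 1}
OBJECT_LABELS = {"salary_db": "sensitive", "readme.txt": "public"}

def read_allowed(process_label, obj_name):
    """The OS, not the user, decides: a process may read an object only if its
    clearance dominates the object's label, and this check runs on every access."""
    return LEVEL[process_label] >= LEVEL[OBJECT_LABELS[obj_name]]

print(read_allowed("public", "salary_db"))     # False: low-clearance process denied
print(read_allowed("sensitive", "salary_db"))  # True: sufficiently cleared process
```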



Security via MAC

Each data object and process is tagged with a label showing its trust level

Data providers can set a declassify bit for their data, in which case the result will be released when there is no differential privacy violation
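A hedged sketch of that release decision, with hypothetical names (release_result, declassify_bit_set): the noise-protected result leaves the system only if the data provider opted into declassification and the differential-privacy checks passed.

```python
def release_result(noisy_result, declassify_bit_set, dp_check_passed):
    """Release the (already noise-protected) result only when the data provider
    allows declassification and no differential-privacy condition was violated."""
    if declassify_bit_set and dp_check_passed:
        return noisy_result            # label dropped; result leaves the system
    raise PermissionError("result stays labeled and is not released")

# Example: the data provider opted in and the privacy checks passed.
print(release_result(42.7, declassify_bit_set=True, dp_check_passed=True))
```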



Implementation

Airavat was implemented on Hadoop and the Hadoop Distributed File System (HDFS).



Further reading

“Cynthia Dwork defines differential privacy”: an interesting blog post that gives a high-level view of differential privacy.

http://www.ethanzuckerman.com/blog/2010/09/29/cynthia-dwork-defines-differential-privacy/


