On Protecting Private Information in Social Networks: A Proposal

Bo Luo1 and Dongwon Lee2

1 The University of Kansas, bluo@ku.edu

2 The Pennsylvania State University, dongwon@psu.edu

Motivation

  • Online social networks
    • Getting very popular (e.g. Facebook: 68M unique visitors, 1.2B visits)
    • Various types of communities
      • General (e.g. Facebook; MySpace)
      • Business/professional (e.g. LinkedIn)
      • Alumni
      • Leisure
      • Healthcare (e.g. SoberCircle; PatientsLikeMe)
  • People socialize with friends
  • But also adversaries!
  • Privacy vulnerabilities in online social networks
    • A huge amount of personal information is available across various types of social network sites.
    • Users are not fully aware of the risks.
    • Adversaries use various techniques to collect such information.
      • E.g., information retrieval techniques and search engines
  • News stories
    • Facebook Stalkers [Dubow, USA Today, 2007]
    • Gadgets and add-ons read user profiles [Irvin, USA Today, 2008]
    • How Not to Lose Face on Facebook, for Professors [Young, Chronicle, 2009]
Privacy vulnerabilities
  • Threat 1: out-of-context information disclosure
    • Users present information to a “context” (e.g. targeted readers)
    • Implicit assumption
      • Information stays in the context
      • This is wrong!
    • Out-of-context information disclosure
      • Wrong configuration
      • Malfunctioning code
      • Users’ misunderstanding
  • Examples
    • Adversaries can simply register for forums to access a large amount of information.
    • Messages in a “closed” email-based community are archived and accessible to everyone.
    • Gadgets and add-ons read user profiles
Privacy vulnerabilities
  • Threat 2: In-network information aggregation
    • Users share information in social networks
    • Implicit assumption: “a small piece of personal information is not a big deal”
    • Adversaries collect all the pieces of information associated with a user.
    • Adversaries aggregate all the information pieces.
    • Together, the pieces reveal a significant amount of private information!
    • This is the in-network information aggregation attack.
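The aggregation step above can be sketched in a few lines; the (user, attribute, value) triple representation of posts is an illustrative assumption, not the paper's formalism:

```python
from collections import defaultdict

def aggregate_in_network(posts):
    """Collect per-user information pieces scattered across posts.

    `posts` is a list of (user, attribute, value) triples -- an assumed,
    simplified representation of small disclosures in one network.
    """
    profiles = defaultdict(dict)
    for user, attribute, value in posts:
        # Each piece alone seems harmless; together they form a profile.
        profiles[user][attribute] = value
    return dict(profiles)

# Individually minor disclosures combine into a detailed profile.
posts = [
    ("alice", "employer", "Acme Corp"),
    ("alice", "city", "Lawrence, KS"),
    ("alice", "birthday", "March 3"),
]
profiles = aggregate_in_network(posts)
```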
Privacy vulnerabilities
  • Threat 3: cross-network information aggregation
    • Users participate in multiple networks
    • Each network carries different levels of privacy concerns.
    • Adversaries use evidence to link profiles from different SN sites
      • Attributes
      • Neighborhood
      • Similar posts
      • Propagation
    • Adversaries collect all the private information across multiple SN sites
    • This is the cross-network information aggregation attack.
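A minimal sketch of the linking step, covering only the attribute-evidence case (neighborhood, post similarity, and propagation are omitted); the overlap measure and the 0.5 threshold are illustrative assumptions:

```python
def attribute_overlap(p1, p2):
    """Fraction of attributes with identical values in two profile dicts."""
    shared = {k for k in p1 if k in p2 and p1[k] == p2[k]}
    return len(shared) / max(len(p1), len(p2))

def link_profiles(profiles_a, profiles_b, threshold=0.5):
    """Pair up profiles from two networks whose overlap meets the threshold."""
    return [
        (ida, idb)
        for ida, pa in profiles_a.items()
        for idb, pb in profiles_b.items()
        if attribute_overlap(pa, pb) >= threshold
    ]

# Hypothetical profiles on two sites; shared city and employer link them,
# and the second site then leaks a phone number into the merged profile.
site1 = {"alice_fb": {"city": "Lawrence", "employer": "Acme"}}
site2 = {"a.smith": {"city": "Lawrence", "employer": "Acme", "phone": "555-0100"}}
links = link_profiles(site1, site2)
```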
Goals and solutions at a glance
  • Goal: protect users from unwanted information disclosure, especially from the three threats.
  • Users should be able to socialize.
    • We cannot prevent users from sharing information
  • Honest-but-curious observer
    • Honest: no phishing, no spam, no hacking
    • Curious: very aggressive in seeking information
      • Registers for social networks
      • Uses search engines
      • Manipulates information
  • Our goal:
    • Protect users from honest-but-curious observers
Design goals
  • Enable users to describe a privacy plan: how they allow their private information items to be disclosed
    • Solution: privacy models
  • Alert users when they share information over social networks
    • Solution: passive monitor
  • Monitor private information over various social networks to make sure that privacy plans are not violated
    • Solution: active monitor
Online social networks
  • We define two properties to describe online social networks
    • Openness level
      • How information in a social network can be accessed
      • E.g. OL = public: everyone can access
      • E.g. OL = registration-required: all registered users can access, but not search engines
    • Access equivalency group
      • Social networks with identical openness levels belong to the same group.
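The two properties can be sketched as follows; the numeric level constants and site names are hypothetical, chosen to mirror the two example openness levels above:

```python
from collections import defaultdict

# Hypothetical numeric openness levels (OL); the examples above are
# OL = public and OL = registration-required.
OL_PUBLIC = 0         # everyone, including search engines, can access
OL_REGISTRATION = 1   # registered users only; not reachable by crawlers

def access_equivalency_groups(networks):
    """Group social networks that share an identical openness level."""
    groups = defaultdict(set)
    for site, ol in networks.items():
        groups[ol].add(site)
    return dict(groups)

groups = access_equivalency_groups({
    "public-forum": OL_PUBLIC,
    "open-blog-host": OL_PUBLIC,
    "members-only-board": OL_REGISTRATION,
})
```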
Private information model
  • We define two private information models
  • Multi-level model
    • Private information items are managed in hierarchically organized categories
    • Information flows from lower levels (less private) to higher levels (more private)
      • E.g. when a user trusts an SN with level 3, s/he also trusts it with levels 1 and 2
    • Simple model
    • Easy for users to understand
    • Less descriptive
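The multi-level rule can be sketched as two small functions; the integer level encoding is an assumption for illustration:

```python
def trusted_levels(granted_level):
    """Levels implicitly trusted when a network is trusted at `granted_level`:
    information flows from lower (less private) to higher (more private)
    levels, so trusting level n implies trusting levels 1..n."""
    return set(range(1, granted_level + 1))

def may_disclose(item_level, granted_level):
    """An item at `item_level` may go to a network trusted at `granted_level`."""
    return item_level <= granted_level

# Trusting an SN at level 3 also trusts it with levels 1 and 2.
levels = trusted_levels(3)
```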
Private information model
  • Discretionary model: a set-based model
    • Private information items are organized into sets
    • Private information items in one set could be released together
    • Private information item may belong to multiple sets
  • Private information disclosure model
    • Formally describes, under the discretionary model:
      • out-of-context information disclosure
      • information aggregation attacks
    • Details: please refer to the paper
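One plausible reading of the set-based rule, sketched with Python sets (the item names and example sets are hypothetical, not from the paper):

```python
def may_release_together(items, defined_sets):
    """Under the set-based reading above, a group of private items may be
    released together only if some user-defined set contains all of them;
    an item may belong to several sets."""
    return any(items <= s for s in defined_sets)

defined_sets = [
    {"name", "employer"},          # e.g. a professional-context set
    {"name", "city", "hobbies"},   # e.g. a leisure-context set
]
ok = may_release_together({"name", "employer"}, defined_sets)
bad = may_release_together({"employer", "hobbies"}, defined_sets)  # crosses sets
```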
Privacy sandbox
  • Picks a privacy model
  • Allows users to describe their privacy plan in the model, i.e. how they want to arrange private information items
    • E.g. define privacy information sets under discretionary model
    • Define how sets could be released to social networks with different openness levels.
  • Keeps privacy plans
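A minimal sketch of the sandbox; the dict-based plan format (user to openness level to releasable item sets) is an assumption for illustration, not the paper's concrete design:

```python
class PrivacySandbox:
    """Keeps each user's privacy plan and answers what may be released
    at a given openness level."""

    def __init__(self):
        self.plans = {}  # user -> {openness_level: list of releasable item sets}

    def define_plan(self, user, plan):
        self.plans[user] = plan

    def releasable(self, user, openness_level):
        """Item sets the user allows at networks of this openness level."""
        return self.plans.get(user, {}).get(openness_level, [])

sandbox = PrivacySandbox()
# At a public site (OL 0) alice releases only her name; at a
# registration-required site (OL 1) she also releases her employer.
sandbox.define_plan("alice", {0: [{"name"}], 1: [{"name", "employer"}]})
```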
Passive monitor
  • Passive monitor
    • is triggered when users send information to social networks
    • Alerts users
      • who can access the submitted information
      • Openness level
      • Access equivalency group
    • Checks against the privacy plan
    • Keeps a local log of private information disclosure
      • For future use
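The passive monitor's trigger-check-log flow can be sketched as a single function; the plan format and alert text are assumptions for illustration:

```python
def passive_monitor(user, items, site_ol, plan, log):
    """Triggered when `user` submits `items` to a site with openness level
    `site_ol`. `plan` maps openness levels to releasable item sets (an
    assumed format); every disclosure is logged locally for later use
    by the active monitor."""
    ok = any(items <= s for s in plan.get(site_ol, []))
    log.append((user, site_ol, frozenset(items), ok))
    if not ok:
        print(f"ALERT: {sorted(items)} exceeds {user}'s plan at OL={site_ol}")
    return ok

log = []
plan = {1: [{"name", "employer"}]}
ok1 = passive_monitor("alice", {"name"}, 1, plan, log)           # allowed
ok2 = passive_monitor("alice", {"name", "phone"}, 1, plan, log)  # alerts
```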
Remote Component and Active monitor
  • Remote component
    • Actively collects personal information from various social networks
    • Simulates in-network and cross-network information aggregation
    • Stores information in a data repository
  • Active monitor
    • Compares users’ privacy plans with
      • Local log
      • Remote data repository
      • Search engine results
    • Checks for discrepancy
      • Warns user about unwanted information disclosure
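The discrepancy check can be sketched as a set difference; merging the local log, remote repository, and search results into one `repository` set is a simplifying assumption:

```python
def active_monitor(user, plan_sets, repository):
    """Compare what was actually found about `user` (local log, remote
    data repository, and search engine results, merged here into one set
    per user) against everything the user's plan permits, and report
    any discrepancy as an unwanted disclosure."""
    permitted = set().union(*plan_sets) if plan_sets else set()
    leaked = repository.get(user, set()) - permitted
    if leaked:
        print(f"WARNING: unwanted disclosure for {user}: {sorted(leaked)}")
    return leaked

# Hypothetical example: the phone number was never permitted by the plan.
repository = {"alice": {"name", "employer", "phone"}}
leaked = active_monitor("alice", [{"name", "employer"}], repository)
```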
Conclusions

  • In this paper, we
    • present privacy vulnerabilities over social networks, especially information aggregation attacks
    • model social networks and private information disclosure from access control perspective
    • describe information aggregation attacks in the model
    • propose our initial design of a privacy monitor
  • This is our preliminary proposal
  • Further analysis and implementation are ongoing
  • Thanks a lot!