Privacy and User Generated Content
Presentation Transcript

  1. Privacy and User Generated Content Lauren Gelman Center for Internet and Society Stanford Law School cyberlaw.stanford.edu

  2. “Web 2.0 Is About Controlling Data” -Tim O'Reilly, Wired News, 4.13.07 • [Web 2.0] It's really about data and who owns and controls, or gives the best access to, a class of data. • As far as I'm concerned, web 2.0 is still in its really early stages, and the reason is because the data isn't all owned yet.

  3. Overview • User’s privacy experience is a combination of law and technology • Nothing inherently bad for privacy in providing services that people want • Concern is deployment • Development of “user expectation”

  4. Defining the Privacy issue for Web 2.0 • Use of PII you provide to one entity for one purpose by another entity for another purpose. • Commodification of the digital dossier • Webcrawlers, search, and sales • Permanence of the data. • User generated or company collected • Link between online and offline identity • Market demand is not the best way to evaluate user privacy concerns absent adequate notice.

  5. Massive Change in the nature of advertising • “Ad networks and search engines such as Google can now target banner ads to customers who have demonstrated an interest in content related to the ad, even if the page has nothing to do with the advertiser's product.” • -Businessweek.com 4.14.07

  6. Massive Societal Change • Distorts boundary between • public and private spaces • Intimate and extended networks • Public and private time • What we do is influenced by who else knows what we’re doing. • Eliminates opportunity to experiment while young (myspace vs. basement/diary) • Loss of Control (who owns transactional data) • Pecuniary harm (identity theft)

  7. The Law • Constitutional- “expectation of privacy” • Statutory- “silo approach” treats different kinds of information differently • Medical (HIPAA) • Financial (GLB) • Video (VPPA) • Cable (Cable Act) • Policy- privacy and other policies • DMCA notice and takedown • CDA limitation on liability

  8. Top-Level Privacy Questions • What information do you collect, is it PII, and how long is it held? • Who do you share it with and under what circumstances? • Do you augment this information with data from other sources? • What internal protections do you have to prevent disclosures?

  9. Building Privacy In • Interface • How do you know if you’re “live” • Opt in/opt out • Collection of information • Who holds it • How long is it kept • Is it personalized • Third party access • In what country?
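The interface and collection choices on this slide can be captured as a single configuration object, which makes the defaults explicit and reviewable. The sketch below is illustrative only (the class and field names are assumptions, not part of any tool named in this talk); it shows one way to encode privacy-protective defaults such as opt-in consent and short retention:

```python
from dataclasses import dataclass

@dataclass
class PrivacyConfig:
    """Hypothetical checklist of the interface/collection choices above."""
    live_indicator: bool      # does the UI show the user they are "live"?
    opt_in: bool              # True = opt-in consent (privacy-protective)
    data_holder: str          # who holds the collected information
    retention_days: int       # how long the data is kept
    personalized: bool        # is the data tied to an identified user?
    third_party_access: bool  # is it shared with third parties?
    storage_country: str      # jurisdiction where the data is stored

# Privacy-protective defaults: opt-in, short retention, no third-party access.
default = PrivacyConfig(
    live_indicator=True,
    opt_in=True,
    data_holder="orgname",
    retention_days=30,
    personalized=False,
    third_party_access=False,
    storage_country="US",
)
```

Writing the checklist down as data, rather than prose, forces the "think about these things early" discipline the next slide calls for: every field must be answered before the service ships.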

  10. Think about these things early! • What does the user want? • What do your partners “really” need? • What might third parties come looking for? • What kind of press can you look forward to? • Where might the law go? • Innovate in privacy!

  11. Privacy Policy Generator • Internet-based application • Features: • Enables web companies to create Privacy Policies • Informs users about requirements • Gives background about the privacy landscape

  12. Part of a Joint Project • Generator for • Terms of Service • Privacy Policies • Participants • David Hornik, August Capital • Cyberlaw Clinic at Stanford Law School • Berkman Center at Harvard Law School

  13. Previous initiatives • P3P (http://www.w3.org/P3P) • Privacy Bird(http://www.privacyfinder.org) • OECD - Privacy Statement Generator(http://www2.oecd.org/pwv3) • Others (see http://www.w3.org/P3P/implementations.html)

  14. Improvements • Informed choice • Educational explanations • Explanations of the provisions that may be chosen • Graphical tags • Creative Commons model • Technical architecture • “EFF approved”

  15. Potential • Useful tool to reduce repetitive work • Educational benefit • Point of reference to learn about best practice • Retrievability (chicken-and-egg problem with Privacy Bird) • Data about companies' preferences

  16. Example

  17. Example What is your default? What preferences do you give to users who register? Do you collect identifying details from users who do not register, such as IP address? If so, your privacy policy will reflect this, because your users should know their participation is not anonymous. Someone can connect the IP address you collect with PII from the user's ISP to unmask the user.

  18. Example (cont'd)

  19. Example (cont'd) • Privacy Policy will contain customized language: “orgname may collect Personally Identifiable Information through online forms, such as forms to register, order, contact us, sign up for a Newsletter, or the like, through your user profile or through your posts in blogs, on a bulletin board or on a comparable experience space on orgname's web site.”

  20. Example (cont'd) • Privacy Policy will also contain boilerplate language: “When you register with orgname or submit information to orgname, a temporary copy of that information may routinely and repeatedly be made to prevent accidental loss of your information through a computer malfunction or human error. In addition, orgname may keep your account information stored in a database.”
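The two example clauses above share a structure: "orgname" is a placeholder filled in with the company's name, and the final policy is assembled from the provisions the company selected plus fixed boilerplate. The sketch below illustrates that substitution step; `generate_policy` and its parameters are hypothetical names for illustration, not the actual generator's API:

```python
from string import Template

# Customized clause from slide 19, with "orgname" as a $-placeholder.
FORMS_CLAUSE = Template(
    "$org may collect Personally Identifiable Information through online "
    "forms, such as forms to register, order, contact us, sign up for a "
    "Newsletter, or the like."
)

# Boilerplate clause from slide 20, included in every generated policy.
BOILERPLATE = Template(
    "When you register with $org or submit information to $org, a temporary "
    "copy of that information may routinely and repeatedly be made to "
    "prevent accidental loss of your information."
)

def generate_policy(org_name: str, include_forms_clause: bool) -> str:
    """Assemble a policy from the provisions the company selected."""
    sections = []
    if include_forms_clause:
        sections.append(FORMS_CLAUSE.substitute(org=org_name))
    sections.append(BOILERPLATE.substitute(org=org_name))
    return "\n\n".join(sections)

policy = generate_policy("ExampleCo", include_forms_clause=True)
```

Because each provision is a template rather than free text, the generator can also attach the educational explanations and graphical tags mentioned on slide 14 to each clause the user chooses.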

  21. Conclusion • There is a lot of good in this space, coupled with both positive and negative externalities. • Who is the party best able to address them? • Government(s)? • Lawyers? • Technologists? • Innovators?

  22. Privacy and User Generated Content Lauren Gelman Center for Internet and Society Stanford Law School cyberlaw.stanford.edu gelman@stanford.edu