Privacy Management in Ubiquitous Computing Environment
Presentation Transcript

  1. Privacy Management in Ubiquitous Computing Environment Jin Zhou Ho Geun An Priyanka Vanjani Kwane E. Welcher

  2. Summary • Introduction (Jin) • Internet Privacy (Ho An) • Privacy in E-Commerce (Priyanka) • Privacy in Ubiquitous Computing (Jin) • Policy-Based Control (Kwane) • Trust and Reputation (Jin) • Conclusion (Jin)

  3. Introduction • Ubiquitous Computing promises a world where computational artifacts embedded in the environment continuously sense our activities and provide services based on what is sensed. • It is thought of as the third wave in computing, and it is just beginning.

  4. Scenario

  5. Properties of Ubicomp • Ubiquity • Invisibility • Sensing • Memory Amplification

  6. Privacy Problems

  7. Example Scenario • Alice is visiting a city • Use Bob’s location service • Alice’s location is stored in Bob’s Server • Bob may sell Alice’s information to Carol

  8. Fair Information Practices • Notice/Awareness • Choice/Consent • Access/Participation • Integrity/Security • Enforcement/Redress

  9. Internet • The Internet is one of the biggest parts of the ubiquitous computing environment. • Based on an end-user-centric architecture • Benefits: • Flexibility / Generality / Openness • Disadvantages: • End users must take care of privacy protection themselves • It is the place where privacy violations occur most often today.

  10. Personal Information on Internet • Medium • Web site / email / IM / Chat room / bulletin board / p2p network / voice / video communication • Personal Information • Name / Address / SSN /Credit Card Number / User behavior

  11. Threats • Four factors make it much easier for data collectors to gain personal information: in order to reach the public, one must • advertise • use well-known protocols and standards • reveal one’s content • accept that one may come under the scrutiny of the authorities

  12. Threats • The widely used protocols (e.g. TCP/IP, HTTP, DNS) and applications do not support any kind of privacy protection. • Exploiting these factors, data collectors gather personal information over the network without notice or consent. • Several data stores and flows on the network contain personal information and are targeted by data collectors: • DNS / URL / Cookie / Scripting

  13. DNS Server • A DNS server resolves the host names found in Uniform Resource Locators (URLs) into numeric Internet addresses [RFC1035] • Since the protocol gives no assurance that replies from a DNS server are genuine and have not been tampered with, DNS spoofing can deceive users and extract sensitive information. • Structural remedies for the DNS vulnerabilities are available but not widely deployed: • The Domain Name System Security Extensions [RFC2065]
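The trust problem the slide describes can be sketched with a toy model (names and addresses are made up): a resolver cache that accepts whichever answer arrives first, with no origin check or signature, which is the gap DNSSEC [RFC2065] is designed to close.

```python
# Toy model of an unauthenticated DNS cache: whichever reply arrives
# first wins. Nothing verifies who sent the reply or whether it was
# tampered with, so a spoofed answer is cached like a genuine one.
cache = {}

def record_reply(name, addr):
    # setdefault keeps the FIRST answer seen for a name, modelling the
    # race a spoofer tries to win against the real server.
    cache.setdefault(name, addr)

record_reply("bank.example", "203.0.113.5")   # attacker's spoofed reply, first
record_reply("bank.example", "198.51.100.7")  # genuine reply arrives too late
print(cache["bank.example"])                  # the spoofed address is used
```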

  14. URL Leak • A URL can embed a user ID and password. • There are many ways that referenced URLs leak: • History / Referer header / logs • Solution: • HTTPS
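A short sketch of the leak (the URL and credentials are hypothetical): anything that records the raw URL, such as browser history, the Referer header, or server logs, sees embedded credentials in the clear.

```python
from urllib.parse import urlsplit

# Hypothetical URL embedding credentials -- the pattern the slide warns about.
url = "http://alice:s3cret@shop.example/orders?id=42"
parts = urlsplit(url)

# The userinfo component is plainly recoverable by anything that logs the URL.
print(parts.username, parts.password)
```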

  15. Cookie Exposure • A cookie is a message given to a web browser by a web server. • The main purpose of a cookie is to identify users and possibly prepare customized web pages for them. • Cookies are used in basically two ways: tracking users and authenticating users. • Unfortunately, there is no standard mechanism to establish the integrity of a cookie returned by a browser. • The best defense is to avoid shopping online or registering with online services that use unsafe cookie-based authentication.
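Although no standard mechanism exists, a server can establish cookie integrity itself; a minimal sketch (the secret and cookie value are invented for the example) signs the value with an HMAC so a tampered cookie is rejected:

```python
import hashlib
import hmac

SECRET = b"server-side-secret"  # assumption: known only to the server

def sign_cookie(value: str) -> str:
    # Append a keyed MAC so the server can later detect tampering.
    mac = hmac.new(SECRET, value.encode(), hashlib.sha256).hexdigest()
    return f"{value}|{mac}"

def verify_cookie(cookie: str):
    # Recompute the MAC and compare in constant time; reject on mismatch.
    value, _, mac = cookie.rpartition("|")
    expected = hmac.new(SECRET, value.encode(), hashlib.sha256).hexdigest()
    return value if hmac.compare_digest(mac, expected) else None

signed = sign_cookie("user=alice")
print(verify_cookie(signed))                    # genuine cookie accepted
print(verify_cookie("user=mallory|deadbeef"))   # forged cookie rejected
```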

  16. Recommendations • [RFC2964] recommends proper use of cookies: • The user is aware that a cookie is being maintained and consents to it. • The user has the ability to delete the cookie associated with such a session at any time. • The information obtained through the cookies is not disclosed to other parties without the user’s explicit consent. • Session information itself cannot contain sensitive information and cannot be used to obtain sensitive information.

  17. Cross-Site Scripting (XSS) • XSS is a type of computer security vulnerability, typically found in web applications, that allows malicious web users to inject client-side script (JavaScript or HTML) or ActiveX controls into web pages, e-mail messages, instant messages, newsgroup postings, or various other media. Victim users may unintentionally execute the script without any notice. • An XSS vulnerability could potentially be used to collect HTTP cookies or the URL history and disseminate the data to an unauthorized party.

  18. Preventing XSS • Web administrators must filter user-supplied data: • All non-alphanumeric client-supplied data (which may contain malicious script) should be converted to HTML character entities before being re-displayed to other clients. • For end users, the most effective way to prevent XSS attacks is to disable all scripting languages in their web browsers. • They should be careful when clicking links on untrusted web pages or in e-mails. • They should also not install any ActiveX controls from untrusted web sites.
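The entity-conversion step above can be sketched in a few lines (the `render_comment` helper is invented for illustration): the special characters in client-supplied data are converted to HTML entities before being echoed back, so injected script is displayed as text rather than executed.

```python
import html

def render_comment(user_input: str) -> str:
    # Convert &, <, > and quotes to HTML character entities before
    # re-displaying client-supplied data to other clients.
    return "<p>" + html.escape(user_input, quote=True) + "</p>"

print(render_comment('<script>alert("xss")</script>'))
# The <script> tag is rendered inert as &lt;script&gt;...
```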

  19. Addressing Privacy in E-Commerce • E-Commerce: business conducted over the Internet using any of the applications that rely on it • Email, Web Services, Online Shopping

  20. Data • Implicit: personalization data inferred from a user’s behavior. • Explicit: demographics, ratings, or other information provided explicitly by the user.

  21. Privacy Risks • Users fear that their information might be shared with other organizations and/or companies, and fear undesired marketing. • Users are concerned about how the information they have provided will be used. • Risk that a website is not run by a trustworthy organization yet stores information in its database • Information might be distributed among other unwanted websites or used by other organizations • Fear of online activities being tracked

  22. User Concerns • Most users do not care much about factors like: • whether a site has a privacy policy posted • whether the site has a data retention policy • whether the site has a privacy seal • This is because they are not well aware of the importance of these factors

  23. Protecting Privacy • P3P • One solution for protecting privacy as far as E-Commerce is concerned. • Enables websites to express their privacy practices in a standard format which is convenient for user agents to retrieve and interpret.

  24. HTTP Transaction with P3P added

  25. Summary of P3P • P3P is not an "Enforcement Mechanism" • Facilitates better communication • P3P Version 1.0: Goal of the specification: • To make user agents aware of the practices that websites follow to collect data.

  26. TRUSTe • TRUSTe certifies and monitors a website’s privacy policies and email policies, and also aims to resolve consumer privacy problems. • TRUSTe developed the first online privacy seal program • The TRUSTe Watchdog: an alternative dispute resolution mechanism that allows you to submit any privacy violation by an accredited site directly to TRUSTe via the Web.

  27. Conclusion of E-Commerce Privacy • Users nowadays have strong opinions regarding privacy online, and they tend to make their own assumptions about data collection; those assumptions often turn out to be quite unfavorable. • It is vital to have more concrete and foolproof data about E-Commerce and privacy technologies in order to improve and win over users’ trust and meet their expectations.

  28. Privacy in Ubicomp Environment • Principle of Minimum Asymmetry • Anonymization and Pseudonymization • P3P • PawS • Wearable • Other Mechanisms

  29. Principle of Asymmetry • Negative externalities are often much harder to overcome in environments with significant asymmetry in both information and power between different parties. • Principle of Minimum Asymmetry • Decreasing the flow of information from data owners to data collectors and users • Increasing the flow of information from data collectors and users back to data owners

  30. Principle of Minimum Asymmetry

  31. Approximate Information Flow • Information Spaces • Storage perspective • Data Lifecycle • Dataflow perspective • Themes for minimizing Asymmetry • End-user perspective

  32. Information Spaces • Properties: Lifetime / Accuracy / Confidence • Boundaries: Physical / Social / Activity-based • Operations: Addition/Deletion/Update • Authorization/Revocation • Promotion/Demotion • Composition/Decomposition • Fusion/Inference

  33. Data Lifecycle • Collection • Access • Second Use

  34. Themes for Minimizing Asymmetry • Prevention • Avoidance • Detection

  35. Design Space

  36. Anonymization and Pseudonymization • Anonymity precludes associating data or a transaction with a particular person. • However, services that require the presence of users are not possible with anonymity; in that case, pseudonymity is required. • With user-selected pseudonyms, users can interact with the environment anonymously through a pseudo identity. • Nevertheless, pseudonymity can be compromised at times, since the user is physically present and may be identified.
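One common way to build a user-selected pseudo identity is a salted hash; this sketch (identity string and salts are invented for the example) yields a pseudonym that is stable for a returning user within one service but unlinkable across services that use different salts:

```python
import hashlib

def pseudonym(identity: str, salt: str) -> str:
    # The same identity + salt always maps to the same pseudonym, so a
    # service can recognise a returning user without learning who they are.
    # A different salt per service keeps pseudonyms unlinkable across services.
    digest = hashlib.sha256((salt + ":" + identity).encode()).hexdigest()
    return digest[:16]

p_here = pseudonym("alice@example.org", "location-service")
p_again = pseudonym("alice@example.org", "location-service")
p_other = pseudonym("alice@example.org", "payment-service")
print(p_here == p_again, p_here == p_other)
```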

  37. P3P • A framework for standardized, machine-readable privacy policies. • Relieves users of the time-consuming process of reading policies. • A P3P-enabled web browser can decide what to do by comparing a site’s policy with the user’s stored preferences. • Delivered as an XML file or in the HTTP header

  38. An Example P3P File
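The example file itself does not survive in the transcript; a policy of the kind this slide refers to might look like the following illustrative P3P 1.0 fragment (the entity name, `discuri`, and data references are invented for the example):

```xml
<POLICIES xmlns="http://www.w3.org/2002/01/P3Pv1">
  <POLICY name="sample" discuri="http://www.example.com/privacy.html">
    <ENTITY>
      <DATA-GROUP>
        <DATA ref="#business.name">Example Corp</DATA>
      </DATA-GROUP>
    </ENTITY>
    <ACCESS><nonident/></ACCESS>
    <STATEMENT>
      <PURPOSE><admin/><develop/></PURPOSE>
      <RECIPIENT><ours/></RECIPIENT>
      <RETENTION><stated-purpose/></RETENTION>
      <DATA-GROUP>
        <DATA ref="#dynamic.clickstream"/>
        <DATA ref="#dynamic.http"/>
      </DATA-GROUP>
    </STATEMENT>
  </POLICY>
</POLICIES>
```

Each STATEMENT answers the questions on the next slide: what is collected (DATA-GROUP), why (PURPOSE), who receives it (RECIPIENT), and how long it is kept (RETENTION).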

  39. Main Content of a Policy • Which information the server stores: • which kind of information is collected (identifying or not); • which particular information is collected (IP number, email address, name, etc.); • Use of the collected information: • how this information is used (for regular navigation, tracking, personalization, telemarketing, etc.); • who will receive this information (only the current company, third parties, etc.); • Permanence and visibility: • how long information is stored; • whether and how the user can access the stored information (read-only, opt-in, opt-out).

  40. Privacy Awareness System (PawS) • Based on Fair Information Practices • Mainly focuses on four principles: • Notice • Policy announcement mechanisms • Choice and Consent • Machine readable policies • Proximity and locality • Access restriction based on location. • Access and recourse • Privacy proxies / privacy-aware databases

  41. Overview of PawS

  42. Wearable • Instead of putting sensors and cameras in the room, put them on the person. • Well suited to providing privacy and personalization. • Has trouble with localized information, localized control, and resource management

  43. Other Approaches • Location privacy policy • Individuals should be able to adjust the accuracy of their location, identity, time, and speed, and therefore have the power to enforce the need-to-know principle • Privacy Mirror • Provides feedback to end users, showing them what information is being collected, what information has been accessed, and by whom.
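Adjusting location accuracy can be as simple as truncating coordinate precision; this sketch (the helper and coordinates are invented for the example) lets a user release only as much precision as a service needs to know:

```python
def coarsen_location(lat: float, lon: float, decimals: int) -> tuple:
    # Fewer decimal places means a coarser reported position: roughly,
    # 2 decimals is about 1 km of precision, 0 decimals about 100 km.
    # The user picks the precision a given service actually needs.
    return (round(lat, decimals), round(lon, decimals))

precise = (40.712776, -74.005974)
print(coarsen_location(*precise, 2))  # city-block scale
print(coarsen_location(*precise, 0))  # city scale
```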

  44. Policy Based Privacy

  45. Personal Privacy Policies • Policies defined • Personal privacy policy defined • Proposed personal privacy model

  46. Personal Privacy Policy Model

  47. Personal Privacy Policy Content • Model Code for the Protection of Personal Information • Privacy risk analysis questions

  48. Model Code for the Protection of Personal Information 10 Principles • Accountability • Identifying Purpose • Consent • Limiting Collection • Limiting Use, Disclosure, Retention

  49. Model Code for the Protection of Personal Information 10 Principles • Accuracy • Safeguards • Openness • Individual Access • Challenging Compliance

  50. Personal Privacy Policy Sample