
Presentation Transcript


  1. Catching the Drift While Missing the Boat: The Dangers of Substituting Security for Privacy Risk Management Stuart Shapiro September 20, 2007 Approved for Public Release; Distribution Unlimited. 07-1222

  2. Overview • Privacy Incident Roulette • Privacy Risk (vs. Security Risk) • Impacts of Privacy Incidents • Applying Technology to Mitigate Privacy Risk

  3. Privacy Incident Roulette • Which is the privacy incident? • TJX consumer data breach • Pfizer employee data breach • Baby products traumatic marketing • Informational Privacy: The ability of an individual to exercise either direct or indirect control over their personally identifiable information (PII)

  4. TJX Consumer Data Breach • Network accessed and malware installed • Initial access may have been via WEP-protected WiFi • Some records likely captured as transactions were processed • Encrypted files may have been compromised through theft of the keys • Over 45 million records stolen • Credit and debit card information • Various identification numbers, including driver’s licenses • Some were SSNs

  5. Pfizer Employee Data Breach • Unauthorized file sharing software installed on laptop • Data relating to approximately 17,000 current and former employees exposed • Over 15,000 records accessed and copied over P2P network • Information included SSNs

  6. Baby Products Traumatic Marketing • Stillbirth at 31 weeks • As the official due date approached, an “onslaught” of baby product promotions began appearing in the parents’ mail • Promotions continued over the next year, tracking what would have been the child’s first months • Parents only shared the due date with • Their health insurer • A Web site for expecting and new parents • Opted out of sharing • Both the insurer and the Web site knew what had happened 2 months prior to the due date • Repeated requests required to stop just one company’s mailings • Gave up after that

  7. Which is the Privacy Incident? • TJX consumer data breach? • Pfizer employee data breach? • Baby products traumatic marketing? • They all are! • TJX and Pfizer: Security incidents that are also privacy incidents owing to the involvement of PII • Baby products: Privacy incident that has nothing to do with security • Consent: Permissions (mis)management
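The "permissions (mis)management" point can be made concrete with a small sketch. This is illustrative only and not from the slides: a minimal consent check that a marketing workflow could run before any use or disclosure of PII. The ConsentRecord fields and the "marketing" purpose label are assumptions.

```python
# Illustrative sketch (not from the original slides): a minimal consent check
# run before any marketing use or disclosure of PII.
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    subject_id: str
    opted_out_purposes: set = field(default_factory=set)

def may_use(record: ConsentRecord, purpose: str) -> bool:
    """Return True only if the data subject has not opted out of this purpose."""
    return purpose not in record.opted_out_purposes

# The baby-products scenario: the parents opted out of marketing sharing,
# so the promotion mailings should never have been generated.
parent = ConsentRecord(subject_id="subscriber-123", opted_out_purposes={"marketing"})
assert not may_use(parent, "marketing")       # mailing must be suppressed
assert may_use(parent, "claims_processing")   # core service use still permitted
```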

  8. Privacy Risk ≠ Security Risk • Security risk involves the potential compromise of • Confidentiality (unauthorized access) • Integrity (unauthorized modification) • Availability (unauthorized denial of resources) • Privacy risk involves the potential compromise of one or more Fair Information Principles • Originally developed in the U.S. • 1973 HEW report • Basis of the Privacy Act of 1974 • Multiple versions now exist, e.g., • OECD • CSA • APEC • Basic consensus regarding the appropriate handling of PII

  9. Fair Information Principles for Privacy Risk Analysis • FTC Fair Information Principles (1998) • Notice/Awareness: Individuals should be informed of an entity’s information handling practices and the collection, use, disclosure, and retention of personal information should be limited to that which is consistent with stated purposes • Choice/Consent: To the extent possible, options should be provided to individuals regarding the collection and handling of their personal information • Access/Participation: Individuals should have the ability to view and/or contest the data held about themselves • Integrity/Security: Personal information should be both accurate and protected • Enforcement/Redress: There should be mechanisms for identifying and addressing noncompliance with these principles • Privacy risk intersects, but is distinct from, security risk
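The matrix on slide 10 is not reproduced in this transcript, but the idea of analyzing privacy risk against these principles can be sketched. The following is a hypothetical illustration, not the presentation's own matrix: identified risks are tagged with the FTC principle they compromise and scored. The field names and the 1-5 scales are assumptions.

```python
# Hypothetical sketch: tagging identified privacy risks with the FTC Fair
# Information Principle they compromise, feeding a risk matrix like slide 10's
# (whose contents are not in this transcript). Scales are assumed, not sourced.
from dataclasses import dataclass
from enum import Enum

class FIP(Enum):
    NOTICE_AWARENESS = "Notice/Awareness"
    CHOICE_CONSENT = "Choice/Consent"
    ACCESS_PARTICIPATION = "Access/Participation"
    INTEGRITY_SECURITY = "Integrity/Security"
    ENFORCEMENT_REDRESS = "Enforcement/Redress"

@dataclass
class PrivacyRisk:
    description: str
    principle: FIP       # which principle the risk compromises
    likelihood: int      # assumed scale: 1 (rare) .. 5 (frequent)
    impact: int          # assumed scale: 1 (minor) .. 5 (severe)

    @property
    def score(self) -> int:
        return self.likelihood * self.impact

risks = [
    PrivacyRisk("Marketing mail sent despite opt-out", FIP.CHOICE_CONSENT, 3, 4),
    PrivacyRisk("PII on laptop exposed via P2P software", FIP.INTEGRITY_SECURITY, 2, 5),
]
for r in sorted(risks, key=lambda r: r.score, reverse=True):
    print(f"{r.score:2d}  {r.principle.value:22s}  {r.description}")
```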

  10. Privacy Risk Matrix

  11. Incident Impacts • Bad publicity • Loss of trust • Loss of customers • Government and/or private legal actions • $$$$ • Incident response • Notification, monitoring, compensation • Forensics • Remediation • Lost productivity • Customer attraction and retention • Market valuation • Damages

  12. Ponemon Institute Studies Bearing on Incident Impact and Cost • 2006 Privacy Trust Study of the United States Government • VA 4th most trusted government organization for privacy • 2007 Privacy Trust Study of the United States Government • VA 7th least trusted government organization for privacy • 41% drop from 2006 score • 2006 Annual Study on Cost of a Data Breach • 31 companies in 15 industry sectors • Breaches ranging from 2,500 to 263,000 records • Estimated • Direct cost • Lost productivity cost • Opportunity cost • Estimated average per record cost: $182
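The Ponemon per-record figure scales quickly. A back-of-the-envelope calculation using only the numbers on this slide (the split across direct, lost-productivity, and opportunity cost is not given, so only the total is computed; applying the average to the Pfizer incident is my own illustration):

```python
# Back-of-the-envelope breach cost estimate using the Ponemon 2006 average of
# $182 per compromised record.
AVG_COST_PER_RECORD = 182  # USD, Ponemon 2006 average

def estimated_breach_cost(records: int) -> int:
    return records * AVG_COST_PER_RECORD

# At the extremes of the 2,500-263,000 record range studied:
print(f"${estimated_breach_cost(2_500):,}")    # $455,000
print(f"${estimated_breach_cost(263_000):,}")  # $47,866,000

# Applied, illustratively, to the ~17,000 records exposed in the Pfizer incident:
print(f"${estimated_breach_cost(17_000):,}")   # $3,094,000
```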

  13. Average Cost of a Data Breach

  14. Applying Technology to Mitigate Privacy Risk • Ultimate goal: Privacy-enabling architecture (PEA) • Systematic deployment of technical privacy controls and configurations so as to comprehensively address privacy risk • Controls should map to business processes as well as risks • Analogous to service-oriented architecture (SOA) • SOA implies the high-level system functional design • PEA should imply the high-level system privacy design • Sound complicated? • It is • So let’s make things more manageable by focusing on exposure as a risk concept • Exposure ≠ Breach • Exposure involves the relative accessibility of PII • Reduce exposure and you reduce privacy risk
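"Exposure involves the relative accessibility of PII" can be given a rough quantitative flavor. The sketch below is purely illustrative and not from the presentation: a crude exposure score that grows with the number of records, copies, and people with access, and shrinks as controls such as encryption or de-identification are applied. The factors and weights are assumptions.

```python
# Crude, illustrative exposure score for a PII data store: more copies, broader
# access, and weaker protections mean more exposure. Factors and weights are
# assumptions made for this sketch, not part of the presentation.
from dataclasses import dataclass

@dataclass
class PiiStore:
    record_count: int
    copies: int               # production, backups, test environments, laptops...
    users_with_access: int
    encrypted_at_rest: bool
    de_identified: bool

def exposure_score(store: PiiStore) -> float:
    score = store.record_count * store.copies * store.users_with_access
    if store.encrypted_at_rest:
        score *= 0.1    # assumed reduction: keys still need protecting
    if store.de_identified:
        score *= 0.01   # assumed reduction: re-identification risk remains
    return score

before = PiiStore(50_000, copies=4, users_with_access=200,
                  encrypted_at_rest=False, de_identified=False)
after = PiiStore(50_000, copies=2, users_with_access=40,
                 encrypted_at_rest=True, de_identified=False)
print(exposure_score(before), exposure_score(after))  # exposure drops sharply
```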

  15. Some Privacy-Enabling Technologies (PETs) for Limiting Exposure • Mutual authentication • Encryption • Data masking

  16. Mutual Authentication • Trusted communication used to be established at only one end • Only user/client authenticated, service/server was assumed trustworthy • Phishing and pharming have invalidated that assumption, significantly increasing the potential for exposure • Web sites that by their nature involve sensitive PII are starting to employ site authentication schemes • Typically user-specified picture and text associated with username • Mutual authentication greatly increases the likelihood of actual trusted communication • Mutual authentication likely to become an infrastructural issue • WiFi • Cell phones • Distrust is corrosive
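The slide describes picture/phrase site-authentication schemes rather than any particular protocol. As a hedged, modern illustration of the same idea of authenticating both ends, the sketch below shows TLS with client certificates (mutual TLS) using Python's standard ssl module; the host name and certificate paths are placeholders.

```python
# Illustrative only: the slides describe picture/phrase site-authentication
# schemes, not TLS. This sketch shows one common way both ends of a connection
# can authenticate each other: TLS with client certificates (mutual TLS).
import socket
import ssl

context = ssl.create_default_context(ssl.Purpose.SERVER_AUTH, cafile="ca.pem")
# The client verifies the server against a trusted CA (server authentication)...
context.load_cert_chain(certfile="client-cert.pem", keyfile="client-key.pem")
# ...and also presents its own certificate, which the server verifies (client
# authentication). Only if both checks pass is the channel mutually trusted.

with socket.create_connection(("service.example.org", 443)) as sock:
    with context.wrap_socket(sock, server_hostname="service.example.org") as tls:
        print("Negotiated:", tls.version(), "peer:", tls.getpeercert()["subject"])
```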

  17. Encryption • Shifting emphasis from data-in-motion to data-at-rest (DAR) • Too many organizations are focusing on DAR for mobile platforms exclusively • Avoiding the lost laptop/PDA nightmare • Physically restricted platforms and activities can still involve excessive exposure of PII • External hacking • Insider threat • Non-malicious misuse, improper sharing and disclosure • 2007 Ponemon Institute Study on U.S. Enterprise Encryption Trends • 16% of respondents reported an encryption strategy applied throughout the enterprise • 50% reported selective encryption based on application/data type or data sensitivity • 34% reported no encryption strategy at all
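For data at rest beyond mobile platforms, the point is simply that a stolen disk or copied file should yield only ciphertext. A minimal sketch, assuming Python and the third-party cryptography package's Fernet recipe (the slides name no product or library); key management, the hard part in practice, is deliberately out of scope:

```python
# Minimal data-at-rest sketch using the "cryptography" package's Fernet recipe
# (an assumption; the slides do not name any product or library).
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice: kept in an HSM or key vault, not on disk
fernet = Fernet(key)

record = b"SSN=078-05-1120;name=Jane Doe"   # PII as it would sit in a file or DB column
ciphertext = fernet.encrypt(record)          # what is actually stored at rest
with open("employee.rec", "wb") as f:
    f.write(ciphertext)

# A stolen disk or copied file yields only ciphertext; recovery requires the key.
with open("employee.rec", "rb") as f:
    assert fernet.decrypt(f.read()) == record
```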

  18. Data Masking • De-identifying PII (removing all direct or indirect links to specific individuals) can substantially reduce (but not necessarily eliminate) exposure and its associated privacy risk • An increasing variety of transformations can maintain important relationships and properties of PII while still de-identifying it • One area where this can potentially pay big dividends is in system development and testing • Development and testing environments often do not implement the same level of controls as production environments • Forthcoming Ponemon Institute study on the use of live data outside the production environment
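A hypothetical masking pass for development and test copies might look like the sketch below: direct identifiers are replaced with consistent pseudonyms and format-preserving dummy values, so joins and formats still work while the rows no longer point at real people. The field names and specific transformations are assumptions, not the presentation's method.

```python
# Hypothetical masking pass for copying production records into a test
# environment. Field names and transformations are assumptions.
import hashlib
import random

def pseudonym(value: str, salt: str = "per-environment-secret") -> str:
    """Deterministic pseudonym: the same customer id always maps to the same
    token, preserving cross-table relationships without revealing the original."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:12]

def dummy_ssn(rng: random.Random) -> str:
    """Format-preserving dummy in the 900-xx-xxxx range, never issued as a real SSN."""
    return f"9{rng.randint(0, 99):02d}-{rng.randint(10, 99)}-{rng.randint(1000, 9999)}"

def mask(record: dict, rng: random.Random) -> dict:
    return {
        "customer_id": pseudonym(record["customer_id"]),
        "name": "Test Customer",                     # suppression / dummy replacement
        "ssn": dummy_ssn(rng),
        "purchase_total": record["purchase_total"],  # analytic fields kept as-is
    }

rng = random.Random(42)
print(mask({"customer_id": "C-1001", "name": "Jane Doe",
            "ssn": "078-05-1120", "purchase_total": 59.90}, rng))
```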

  19. Preliminary Results • 62% of respondents report their organization uses live data for software development • 69% report use of live data for testing • 89% report use of customer records for development and testing • 43% report use of employee records for development and testing • 41% report using no protective measures at all, such as • Suppression of sensitive data elements • Anonymization of PII • Replacement of PII with dummy data • Data encryption • 23% report that live data used for development and testing has been lost or stolen (38% unsure)

  20. In Conclusion • Privacy risk goes beyond security risk • Focusing on security risk will not necessarily control privacy risk • Everybody has PII • Customers • Employees • Business contacts • Shareholders • Applicants • Visitors • Privacy-enabling technologies can help mitigate privacy risk, but • They need to be properly mapped to identified privacy risks • They need to be combined with appropriate policies and procedures

  21. Questions • Total evasions • Half-truths • Some actual answers

  22. Contact Information Stuart Shapiro The MITRE Corporation Bedford, MA sshapiro@mitre.org 781-271-4676
