
  1. Data Mining2 Fosca Giannotti and Mirco Nanni Pisa KDD Lab, ISTI-CNR & Univ. Pisa http://www-kdd.isti.cnr.it/ DIPARTIMENTO DI INFORMATICA - Università di Pisa academic year 2012/2013

  2. Privacy: Regulations and Privacy-Aware Data Mining Academic year 2012/2013

  3. Plan of the Talk
  • Privacy Constraints Sources: EU rules, US rules, Safe Harbor Bridge
  • Privacy Constraints Types: Individual (+ k-anonymity), Collection (Corporate privacy), Result limitation
  • Classes of solutions
  • Brief State of the Art of PPDM: Knowledge Hiding, Data Perturbation and Obfuscation, Distributed Privacy-Preserving Data Mining, Privacy-aware Knowledge Sharing

  4. Data Scientist • … a new kind of professional has emerged, the data scientist, who combines the skills of software programmer, statistician and storyteller/artist to extract the nuggets of gold hidden under mountains of data. • Hal Varian, Google’s chief economist, predicts that the job of statistician will become the “sexiest” around. Data, he explains, are widely available; what is scarce is the ability to extract wisdom from them.

  5. What we buy. Who we interact with. Where we go. What we search for.

  6. Definition of privacy What is privacy?

  7. Global Attention to Privacy • Time (August 1997): The Death of Privacy • The Economist (May 1999): The End of Privacy • The European Union (October 1998): Directive on Privacy Protection • The European Union (January 2012): Proposal for a new Directive on Privacy Protection • New Deal on Personal Data: World Economic Forum 2010-2013

  8. Time: The Death of Privacy • Invasion of privacy • Our right to be left alone has disappeared, bit by bit, in little brotherly steps. • Still, we've got something in return, and it's not all bad

  9. The Economist • Remember, they are always watching you. Use cash when you can. Do not give your phone number, social-security number or address, unless you absolutely have to. Do not fill in questionnaires or respond to telemarketers. Demand that credit and data-marketing firms produce all information they have on you, correct errors and remove you from marketing lists.

  10. Web Users: Attitudes • 27% do provide personal information • 17% never provide personal information • 56% decide depending on privacy measures • Source: Special Issue on Internet Privacy, ed. L.F. Cranor (Feb 1999)

  11. Web Users: Privacy vs Benefits • 86% of Web users believe that participation in information-for-benefits programs is a matter of individual privacy choice • 4%: privacy fundamentalists • 14%: having a privacy policy doesn't matter as long as I get benefits • 82%: having a privacy policy would matter • Source: Freebies and privacy: What net users think, A.F. Westin (July 1999)

  12. Definition of privacy What is privacy?

  13. European legislation for protection of personal data • European directives: • Data protection directive (95/46/EC) and proposal for a new EU directive (25 Jan 2012) • http://ec.europa.eu/justice/newsroom/data-protection/news/120125_en.htm • ePrivacy directive (2002/58/EC) and its revision (2009/136/EC)

  14. EU: Personal Data • Personal data is defined as any information relating to an identified or identifiable natural person. • An identifiable person is one who can be identified, directly or indirectly, in particular by reference to an identification number or to one or more factors specific to his physical, physiological, mental, economic, cultural or social identity.

  15. EU: Processing of Personal Data • The processing of personal data is defined as any operation or set of operations which is performed upon personal data, whether or not by automatic means, such as: collection, recording, organization, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination, alignment or combination, blocking, erasure or destruction.

  16. EU Privacy Directive requires:
  • That personal data must be processed fairly and lawfully
  • That personal data must be accurate
  • That data be collected for specified, explicit and legitimate purposes and not further processed in a way incompatible with those purposes
  • That personal data be kept in a form which permits identification of the data subject for no longer than is necessary for the purposes for which the data was collected or further processed
  • That the data subject must have given his unambiguous consent to the gathering and processing of the personal data
  • That, if consent was not obtained from the data subject, personal data be processed only for the performance of a contract to which the data subject is a party
  • That processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of data concerning health or sex life, is prohibited

  17. EU Privacy Directive • Personal data is any information that can be traced directly or indirectly to a specific person • Use allowed if: unambiguous consent given; required to perform a contract with the subject; legally required; necessary to protect vital interests of the subject; in the public interest; or necessary for legitimate interests of the processor and does not violate privacy • Some uses are specifically proscribed (sensitive data): can't reveal racial/ethnic origin, political/religious beliefs, trade union membership, health/sex life

  18. Anonymity according to Directive 95/46/EC • The principles of protection must apply to any information concerning an identified or identifiable person; • To determine whether a person is identifiable, account should be taken of all the means likely reasonably to be used either by the controller or by any other person to identify the said person; • The principles of protection shall not apply to data rendered anonymous in such a way that the data subject is no longer identifiable.

  19. US Health Insurance Portability and Accountability Act (HIPAA) • Governs use of patient information • Goal is to protect the patient • Basic idea: disclosure is okay if anonymity is preserved • Regulations focus on outcome: a covered entity may not use or disclose protected health information, except as permitted or required… to the individual, for treatment (generally requires consent), or to public health / legal authorities • Use permitted where "there is no reasonable basis to believe that the information can be used to identify an individual"

  20. The Safe Harbor "Atlantic bridge" • In order to bridge the (different) EU and US privacy approaches and provide a streamlined means for U.S. organizations to comply with the European Directive, the U.S. Department of Commerce, in consultation with the European Commission, developed a "Safe Harbor" framework. • Certifying to the Safe Harbor assures EU organizations that US companies provide "adequate" privacy protection, as defined by the Directive.

  21. The Safe Harbor "Atlantic bridge" • Data is presumed not identifiable if 18 identifiers are removed (HIPAA § 164.514(b)(2)), e.g.: name, geographic subdivisions smaller than the 3-digit postal code, dates finer than year, identifying numbers • Shown not to be sufficient (Sweeney)
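To make the identifier-removal rule concrete, here is a minimal de-identification sketch in Python; the field names and the deidentify helper are illustrative assumptions, not taken from any regulation:

```python
# Drop direct identifiers, keep only 3 digits of the postal code,
# and coarsen dates to the year, in the spirit of the safe-harbor rules.
DIRECT_IDENTIFIERS = {"name", "ssn", "phone", "email", "street_address"}

def deidentify(record):
    """Return a copy of `record` with direct identifiers dropped and
    quasi-identifiers generalized."""
    out = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    if "zip" in out:
        out["zip"] = str(out["zip"])[:3] + "**"         # 3-digit postal area only
    if "birth_date" in out:
        out["birth_date"] = str(out["birth_date"])[:4]  # year only (YYYY-MM-DD)
    return out

print(deidentify({"name": "Alice", "zip": "07028",
                  "birth_date": "1957-08-02", "diagnosis": "stroke"}))
# -> {'zip': '070**', 'birth_date': '1957', 'diagnosis': 'stroke'}
```

As the slide notes, Sweeney showed that this kind of suppression and generalization alone is not sufficient to prevent re-identification.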

  22. Privacy by design principle • In many cases (e.g., all the previous questions!), it is possible to reconcile the dilemma between privacy protection and knowledge sharing: • make data anonymous with reference to the social mining goals • use the anonymous data to extract knowledge • A small loss in data quality often buys strong privacy protection

  23. ePrivacy Directive • GOAL: • the protection of natural and legal persons w.r.t. the processing of personal data in connection with the provision of publicly available electronic communications services in public communications networks.

  24. Topics related to (mobility) Data Mining • Location data: any data processed indicating the geographic position of the terminal equipment of a user of a publicly available electronic communications service • Traffic data: any data processed for the purpose of the conveyance of a communication on an electronic communications network or for the billing thereof • Value-added services: any service which requires the processing of traffic data, or of location data other than traffic data, beyond what is necessary for the transmission of a communication or the billing thereof • Examples: route guidance, traffic information, weather forecasts and tourist information

  25. Location/Traffic Data Anonymization • Location data and traffic data must be erased or made anonymous when no longer needed for the purpose of the transmission of a communication and its billing • Location/traffic data must be anonymized before being used to provide value-added services
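As a toy illustration of reducing location precision, one simple obfuscation step (the function and the precision choice are assumptions, not anything prescribed by the directive):

```python
def coarsen_location(lat, lon, digits=2):
    """Reduce spatial precision by rounding coordinates: at 2 decimal
    places a point can only be placed within a cell of roughly 1 km."""
    return round(lat, digits), round(lon, digits)

print(coarsen_location(43.7228, 10.4017))  # Pisa city centre -> (43.72, 10.4)
```

Coordinate coarsening alone is weak protection: as later slides discuss, sequences of even coarse locations can still re-identify a person.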

  26. EU Directive (95/46/EC) and new Proposal • GOALS: • protection of individuals with regard to the processing of personal data • the free movement of such data

  27. New Elements in the EU Proposal • Principle of Transparency • Data Portability • Right to Oblivion • Profiling • Privacy by Design

  28. Transparency & Data Portability • Transparency: any information addressed to the public or to the data subject should be easily accessible and easy to understand • Data Portability: the right to transmit one's personal data from one automated processing system into another

  29. Oblivion & Profiling • Right to Oblivion: • The data subject shall have the right to obtain the erasure of his/her personal data and the abstention from further dissemination of such data • Profiling: • The right not to be subject to a measure which is based on profiling by means of automated processing

  30. Privacy by Design • The controller shall implement appropriate technical and organizational measures and procedures in such a way that the data processing • will meet the requirements of this Regulation • will ensure the protection of the rights of the data subject

  31. Privacy by Design in Data Mining • Design frameworks • to counter the threats of privacy violation • without obstructing the knowledge discovery opportunities of data mining technologies • Trade-off between privacy quantification and data utility

  32. Privacy by Design in Data Mining
  • The framework is designed with assumptions about:
  • the sensitive data that are the subject of the analysis
  • the attack model, i.e., the knowledge and purpose of a malicious party that wants to discover the sensitive data
  • the target analytical questions that are to be answered with the data
  • Design a privacy-preserving framework able to:
  • transform the data into an anonymous version with a quantifiable privacy guarantee (see the sketch below)
  • guarantee that the analytical questions can be answered correctly, within a quantifiable approximation that specifies the data utility
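One common concrete reading of "quantifiable privacy guarantee" is the minimum size of the groups of records sharing the same quasi-identifier values, the k of k-anonymity introduced later in the talk. A minimal sketch, assuming records are Python dicts; the function name and the quasi_identifiers parameter are illustrative:

```python
from collections import Counter

def min_group_size(records, quasi_identifiers):
    """Smallest number of records sharing one combination of
    quasi-identifier values: the 'k' of k-anonymity. A released table
    with min_group_size >= k hides each person in a crowd of at least
    k look-alike records."""
    groups = Counter(tuple(r[a] for a in quasi_identifiers) for r in records)
    return min(groups.values())

table = [
    {"zip": "070**", "birth_year": 1957, "sex": "M", "diagnosis": "stroke"},
    {"zip": "070**", "birth_year": 1957, "sex": "M", "diagnosis": "flu"},
]
print(min_group_size(table, ("zip", "birth_year", "sex")))  # -> 2
```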

  33. Plan of the Talk
  • Privacy Constraints Sources: EU rules, US rules, Safe Harbor Bridge
  • Privacy Constraints Types: Individual (+ k-anonymity), Collection (Corporate privacy), Result limitation
  • Classes of solutions
  • Brief State of the Art of PPDM: Knowledge Hiding, Data Perturbation and Obfuscation, Distributed Privacy-Preserving Data Mining, Privacy-aware Knowledge Sharing

  34. Traces • Our everyday actions leave digital traces in the information systems of ICT service providers: • mobile phones and wireless communication • web browsing and e-mailing • credit cards and point-of-sale e-transactions • e-banking • electronic administrative transactions and health records • shopping transactions with loyalty cards

  35. Traces: forget or remember? • When no longer needed for service delivery, traces can be either forgotten or stored. • Storage is getting cheaper and cheaper. • But why should we store traces? • From business-oriented information (sales, customers, billing-related records, …) to finer-grained, process-oriented information about how a complex organization works. • Traces are worth remembering because they may hide precious knowledge about the processes which govern the life of complex economic or social systems.

  36. THE example: wireless networks • Wireless phone networks gather highly informative traces about human mobile activity in a territory • Miniaturization and pervasiveness: 1.5 billion mobile phones in 2005, still increasing rapidly; in Italy, # mobile phones ≈ # inhabitants • Positioning accuracy: location technologies capable of providing increasingly better estimates of user location

  37. THE example: wireless networks • The GeoPKDD – KDubiq scenario • From the analysis of the traces of our mobile phones it is possible to reconstruct our mobile behaviour, the way we collectively move • This knowledge may help us improve decision-making on mobility-related issues: • planning traffic and public mobility systems in metropolitan areas • planning physical communication networks • localizing new services in our towns • forecasting traffic-related phenomena • organizing logistics systems • avoiding repeated mistakes • timely detecting changes

  38. Opportunities and threats • Knowledge may be discovered from the traces left behind by mobile users in the information systems of wireless networks. • Knowledge, in itself, is neither good nor bad. • What knowledge to be searched from digital traces? For what purposes? • Which eyes to look at these traces with?

  39. The Spy and the Historian • The malicious eyes of the Spy (or the detective) aim at discovering the individual knowledge about the behaviour of a single person (or a small group), for surveillance purposes. • The benevolent eyes of the Historian (or the archaeologist, or the human geographer) aim at discovering the collective knowledge about the behaviour of whole communities, for the purpose of analysis, of understanding the dynamics of these communities, the way they live.

  40. The privacy problem • The donors of the mobility data are we citizens ourselves • Making these data available, even for analytical purposes, would put at risk our own privacy, our right to keep secret: • the places we visit • the places we live or work at • the people we meet • ...

  41. The naive scientist's view (1) • Knowing the exact identity of individuals is not needed for analytical purposes • Anonymous trajectories are enough to reconstruct aggregate movement behaviour, pertaining to groups of people • Is this reasoning correct? Can we conclude that the analyst, while working in the public interest, runs no risk of inadvertently jeopardizing the privacy of individuals?

  42. Unfortunately not! • Hiding identities is not enough. • In certain cases, it is possible to reconstruct the exact identities from the released data, even when identities have been removed and replaced by pseudonyms. • A famous example of re-identification by L. Sweeney

  43. Re-identifying "anonymous" data (Sweeney '01)
  • She purchased the voter registration list for Cambridge, Massachusetts: 54,805 people
  • 69% were unique on postal code and birth date
  • 87% of the US population is unique on all three (ZIP + birth date + sex)
  • Solution: k-anonymity: any combination of quasi-identifier values appears at least k times (see the sketch below)
  • Developed systems that guarantee k-anonymity while minimizing the distortion of results
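A toy sketch of enforcing k-anonymity by generalization, here by suppressing trailing ZIP digits until every quasi-identifier combination occurs at least k times. This is an illustration under simplifying assumptions (a single generalization dimension, dict records), not Sweeney's actual system:

```python
from collections import Counter

def k_anonymize_zip(records, k):
    """Generalize the ZIP code (suppress trailing digits) until every
    (zip, birth_year, sex) combination occurs at least k times.
    Returns the generalized records and the number of digits kept."""
    for kept in range(5, -1, -1):
        generalized = [{**r, "zip": r["zip"][:kept] + "*" * (5 - kept)}
                       for r in records]
        groups = Counter((r["zip"], r["birth_year"], r["sex"])
                         for r in generalized)
        if min(groups.values()) >= k:
            return generalized, kept
    return generalized, 0  # even full ZIP suppression may not reach k
```

Real anonymizers search generalization hierarchies over several attributes at once and pick the solution that minimizes distortion, as the slide says.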

  44. Private Information in Publicly Available Data • [Diagram: sensitive information in a medical research database, side by side with publicly available data]

  45. Linkage attack: link private information to a person • [Diagram: joining released records to identified records on quasi-identifiers] • "Victor is the only person born 08-02-57 in the area of 07028… Ha, he has a history of stroke!"

  46. Sweeney’s experiment • Consider the governor of Massachusetts: • only 6 persons had his birth date in the joined table (voter list), • only 3 of those were men, • and only … 1 had his own ZIP code! • The medical records of the governor were uniquely identified from legally accessible sources!
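The attack is essentially a join on the quasi-identifiers. A toy reconstruction in Python, reusing the hypothetical "Victor" example of slide 45 (all names and values are invented for illustration):

```python
voters = [  # public, identified data (e.g., a voter registration list)
    {"name": "Victor", "zip": "07028", "birth_date": "1957-08-02", "sex": "M"},
    {"name": "Vera",   "zip": "07028", "birth_date": "1962-01-15", "sex": "F"},
]
medical = [  # "anonymous" data: direct identifiers removed, QIs kept
    {"zip": "07028", "birth_date": "1957-08-02", "sex": "M",
     "diagnosis": "stroke"},
]

def linkage_attack(identified, anonymous):
    """Re-identify records by joining the two tables on quasi-identifiers."""
    qi = lambda r: (r["zip"], r["birth_date"], r["sex"])
    index = {}
    for r in identified:
        index.setdefault(qi(r), []).append(r["name"])
    for r in anonymous:
        names = index.get(qi(r), [])
        if len(names) == 1:  # unique match: the record is re-identified
            print(f"{names[0]} -> {r['diagnosis']}")

linkage_attack(voters, medical)  # prints: Victor -> stroke
```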

  47. The naive scientist's view (2) • Why use quasi-identifiers at all, if they are dangerous? • A brute-force solution: replace identities or quasi-identifiers with totally unintelligible codes • Aren't we safe now? • No! Two examples: • the AOL August 2006 crisis • movement data
