
Class 3 – April 6, 2012


Presentation Transcript


  1. Part 1: IT Policies – Privacy Part 2: IT Policies – Privacy – Personally Identifiable Information Class 3 – April 6, 2012

  2. Privacy concerns • National Association of State Chief Information Officers (NASCIO): • Privacy is a particularly daunting challenge for state governments, because citizens have an expectation of openness and transparency. Yet, at the same time, states must foster citizens' trust by ensuring that their private information remains that way.

  3. Privacy concerns • Privacy issues are pervasive in e-government • Governments gather large amounts of private data (e.g., Social Security information, health information, driver's license records) • Once collected, data can be mined (i.e., patterns or habits can be identified) – most commonly for security purposes (terrorism threats) • Local governments have reportedly lost data on private citizens or unknowingly published it • 104 military and government breaches in 2010 – 1.9 million personal records released • 2009 – 79.4 million records released

  4. Privacy concerns • Business still accounted for most breaches: • Business – 42.1% • Medical and healthcare facilities – 24.2% • Federal/State agencies and military – 15.7% • Educational institutions – 9.8% • Banking industry – 8.2% Source – Nextgov.com at: http://www.nextgov.com/nextgov/ng_20110107_8262.php

  5. Computer Surveillance • Mass surveillance was once impractical because of the cost and effort required to carry it out • The central issue of electronic surveillance is how the laws governing surveillance are used and enforced. • Do law enforcement agencies follow the traditional model of investigation after a crime, or do they use technology for surveillance in an attempt to prevent crime? • Traditional model: • Evidence of crime obtained • Investigation ensues • Warrant sought from judge for surveillance of particular individuals for good cause

  6. Computer Surveillance • Traditional model altered by electronic surveillance techniques. • Lyon (2002) – “surveillance as social sorting” – online profiling, smart cards, biometrics, and closed-circuit television create a new model of law enforcement. • New model: • Law enforcement has no evidence of a crime but has an interest in a particular type of crime and knowledge of its indicators • Mass surveillance looking for indicators – no warrant required • Social sorting (filtering and profiling) to identify specific suspects who become targets of more intensive surveillance – a warrant still may not be required under the Patriot Act

  7. Computer Surveillance • Technological Determinists – warranted surveillance replaced by mass unwarranted surveillance through the force of technology alone. • Panopticon concept – complete compliance with rules due to total surveillance • Ideal prison where compliance is guaranteed by inescapable surveillance – clear view of every inmate – Jeremy Bentham and Michel Foucault • Privacy is an issue because people have good reason to believe that data collected on them for one purpose may be appropriated and used for altogether different purposes.

  8. Computer Surveillance • Employees generally do not have privacy rights at work • Agency policies clearly define employees' rights and the lack of privacy with respect to activities conducted on agency computer systems • Splash screens are used to remind employees at each login

  9. Privacy Legislation • Katz v. United States (1967) • Long-term surveillance was a violation of the Fourth Amendment • Short-term surveillance generally met the test of constitutionality if prior judicial approval was obtained • Privacy Act, 1974 [amended: Computer Matching and Privacy Protection Act, 1988] • Regulates Federal agencies’ record keeping and disclosure practices. • Individuals can seek access to Federal agency records about themselves. • Stated purpose: Requires that agencies obtain information directly from the subject and that information gathered for one purpose may not be used for another purpose • Civil remedies for individuals whose rights may have been violated. • Provides that the subject may challenge the accuracy of information.

  10. Privacy Legislation • Privacy Act, 1974 [amended: Computer Matching and Privacy Protection Act, 1988] (continued) • Requires that each Federal agency publish a description of each system of records maintained by the agency that contains personal information. • Restricts the disclosure of personally identifiable information • Case of Terry Dean Rogan: his identity was stolen by a state prison escapee, and he was arrested five times because his identity was associated with the criminal. Not unique – quite a few similar situations have occurred. He ultimately sued and was compensated, and the National Crime Information Center database was updated with a field indicating the use of a stolen identity, to prevent future occurrences. Lesson – sometimes too little information is the problem rather than too much. • Some agencies, such as the IRS, Census Bureau, and Social Security Administration, are specifically prohibited by law from disseminating individual-level information. At the state level, the same applies to the Department of Revenue (DOR). • Exceptions exist for publicizing tax cheats, pedophiles, sex offenders, criminal records, etc. Some are not necessarily statutory, but are generally accepted as exceptions.

  11. Privacy Legislation • Communications Assistance for Law Enforcement Act of 1994 (CALEA) • Intended to preserve the ability of law enforcement to conduct electronic surveillance by requiring that telecommunications carriers and manufacturers modify and design their equipment, facilities, and services to ensure they have the necessary surveillance capabilities. • Conduct lawfully authorized electronic surveillance while preserving public safety, the right to privacy, and telecom competitiveness • Requires telecommunications carriers to ensure: • Expeditious isolation and interception of communications content; • Expeditious isolation of and access to call-identifying information; • Delivery of communications content and call-identifying information; • Unobtrusive interception of and access to call-identifying information; • Protection of the privacy and security of communications not authorized to be intercepted. • Telecom carriers: common carriers, broadband providers, and VoIP providers

  12. Privacy Legislation • Patriot Act, 2001 • Enables the government to monitor telephone and e-mail communications and medical, financial, and other records • Also partially repealed laws against domestic spying and allowed the government to monitor Web surfing, obtain records from ISPs, and use roving wiretaps to monitor phone calls. NOT limited to terrorism: • Can monitor legitimate protest groups • Monitor computer network traffic without a court order • Take DNA from anyone convicted of a crime of violence (e.g., scuffling in a protest march) • Wiretap anyone SUSPECTED of violating the Computer Fraud and Abuse Act • Authorizes “sneak and peek” search warrants for any federal crime, including misdemeanors. Officers can enter private premises without informing occupants or obtaining permission, and do not have to inform absent occupants that a search was conducted. • Essentially, the Patriot Act applies the lower standards of privacy under the Foreign Intelligence Surveillance Act domestically to U.S. citizens

  13. Privacy Legislation • Patriot Act, 2001 – continued • 763 sneak and peek warrants in 2008 • 3 issued in relation to alleged terrorist offenses • 62% to investigate drug-trafficking offenses

  14. Agency Data Sharing and Matching • Some agencies are specifically prohibited from disclosing individual-level data (US Census Bureau and IRS) • Organization for Economic Co-operation and Development Code of Information Practices • Collection Limitation Principle - Limits on collection of personal data; should be obtained by lawful and fair means; where possible, with the consent of the subject. • Data Quality Principle – personal data should be relevant to the purpose for which it is collected, and should be accurate, complete, and kept up to date. • Purpose Specification Principle - Purpose of personal data collection should be specified at the time of data collection and subsequent use limited to those purposes or compatible purposes as specified on each change of purpose. • Use Limitation Principle - Personal data should not be disclosed, made available, or otherwise used for purposes other than those specified in the Purpose Specification Principle unless the consent of the subject is obtained or unless required under authority of law.

  15. Agency Data Sharing and Matching • Organization for Economic Co-operation and Development Code of Information Practices (continued) • Security Standards Principle - Personal data should be protected by reasonable security safeguards • Openness Principle - Policy of openness about developments, practices, and policies related to personal data. Ability to easily establish existence and nature of personal data, purpose of use, and identity and residence of individual responsible for control of the data. • Individual Participation Principle - Individual should be able to obtain confirmation whether or not controller has data relating to him; have the data provided to him at reasonable cost; be able to challenge any denial; and be able to challenge data related to him. • Accountability Principle - Data controller should be accountable for complying with above measures.

  16. Privacy Impact Statements • Federal agencies are required to post a privacy impact statement • Some countries require privacy impact studies and statements in conjunction with the creation of new IT projects • Canada is a leader in this effort • OMB Guidelines for Privacy Impact • What information is to be collected? • Why is the information collected and who will be affected? • What notice or opportunities for consent are provided? • What security protocols are in place? • Does this program create a new system of records under the Privacy Act? • What is the intended use of the information?

  17. Privacy Impact Statements • OMB Guidelines for Privacy Impact (continued) • Will the information be retained, and for what period? • How will the public be able to seek redress? • What databases will names be run against? • Privacy effects and mitigation measures? • Beginning in FY 2005, all federal agencies were required to submit privacy assessments of major IT systems with annual business case submissions.

  18. The National ID Controversy • National ID cards have been suggested as a solution for better security at airports and other public facilities, reduction of voter fraud, and identity theft • There has traditionally been resistance to the idea due to negative historical connotations associated with totalitarian regimes • Real ID Act, 2005 [http://www.ncsl.org/standcomm/sctran/Realidsummary05.htm] • Uniform federal guidelines on driver's license/identification (DL/ID) standards and issuance procedures • DL/ID standards: At a minimum, a state shall include the following: (1) person’s full legal name, (2) person’s date of birth, (3) person’s gender, (4) DL/ID number, (5) digital photograph, (6) person's address of legal residence, (7) person’s signature, (8) physical security features designed to prevent tampering, counterfeiting, or duplication for fraudulent purposes, and (9) a common machine-readable technology with defined data elements

  19. The National ID Controversy • Real ID Act, 2005 (continued) • DL/ID issuance procedures: ID is issued based on: (1) A photo-identity document (except that a non-photo identity document is acceptable if it includes both the person’s full legal name and date of birth); (2) Documentation showing the person’s date of birth; (3) Proof of the person’s social security account number (SSN) or verification that the person is not eligible for an SSN; (4) Documentation showing the person’s name and address of principal residence

  20. The National ID Controversy • Kent and Millett (2002) list numerous policy problems associated with implementation of a national ID system • How intrusive will national IDs be? Just for authentication, or will data be retained to track transactions? Required for commercial transactions? • Who could use the data? Agencies? Corporations? Individuals? • Would it be mandatory or voluntary? • What rights would exist to see your data and have it corrected? • What penalties would exist for abuse of the system? • How could we prevent forgeries, given current forgery capabilities (currency and passports)? • Little evidence that national ID cards have had an impact on preventing attacks where used. Terrorists have used tourist visas (9/11) or have had legitimate ID cards (Madrid bombings).

  21. Other Privacy issues • Outsourcing • A major source of loss of privacy comes from the commercial sector – private corporations trade SSNs, purchasing pattern information, and many other types of personal information gathered from the Internet and other sources • Privatization • IT makes the commoditization of personal information relatively easy • Private sector data mining • Credit card companies and other companies (e.g. Amazon) track spending behavior. • Rare to see cases against corporations for privacy violations. Corporations do with impunity what government cannot do.

  22. Part 2: IT Policies – Privacy – Personally Identifiable Information Class 3 – April 6, 2012

  23. Personally Identifiable Information • Any information about an individual maintained by an agency, including: • Any information that can be used to distinguish or trace an individual’s identity, e.g., name, Social Security number • Any information that is linked or linkable to an individual, e.g., medical, educational, or employment information • “Linked” information is that which is logically associated with other information about the individual • “Linkable” information is information for which there is a possibility of logical association

  24. Personally Identifiable Information • Example of linked and linkable: • PII exists on two databases, so someone with access to both may be able to link the data. If the secondary information is on the same system or related system and does not have security to segregate the two databases, then they are linked. If the secondary data is remote or available in public records, or is otherwise easily obtainable, then the information is linkable. • Source of information on PII – NIST Special Publication 800-122, Guide to Protecting the Confidentiality of Personally Identifiable Information (PII)
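To make the linked/linkable distinction concrete, the hedged sketch below joins two separately held datasets on a shared quasi-identifier, showing how "linkable" data becomes "linked" once both sources are available. It is not taken from NIST SP 800-122; all field names and records are invented for illustration.

```python
# Hypothetical illustration: two separately held datasets become "linked"
# once someone can join them on a shared quasi-identifier (here, date of
# birth + ZIP code). All names, fields, and records are invented.

benefits_db = [  # agency system: identity plus benefit status
    {"name": "Jane Q. Public", "dob": "1970-03-14", "zip": "32301", "benefit": "retirement"},
    {"name": "John A. Doe",    "dob": "1982-11-02", "zip": "32399", "benefit": "disability"},
]

public_records = [  # remote/public source: no names, but the same quasi-identifiers
    {"dob": "1970-03-14", "zip": "32301", "property_value": 245_000},
]

def link(records_a, records_b, keys=("dob", "zip")):
    """Join two datasets on shared quasi-identifier fields."""
    index = {tuple(r[k] for k in keys): r for r in records_b}
    linked = []
    for r in records_a:
        match = index.get(tuple(r[k] for k in keys))
        if match:                      # a successful join turns "linkable" into "linked"
            linked.append({**r, **match})
    return linked

if __name__ == "__main__":
    for row in link(benefits_db, public_records):
        print(row)   # name, benefit status, and property value now tied together
```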

  25. Personally Identifiable Information • Examples of PII Data • Names • Personal identification numbers • Address information • Telephone information • Personal characteristics (fingerprints, biometrics) • Information regarding personally owned property • Information that is linkable through the use of any of the above PII

  26. Aggregating PII • Services exist that make it very simple to pull together a tremendous amount of personally linked data once sufficient information exists to identify the individual • Information available from data aggregators: • Names (all) used and Social Security numbers; names of others using that Social Security number • Address summary going back many years, with demographic data for each address • Bankruptcy information, liens and judgments, and UCC filings • Phones utilized, including cell phones • Companies owned and associates at work • Driver’s license information and history • Possible properties owned • Motor vehicles registered and watercraft owned • FAA certifications and aircraft owned • Possible criminal records and sexual offenses • Automobile accident details • Professional licenses • Voter registration, hunting permits, concealed weapons permits • Possible associates • Possible relatives • Neighbors • The better ones are not free and do require some level of authorization to use – however, private investigators and bill collectors can get access! • Just using free resources can result in obtaining much of the same information available through the aggregators • Using Accurint (or a similar service) along with free resources multiplies the data available

  27. PII Impact Levels • Low – limited adverse effect – minor loss to individual or organization – having to change your phone number • Moderate – serious adverse effect – significant financial loss or significant harm but not loss of life. Identity theft, public humiliation • High – severe or catastrophic adverse effect on organizational operations, assets or individuals – major financial loss; severe or catastrophic harm to individuals involving loss of life or life-threatening injuries

  28. Factors for Determining PII Confidentiality Impact Levels • Factors will vary by organization based on mission and nature of PII maintained • Identifiability - How easily can PII be linked to an individual? Some data directly identifies individuals (and the data linked to them); other data can be used to significantly narrow large datasets and make identification more likely. • Quantities of PII - very small vs. very large datasets represent differing levels of risk. You cannot ignore privacy considerations for small datasets, but the impact level will generally be higher for datasets containing large numbers of records. • Data Field Sensitivity - must evaluate each field separately, plus the sensitivity of all fields together. An SSN or financial data is more sensitive than a telephone number. Data can be sensitive in ways other than its intended use, e.g., a mother’s maiden name can be used for authentication in password recovery
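NIST SP 800-122 does not prescribe a formula, but one rough way to picture the "evaluate each field separately, plus all fields together" idea is a high-water-mark check like the hypothetical sketch below. The field ratings, quantity threshold, and bump-up rule are assumptions made purely for illustration, not NIST guidance.

```python
# Hypothetical high-water-mark sketch: rate each field's sensitivity, then let
# the most sensitive field (and the overall record count) drive the dataset's
# provisional impact level. Ratings and thresholds are illustrative only.

FIELD_SENSITIVITY = {          # assumed per-field ratings
    "name": "low",
    "phone": "low",
    "mothers_maiden_name": "moderate",   # sensitive because of password-recovery use
    "ssn": "high",
    "financial_account": "high",
}

LEVELS = ["low", "moderate", "high"]

def dataset_impact(fields, record_count):
    """Return a provisional confidentiality impact level for a dataset's PII."""
    # Per-field check: take the highest sensitivity present (high-water mark).
    level = max((FIELD_SENSITIVITY.get(f, "low") for f in fields), key=LEVELS.index)
    # Quantity check: very large datasets bump the level one step (illustrative rule).
    if record_count > 100_000 and level != "high":
        level = LEVELS[LEVELS.index(level) + 1]
    return level

if __name__ == "__main__":
    print(dataset_impact(["name", "phone"], 500))                    # low
    print(dataset_impact(["name", "phone"], 250_000))                # moderate (quantity bump)
    print(dataset_impact(["name", "ssn", "financial_account"], 50))  # high
```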

  29. Factors for Determining PII Confidentiality Impact Levels • Context of Use - purpose for which information is collected, stored, used, processed, disclosed, or disseminated. • Examples include eligibility for benefits, tax administration, and law enforcement. Simple disclosure that information is being collected might in itself be dangerous. Consider three lists, each containing name, address and phone number. The first is subscribers to a newsletter; the second people who have applied for retirement benefits; the third undercover law enforcement agents. Same information, very different impact levels. • Obligations to protect confidentiality - Obligations vary by organization based on the laws applicable to that organization’s PII activity. IRS data, for example, is subject to extremely strict confidentiality requirements. • Access to and location of PII - How many people have access? Is information accessible using mobile devices? Is information regularly transported offsite, say on a laptop? Is information available online?

  30. Operational Safeguards • Policy and Procedure Creation • Access rules for PII within the system - just because the information exists in an agency database does not mean everyone within that agency should have access. • PII retention schedules and procedures - Data should not be kept indefinitely. When it has served its purpose it should be purged. • PII incident response and data breach notification - Data incidents represent serious problems for an agency. Response and notification planning is crucial so that any damage can be contained quickly.
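As an illustration of the retention-schedule point above, the minimal sketch below purges records whose retention period has expired. It assumes a SQLite store; the table name, column names, and seven-year period are hypothetical, and a real purge would be driven by the agency's approved records-retention schedule.

```python
# Hypothetical retention-purge sketch using the standard-library sqlite3 module.
# Table, columns, and the retention period are invented for illustration.

import sqlite3
from datetime import datetime, timedelta

RETENTION_DAYS = 7 * 365   # assumed seven-year retention period

def purge_expired(conn):
    """Delete PII records collected before the retention cutoff."""
    cutoff = (datetime.utcnow() - timedelta(days=RETENTION_DAYS)).isoformat()
    with conn:   # commits on success, rolls back on error
        cur = conn.execute(
            "DELETE FROM citizen_records WHERE collected_at < ?", (cutoff,)
        )
    return cur.rowcount   # number of records purged, useful for the audit log

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE citizen_records (id INTEGER, collected_at TEXT)")
    conn.execute("INSERT INTO citizen_records VALUES (1, '2001-01-01T00:00:00')")
    conn.execute("INSERT INTO citizen_records VALUES (2, ?)",
                 (datetime.utcnow().isoformat(),))
    print(purge_expired(conn), "record(s) purged")   # expect 1
```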

  31. Operational Safeguards • Policy and Procedure Creation (continued) • Privacy in the system development life cycle process - Data obtained during the development of IT systems may be available to contractors as well as employees. Protection of data during development and data conversion activities is just as important as after the implementation, and data may be easier to steal during development. • Limitation of collection, disclosure, sharing and use of PII - Do not collect anything that is not specifically needed; do not disclose or share any data without proper authorization and demonstrated need. • Consequences for failure to follow policy - without consequences there is little to deter sloppy information protection.

  32. Operational Safeguards • Awareness, training, and education • Awareness training designed to change behavior or reinforce PII practices. Focuses attention on protection of PII • Training builds knowledge and skills to enable staff to protect PII • Education builds a common body of knowledge covering all specialties and aspects of PII protection

  33. Topics for PII Training • The definition of PII • Applicable privacy laws, regulations, and policies • Restrictions on data collection, storage, and use of PII • Roles and responsibilities for using and protecting PII • Appropriate disposal of PII • Sanctions for misuse of PII • Recognition of a security or privacy incident involving PII • Retention schedules for PII • Roles and responsibilities in responding to PII-related incidents and reporting

  34. Privacy-Specific Safeguards • Minimizing the use, collection, and retention of PII • Basic privacy principle • What does the organization need to fulfill its mission? “Minimum necessary principle” • When no longer relevant – dispose of securely • Previously discussed Privacy Impact Assessments • De-identifying information – e.g., remove identifiers before providing data to researchers, using a protected and secured algorithm that can re-link the data when necessary
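One common way to implement the re-linkable de-identification described above is keyed pseudonymization: replace each identifier with a keyed hash and keep the key and lookup table under separate protection so authorized staff can re-link when necessary. The sketch below is a minimal illustration under those assumptions, not the specific algorithm the slide refers to; the key handling is deliberately simplified.

```python
# Minimal keyed-pseudonymization sketch (illustrative only). A real system would
# keep the key in a protected key store and restrict re-linking to authorized use.

import hmac
import hashlib

SECRET_KEY = b"store-me-in-a-key-vault"   # placeholder; never hard-code in practice

def pseudonymize(identifier: str) -> str:
    """Replace an identifier (e.g., an SSN) with a stable keyed hash."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

def deidentify(records, id_field="ssn"):
    """Strip the direct identifier, keep a pseudonym, and return a re-link table."""
    relink_table = {}                      # pseudonym -> original identifier (protect this)
    cleaned = []
    for rec in records:
        pseudo = pseudonymize(rec[id_field])
        relink_table[pseudo] = rec[id_field]
        cleaned.append({**{k: v for k, v in rec.items() if k != id_field},
                        "subject_id": pseudo})
    return cleaned, relink_table

if __name__ == "__main__":
    data = [{"ssn": "123-45-6789", "diagnosis": "A12.3"}]
    research_copy, relink = deidentify(data)
    print(research_copy)                             # no SSN, only a stable pseudonym
    print(relink[research_copy[0]["subject_id"]])    # authorized re-link recovers the SSN
```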

  35. Privacy-Specific Safeguards • Anonymizing information – de-identified information for which no algorithm for re-identification exists. Anonymizing to ensure the inability to re-identify: • Generalizing the information – less precise and grouped • Suppressing the data – deleting entire records or parts of records • Introduction of noise – adding small amounts of variation to the data • Swapping the data – exchanging certain information from one record with another, e.g., zip code fields • Replacing the data with an average value • Anonymized data is very useful for systems testing and development. Randomly generated data tends not to share a realistic distribution and may not represent a proper test of the system.
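The hedged sketch below applies several of the techniques listed above (suppression, generalization, noise, and swapping) to a tiny invented dataset. The field names, ZIP truncation length, age bands, and noise range are assumptions chosen only to illustrate each transform, not a recommended anonymization scheme.

```python
# Illustrative anonymization transforms on an invented dataset.
# Parameters (ZIP truncation, age bands, noise range) are arbitrary choices.

import random

records = [
    {"name": "Jane Q. Public", "age": 47, "zip": "32301", "income": 61_250},
    {"name": "John A. Doe",    "age": 33, "zip": "32399", "income": 48_900},
]

def anonymize(rows):
    rows = [dict(r) for r in rows]                 # work on a copy
    for r in rows:
        r.pop("name")                              # suppression: drop direct identifiers
        r["zip"] = r["zip"][:3] + "XX"             # generalization: less precise ZIP
        r["age"] = (r["age"] // 10) * 10           # generalization: 10-year age bands
        r["income"] += random.randint(-500, 500)   # noise: small random perturbation
    # swapping: exchange ZIP codes between two randomly chosen records
    i, j = random.sample(range(len(rows)), 2)
    rows[i]["zip"], rows[j]["zip"] = rows[j]["zip"], rows[i]["zip"]
    return rows

if __name__ == "__main__":
    for row in anonymize(records):
        print(row)
```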
