Design of Health Technologies lecture 20

Presentation Transcript


1. Design of Health Technologies, lecture 20
John Canny, 11/21/05

2. Health Care Privacy
Health care privacy is an obvious concern for most people as we inch toward computerization and out-sourcing. HIPAA (the Health Insurance Portability and Accountability Act) was enacted in 1996. There are actually several related HIPAA regulations:
• HIPAA Privacy rule (1996)
• HIPAA Security rule (1998)
• HIPAA Title II “Administrative Simplification”, including the transactions and code sets rules (part of the 1996 Act)

3. HIPAA data
HIPAA protects “Protected Health Information” (PHI):
• Information created or received by a provider, employer, etc.
• Information that relates to the past, present or future physical or mental health or condition of an individual.
• Information about the health care of an individual.
• Information about payments relating to an individual.
TPO (Treatment, Payment, healthcare Operations) covers the normal uses of PHI in the course of providing care.

4. HIPAA Aug. 2002 changes (NPRM)
Under the Bush administration in 2002, a number of changes were made through an NPRM (Notice of Proposed Rulemaking). In general, the NPRM reduces the burdens on care providers to protect privacy and narrows the situations in which privacy regulations apply. In particular, the original HIPAA regulations required written patient consent for TPO uses. Post-NPRM, written consent is not required for TPO use of PHI.

5. HIPAA allowed uses
• Disclosure to the individual whom the PHI is about.
• With individual consent or a legal agreement related to carrying out TPO.
• Without individual authorization for TPO, with some exceptions.
PHI can usually be shared with partner organizations in the course of providing TPO. PHI can generally be used for other purposes if it is “de-identified”, i.e. information that would allow tracing to the individual has been removed.

6. HIPAA exceptions
It is OK to use PHI for these purposes without individual consent or an opportunity to agree or object:
• Quality assurance
• Emergencies or concerns about public health & safety
• Suspected abuse of the individual
• Research**
• Judicial and administrative proceedings
• Law enforcement
• Next-of-kin information
• Government health data and specialized functions
• Workers’ compensation
• Organ donation

7. HIPAA exceptions
• Identification of the deceased, or determination of cause of death
• Financial institution payment processing
• Utilization review
• Credentialing
• When mandated by other laws
• Other activities that are part of ensuring appropriate treatment and payment

8. Privacy requests
• An individual may request restriction of use of their PHI in the course of TPO.
• However, a provider is not required to accept those restrictions.
• If it does agree, it is legally bound by that agreement.

9. Compliance Dates
• Healthcare providers: April 14, 2003
• Health plans:
  • Large: April 14, 2003
  • Small: April 14, 2004
• Healthcare clearinghouses: April 14, 2003

10. Privacy Technology
• Protecting medical privacy involves the usual set of access-control problems, i.e. most of traditional cryptography and secure systems design.
• It also poses some new challenges that are relatively well-defined:
• “Private” data mining of PHI: analysis of PHI records without actual access to, or risk of exposing, the personally identifying information.
• Analysis of identifiability and de-identification: it is non-trivial to determine how identifiable an individual is from particular information about them.

11. Techniques - Anonymization
• Anonymization is an umbrella term for a set of techniques that separate data records from individual users.
• For PHI, an important concept is “linkability”: the potential linkage between a person and their information.
[Diagram: a record (Age: 35, Weight: 170, Alcohol use, Diabetes?, Allergies) linked to “Joe Smith”]

12. Techniques - Anonymization
• One can break the link and replace it with a pseudonym, which still allows second-order statistical analyses.
• But unfortunately, even a modest amount of personal medical history may be unique to a patient, i.e. in many cases you can figure out who owns a pseudonymized record.
[Diagram: the same record, now labeled “Pseudo: 3143231” instead of “Joe Smith”]
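
To make linkability concrete, here is a minimal sketch in Python, assuming hypothetical patient records, a salted SHA-256 hash as the pseudonym scheme, and (age, weight) as the quasi-identifiers; all names, fields, and values are invented for illustration:

    import hashlib
    from collections import Counter

    # Hypothetical toy records: (name, age, weight, diagnosis).
    records = [
        ("Joe Smith", 35, 170, "diabetes"),
        ("Ann Jones", 35, 120, "asthma"),
        ("Bob Brown", 62, 170, "diabetes"),
        ("Eve White", 35, 170, "asthma"),
    ]

    SALT = b"keep-this-secret"  # must stay secret, or pseudonyms can be recomputed

    def pseudonymize(name: str) -> str:
        # Replace the identity with a stable pseudonym, so repeated records
        # from the same patient can still be linked to each other.
        return hashlib.sha256(SALT + name.encode()).hexdigest()[:8]

    table = [(pseudonymize(n), age, wt, dx) for n, age, wt, dx in records]

    # Linkability check: count how many records share each (age, weight)
    # combination. A count of 1 means that record is unique on those
    # quasi-identifiers, so outside knowledge of a person's age and weight
    # is enough to re-identify their "anonymous" record.
    combos = Counter((age, wt) for _, age, wt, _ in table)
    for pseud, age, wt, dx in table:
        if combos[(age, wt)] == 1:
            print(f"{pseud} is unique on (age={age}, weight={wt}) -> linkable")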

13. Techniques – Statistical perturbation
• One can also change the actual values in a patient's record by adding random “noise”, so that the actual values are hard to determine.
• Advantages:
  • Easy to do.
  • Relatively easy to analyze the effects of perturbation.
• Disadvantages:
  • Trades privacy for accuracy, and may do a poor job at both.
  • Requires large populations for accurate statistical aggregates.
  • Doesn’t work for all types of aggregates.
  • If perturbed versions of the same record are released repeatedly with fresh random offsets, their average converges to the patient's actual data. How, then, should patient data be updated?
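
A small sketch of what perturbation looks like in practice (the weight field, noise scale, and population are invented for illustration); it shows both the accurate large-population aggregate and the repeated-release failure from the last bullet:

    import random

    random.seed(0)
    true_weights = [random.gauss(150, 20) for _ in range(10_000)]

    def perturb(x, noise_scale=25.0):
        # Add zero-mean random noise so an individual's value is hard to recover.
        return x + random.gauss(0, noise_scale)

    released = [perturb(w) for w in true_weights]

    # Aggregates over a large population stay roughly accurate:
    print(sum(true_weights) / len(true_weights))  # ~150
    print(sum(released) / len(released))          # also ~150

    # ...but repeatedly releasing ONE record with fresh noise each time
    # lets an observer average the noise away:
    joes_weight = 170.0
    repeats = [perturb(joes_weight) for _ in range(10_000)]
    print(sum(repeats) / len(repeats))            # converges back to ~170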

14. Private Computation
[Diagram: boundary-aware private computation. A boundary is drawn through “all user data”, separating the information that the service provider needs from the information the user wants to keep private.]

15. Private Computation
• Example: e-voting algorithms. User data is obfuscated before going to the server, and the server can compute only the final result.
[Diagram: users U1–U4 send obfuscated data to server S1]

17. Private Arithmetic
There are two approaches:
• Homomorphism: user data is encrypted with a public-key cryptosystem. Arithmetic on the encrypted data mirrors arithmetic on the original data, but the server cannot decrypt partial results.
• Secret-sharing: the user sends shares of their data to several servers, so that no small group of servers gains any information about it.
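
The lecture does not fix a particular cryptosystem, so as one hedged illustration of the homomorphism idea, here is a toy Paillier-style additively homomorphic scheme; the primes are far too small for real security and are chosen only so the example runs instantly:

    import math
    import random

    # Toy key generation (10007 and 10009 are primes; real keys are ~1024 bits).
    p, q = 10007, 10009
    n = p * q
    n2 = n * n
    g = n + 1
    lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)  # lcm(p-1, q-1)

    def encrypt(m: int) -> int:
        # Enc(m) = g^m * r^n mod n^2, with r random and coprime to n.
        r = random.randrange(2, n)
        while math.gcd(r, n) != 1:
            r = random.randrange(2, n)
        return (pow(g, m, n2) * pow(r, n, n2)) % n2

    def decrypt(c: int) -> int:
        L = lambda x: (x - 1) // n
        mu = pow(L(pow(g, lam, n2)), -1, n)  # modular inverse; Python 3.8+
        return (L(pow(c, lam, n2)) * mu) % n

    a, b = 42, 58
    ca, cb = encrypt(a), encrypt(b)
    # The homomorphic property: multiplying ciphertexts adds plaintexts,
    # so a server can accumulate sums it cannot read.
    assert decrypt((ca * cb) % n2) == (a + b) % n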

18. Challenges
• Addition is easy with either method; multiplication is possible but very tricky in practice.
• Homomorphism is expensive (10,000x more than normal arithmetic).
• Secret-sharing is essentially free, but requires several servers.
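
To see why secret-sharing is “essentially free”, here is a minimal two-server additive-sharing sketch (the modulus and inputs are arbitrary illustrative choices); sharing and aggregation cost only a few modular additions per value:

    import random

    P = 2**61 - 1                      # a Mersenne prime; all arithmetic is mod P

    def share(x: int):
        # Split x into two shares that look uniformly random individually
        # but satisfy s1 + s2 = x (mod P).
        s1 = random.randrange(P)
        return s1, (x - s1) % P

    values = [42, 17, 99, 3]           # each user's private input
    server1, server2 = [], []
    for v in values:
        s1, s2 = share(v)
        server1.append(s1)             # server 1 sees only random-looking shares
        server2.append(s2)             # and so does server 2

    # Each server publishes only the sum of its shares; combining the two
    # subtotals reveals the total but no individual value.
    total = (sum(server1) + sum(server2)) % P
    assert total == sum(values)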

19. A hybrid approach
• We proposed using secret-sharing for privacy, and homomorphic computation to validate user data.
• Secret-sharing works over normal (32- or 64-bit) integers and is very fast.
• Homomorphism uses large (1024-bit) integers, but with randomization we only need to do O(log n) homomorphic operations.
• The result is a method for vector addition with validation that runs in a few seconds.
• It’s not obvious, but vector addition is the building block for many (perhaps most) numerical data-mining tasks.
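
The validation machinery of the hybrid scheme is not reproduced here; the sketch below only illustrates the last bullet, that a private vector-addition primitive is enough to compute common aggregates (the fields and values are invented, and a plain sum stands in for the protocol's output):

    # Each user encodes their data as a small vector: [age, weight, 1].
    # In the real protocol each vector would be secret-shared as above;
    # here the plain sum stands in for the private vector addition.
    user_vectors = [
        [35, 170, 1],
        [52, 140, 1],
        [41, 155, 1],
        [29, 182, 1],
    ]

    total = [sum(col) for col in zip(*user_vectors)]  # the only revealed value

    count = total[2]
    print("mean age:   ", total[0] / count)
    print("mean weight:", total[1] / count)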

20. Secret-sharing
• For secret-sharing, you need at least two servers that will not collude. Where do these come from?

21. P4P: Peers for Privacy
• In P4P, a group of users elects some “privacy providers” within the group.
• Privacy providers provide privacy when they are available, but can’t access data themselves.
[Diagram: a peer group of users (U) with two privacy providers (P), connected to a server (S)]

22. P4P
• The server provides data archival and synchronizes the protocol.
• The server communicates with privacy peers only occasionally (e.g. once per day at 2 AM).
[Diagram: the same peer group and server as on the previous slide]

23. Discussion Questions
HIPAA still gets mixed reactions from providers and patients. Discuss it from both perspectives:
• How can privacy regulations impede care-givers, researchers, HMOs, etc.?
• In what scenarios do the regulations not go far enough?
Discuss the two approaches to privacy protection, i.e. perturbation and private computation. What are some trade-offs you can see that were not mentioned in the papers?
