
Privacy Concerns in Upcoming Residential and Commercial Demand Response Systems



  1. Privacy Concerns in Upcoming Residential and Commercial Demand Response Systems Mikhail Lisovich, Devashree Trivedi, and Stephen Wicker Department of Electrical and Computer Engineering Cornell University

  2. Privacy in the Home • Privacy is the interest that individuals have in sustaining a 'personal space', free from interference by other people and organizations. • Privacy of the Person • Privacy of Personal Behavior • Privacy of Personal Communications • Privacy of Data

  3. Privacy in the Home • Interested Parties: Police, Employers, Marketers, Criminals • Observable activities: Presence, Dinner times, Sleep schedule, Appliance usage, Shower times • ANY activity involving electricity, water, and gas

  4. Privacy in the Home Q: How real is the threat? A: Very. Three contributing factors: • Technology: AMI/AMR, NILM (Nonintrusive Load Monitoring) • Precedent for Repurposing: Drug production screening. Involves the Austin Police Department, among others. • Legal Precedent: Smith v. Maryland, US v. Miller

  5. Outline • Introduction • Motivation • Summary of TRUST Efforts • Background • Brief Overview • Interested Parties • Abuse Cases • Privacy Metric • Experiment • Overview • Experimental Setup • Algorithms • Results • Discussion • Algorithm effectiveness • Privacy Implications

  6. Outline • Introduction • Motivation • Summary of TRUST Efforts

  7. Motivation • Next generation demand-response architectures are increasingly deployed by major utilities across the US. • Advantages: cost savings in power generation, increased grid reliability, new modes of consumer-utility interaction. • Disadvantage: Increased availability of data creates or exacerbates issues of privacy and security. Our Main Claim: In a lax regulatory environment, the detailed household consumption data gathered by advanced metering projects can and will be repurposed by interested parties to reveal personally identifying information such as an individual's activities, preferences, and even beliefs.

  8. TRUST Efforts • Cornell and the Berkeley School of Law have focused on the privacy risks arising from the collection of power consumption data in current and future demand-response systems. • Berkeley: law & policy aspects • D. Mulligan and J. Lerner have written an article in the Stanford Technology Law Review chronicling the evolution of court opinion toward energy data privacy and calling for its constitutional protection. • Collaborated with the California Public Utilities Commission (CPUC) to develop a set of draft guidelines for a secure and privacy-preserving demand response infrastructure. • Cornell: technological aspects • Highlighted the importance of NILM algorithms for extrapolating activity. • Proposed a formal way of evaluating privacy risks. • Conducted a proof-of-concept technical study.

  9. Outline • Introduction • Motivation • Summary of TRUST Efforts • Background • Brief Overview • Interested Parties • Abuse Cases

  10. Technical Overview • Advanced Metering Infrastructure (AMI) • Collects time-based data at daily, hourly or sub-hourly intervals

  11. Technical Overview (contd.) • Non-Intrusive Load Monitoring (NILM) • NILM: fundamental tool for extrapolating activity

  12. Players/Abuse Cases • Law Enforcement • Detecting Drug Production. • Supreme Court boundaries: • Kyllo v. US - Information obtained, using sensors, about activity within the home that would not otherwise have been available without intrusion constitutes a search • Smith v. Maryland, US v. Miller - records freely given to third parties are not protected under the 4th Amendment • Employers • Employee Tracking • Marketing Partners • Criminals

  13. Outline • Introduction • Motivation • Summary of TRUST Efforts • Background • Brief Overview • Interested Parties • Abuse Cases • Privacy Metric

  14. Privacy Metric • Goal: a metric which associates the degree of data availability (accuracy of readings, time resolution, types of readings, etc.) with potential privacy risks, providing a robust and reliable indicator of overall privacy. • Extrapolating activity may be thought of in two stages • First stage: NILM in combination with data from other sensors is used to extract appliance usage, track an individual's position, and match particular individuals to particular observed events. • Second stage: intermediate data is combined with contextual data (such as the number/age/sex of individuals in the residence, tax and income records, and models of typical human behavior). • Performance Evaluation: • First stage: at most, the gathered information will reveal everything that's happening in the house (precise information about all movements, activities, and even the condition of appliances) • Second stage: more difficult to define an absolute performance metric - the number of specific preferences and beliefs that can be estimated is virtually limitless. In order to develop a comprehensive privacy metric, one needs to carefully define a list of 'important' parameters, basing importance both on how fundamental a parameter is (how many other parameters may be derived from it) and on home/business owners' expectations of privacy. • Summary: The list of important second-stage parameters forms the evaluation criteria. Algorithms for estimating the parameters, along with the corresponding data requirements, provide a method for evaluating the sufficiency of available data. Together, these provide a metric for how much information may potentially be disclosed by a particular monitoring system.

  15. Outline • Introduction • Motivation • Summary of TRUST Efforts • Background • Brief Overview • Interested Parties • Abuse Cases • Privacy Metric • Experiment • Overview • Experimental Setup • Algorithms • Results

  16. Experiment • Monitored a student residence continuously over a period of two weeks. • Gathered electrical data from the breaker panel, visual data from a camera. • Camera logs included activities such as: • Turning household appliances on or off • Entering or leaving the residence • Sleeping • Preparing meals • Taking a bath

  17. Experimental Setup Floorplan Data Gathering Setup

  18. Setup Photos

  19. Algorithm: Details • Parameters to be estimated: • Presence/Absence, Number of Individuals • Appliance Usage • Sleep/wake cycle. • Miscellaneous Events - Breakfast, Dinner, Shower. • Sample Interval:

  20. Participant Privacy

  21. Evaluation Criteria • Compare behavior extraction results against reference results from camera data. Two Metrics: • Event based: • Define the cutoff threshold T_thresh • For each parameter, examine the sequence of turn-on/turn-off events on both the reference and estimated intervals. • If a camera event occurs but a corresponding electrical event does not occur within T_thresh seconds, declare a Failure to Detect. • If an electrical event occurs but a corresponding camera event does not occur within T_thresh seconds, declare a Misdetection. • Global Perspective: Compute the correctly classified percentage of the reference interval.
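A minimal sketch of the event-based comparison, assuming each stream is a sorted list of event timestamps in seconds; the greedy one-to-one matching rule here is an assumption, not necessarily the matching used in the study:

```python
def event_based_score(camera_events, electrical_events, t_thresh=30.0):
    """Count failures to detect and misdetections for one parameter.

    camera_events, electrical_events: sorted lists of event times (seconds).
    A camera event with no electrical event within t_thresh seconds is a
    Failure to Detect; an electrical event with no camera event within
    t_thresh seconds is a Misdetection. Each event may match at most once.
    """
    unmatched_elec = list(electrical_events)
    failures_to_detect = 0
    for t_cam in camera_events:
        match = next((t for t in unmatched_elec
                      if abs(t - t_cam) <= t_thresh), None)
        if match is None:
            failures_to_detect += 1
        else:
            unmatched_elec.remove(match)
    misdetections = len(unmatched_elec)
    return failures_to_detect, misdetections
```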

  22. Algorithm: Implementation 1 • Accumulate Raw Data: • Find Switching Events:
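As an illustration of these two steps, a minimal edge detector over the accumulated power samples might look like the sketch below; the 60 W threshold and 1 s sample period are illustrative assumptions, not the values used in the study:

```python
import numpy as np

def find_switching_events(power, sample_period_s=1.0, threshold_w=60.0):
    """Detect turn-on/turn-off events as step changes in aggregate power.

    power: 1-D array of real-power readings (watts), one per sample.
    Returns a list of (time_s, delta_w) pairs; positive delta_w marks a
    turn-on event, negative delta_w a turn-off event.
    """
    deltas = np.diff(power)                # sample-to-sample change
    events = []
    for i, d in enumerate(deltas):
        if abs(d) >= threshold_w:          # ignore noise below the threshold
            events.append((i * sample_period_s, float(d)))
    return events
```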

  23. Algorithm: Implementation 2 • Match events to appliances: • Use heuristics to estimate parameters of interest:
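One plausible way to implement these two steps is a nearest-neighbour lookup against a table of appliance step sizes, followed by simple heuristics over the labeled events; the signature values and the presence heuristic below are hypothetical illustrations, not the study's actual rules:

```python
# Hypothetical nominal step sizes (watts) for appliances in the residence.
SIGNATURES = {"refrigerator": 120.0, "microwave": 1100.0,
              "heat_lamp": 250.0, "toaster": 800.0}

def match_event(delta_w, signatures=SIGNATURES, tolerance=0.25):
    """Label a switching event with the appliance whose nominal step size
    is closest to |delta_w|, within a relative tolerance; None if no match."""
    magnitude = abs(delta_w)
    name, nominal = min(signatures.items(),
                        key=lambda kv: abs(kv[1] - magnitude))
    return name if abs(nominal - magnitude) <= tolerance * nominal else None

def occupant_present(labeled_events):
    """Toy heuristic: any event not attributable to the always-on
    refrigerator suggests someone is home and active."""
    return any(label not in (None, "refrigerator")
               for _, label in labeled_events)
```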

  24. Results

  25. Performance • For the training data set, 101 of approximately 104 refrigerator events (more than 97%) were correctly classified. Results were similar (97%) for the experimental set.

  26. Outline • Introduction • Motivation • Summary of TRUST Efforts • Background • Brief Overview • Interested Parties • Abuse Cases • Privacy Metric • Experiment • Overview • Experimental Setup • Algorithms • Results • Discussion • Algorithm effectiveness • Privacy Implications

  27. Discussion • Our behavior extraction algorithm was a proof-of-concept. Future algorithms will show vast performance improvements. • Useful data can be extracted by less potent technology. • Hourly power averages such as the ones produced by California's AMI system may also be used to determine presence and sleep cycles, although to a coarser degree. Major appliances with a large steady-state power consumption (e.g. heat lamps) can also be identified. • Future concerns are not limited to the performance of these systems at the level of an individual household. • Algorithms are fully automated, so analysis may be done on extremely large scales. • Easy access to such personal and demographic information will inevitably generate a market for it!
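As a rough sketch of how even hourly averages leak presence and sleep information, a simple threshold against the always-on base load already yields an activity profile; the base-load and margin values are made-up illustrations:

```python
def activity_profile(hourly_kwh, base_load_kwh=0.15, margin_kwh=0.05):
    """Mark each hour as active when consumption exceeds the always-on
    base load by a margin. Long inactive runs overnight suggest sleep;
    long inactive runs during the day suggest absence from the home."""
    return [kwh > base_load_kwh + margin_kwh for kwh in hourly_kwh]
```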

  28. Discussion (contd.) • Data mining of hourly usage data by utilities should be carefully monitored and regulated. • The authors of the report to the California Energy Commission advise that utilities should become subject to more stringent rules on the release and re-use of personal data as data mining practices develop and new information in which consumers have a reasonable expectation of privacy is exposed. • Our paper fleshes out the details of this recommendation: • Our discussion of interested entities and motivations shows that repurposing of consumption data creates real privacy concerns for the consumer, and by extension highlights the reasonable expectations of privacy that he or she should develop. • Our technical discussion and proof-of-concept demonstration shows what data mining may be capable of, illustrating the extent to which consumer privacy can be violated. • Finally, our privacy metric framework, in combination with the technical discussions, allows one to more precisely define the permitted and prohibited uses of data mining.

  29. Thank you for your time! • Questions?

  30. Conclusion Where, as here, the Government uses a device that is not in general public use, to explore details of the home that would previously have been unknowable without physical intrusion, the surveillance is a 'search' and is presumptively unreasonable without a warrant. -Justice Scalia, Kyllo v. US
