
Privacy, Nudges, and the Illusion of Control

This article explores the behavioral economics of privacy, including the illusion of control hypothesis and the use of soft paternalism and privacy nudges. It discusses the trade-offs involved in personal data protection and revelation, as well as the hurdles that impede privacy decision making. The article also highlights the need for experimental approaches to understand how cognitive and behavioral biases affect privacy decisions and inform policy and technology design.

Presentation Transcript


  1. Privacy, Nudges, and the Illusion of Control. Alessandro Acquisti, Heinz College/CyLab, Carnegie Mellon University. K.U. Leuven Interdisciplinary Privacy Course, June 2010

  2. Overview • From the economics to the behavioral economics of privacy • The illusion of control hypothesis • Soft paternalism and privacy nudges

  3. Overview • From the economics to the behavioral economics of privacy • The illusion of control hypothesis • Soft paternalism and privacy nudges

  4. The economics of privacy • Protection & revelation of personal data flows involve tangible and intangible trade-offs for the data subject as well as the potential data holder • However….

  5. The need for a behavioral economics of privacy • The privacy paradox: privacy attitudes/behavior dichotomy • Hurdles which hamper (privacy) decision making • Incomplete information • Bounded rationality • Psychological/behavioral biases

  6. The need for a behavioral economics of privacy • Hence, the need for a behavioral, experimental economics of privacy (as well as information security) • I.e., applying theory and methodologies from BE and BDR to the understanding of how people (and organizations) make decisions about the security or privacy of their data • ... and how cognitive and behavioral biases (negatively) affect those decisions • … in order to inform policy and technology design

  7. Experimental approach • Randomized experiments • Randomly assigning subjects to different treatments (experimental conditions) • For instance, different versions of a survey • Numerous unobservable factors impact privacy concerns and privacy behavior • However: with a large enough sample and proper randomization, the underlying distributions of traits (including privacy preferences, concerns, and other factors which influence the former) are similar across conditions • Furthermore: control econometrically for other observable traits; avoid confounding effects • Testing for statistically significant differences in behavior (e.g., propensity to answer questions) as a function of treatment • Although we cannot interpret micro motivations (e.g., infer who is lying or why a subject is/is not answering), we can compare aggregate behaviors
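A minimal Python sketch (not part of the original slides) of the assignment step described above: subjects are shuffled and dealt evenly across experimental conditions, so that unobserved traits are, in expectation, balanced across groups.

```python
# Minimal illustrative sketch, not the authors' code: balanced random
# assignment of subjects to experimental conditions (e.g., different
# versions of a survey).
import random

def assign_conditions(subject_ids, conditions):
    """Shuffle subjects and deal them out evenly across conditions."""
    ids = list(subject_ids)
    random.shuffle(ids)
    return {sid: conditions[i % len(conditions)] for i, sid in enumerate(ids)}

# 60 hypothetical subjects split between a control and a treatment condition.
assignment = assign_conditions(range(60), ["control", "treatment"])
```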

  8. Some of our experiments • Hyperbolic discounting in privacy decision making (ACM EC 04) • Herding effects in information revelation (SJDM 2009) • Over-confidence, optimism bias in online social networks (WPES 05) • Confidentiality assurances inhibit information disclosure (SJDM 07) • Individuals more likely to disclose sensitive information to unprofessional sites than professional sites (SJDM 2007) • Endowment effects in privacy valuations (WISE 2009) • […]

  9. E.g.: Willingness to pay to protect privacy vs. willingness to accept to give data • Mall patrons asked to participate in a study. Offered compensation in the form of gift card(s) • We manipulated trade-offs between privacy protection and value of cards • Endowed with either: • $10 Anonymous gift card. “Your name will not be linked to the transactions completed with the card, and its usage will not be tracked by the researchers.” • $12 Trackable, identified gift card. “Your name will be linked to the transactions completed with the card, and its usage will be tracked by the researchers.” • Then, asked whether they’d like to switch cards • From $10 Anonymous to $12 Trackable (WTA) • From $12 Trackable to $10 Anonymous (WTP)

  10. WTP vs. WTA: Results χ2(3) = 30.66, p < 0.0005
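The reported statistic has three degrees of freedom, i.e., a comparison of switching behavior across four groups. The snippet below shows how such a chi-square test of independence can be computed; the counts are invented for illustration and are not the study's data.

```python
# Illustrative only: a chi-square test of independence on a 4 x 2 table
# (condition x keep/switch). All counts below are made up.
from scipy.stats import chi2_contingency

#          keep card, switch card
counts = [[35, 10],   # e.g., endowed with the $10 anonymous card
          [38,  7],   # e.g., endowed with the $12 trackable card
          [22, 23],   # hypothetical further condition
          [20, 25]]   # hypothetical further condition

chi2, p, dof, _ = chi2_contingency(counts)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.4f}")
```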

  11. Overview • From the economics to the behavioral economics of privacy • The illusion of control hypothesis (joint work with Laura Brandimarte and George Loewenstein) • Soft paternalism and privacy nudges

  12. The Illusion of control in information disclosure (or: the privacy control paradox) • Giving users more control over disclosure and publication of personal information paradoxically causes them to disclose more sensitive information and expose themselves to heightened privacy risks • Conjecture: Individuals may confound control over publication of private information with control over access/use of that information by others • Even though arguably threats to privacy derive from access to/use of available information by others! • Why? Because the act of publication is more salient than later access/use by others

  13. The Illusion of control in information disclosure (or: the privacy control paradox) • Privacy as control • Westin, Samarajiva, Culnan, Solove, … • Normative vs. Positive interpretation

  14. Hypotheses • Hypotheses: • Higher perceived control over publication will trigger higher willingness to reveal, even when the objective risks associated with accessibility/usage do not change, or in fact increase • Lower perceived control over publication will trigger lower willingness to reveal, even when the objective risks associated with accessibility/usage do not change, or in fact decrease • Illusion of control • Henslin (1967), Langer (1975)

  15. Three survey-based randomized experiments • Study 1: Reducing (perceived) control over publication of personal information • Mediated vs. unmediated publication • Study 2: Reducing (perceived) control over publication of personal information • Certainty vs. probability of publication • Study 3: Increasing (perceived) control over publication of personal information • Explicit vs. implicit control

  16. Study 1 • Design • Subjects: CMU students recruited on campus, March 2008 • Completed online survey • Justification for the survey: creation of CMU networking website • Questions focused on students’ life on and off campus • Multiple choice, Yes/No, rating, and open-ended questions • Included quasi-identifiers + privacy intrusive and non-intrusive questions • As rated by 31 subjects independently in a pre-study

  17. Study 1 • Examples of highly intrusive questions • Email address • Home address • Have you ever cheated on homework/projects/exams (e.g., copying, plagiarizing)? • Examples of moderately intrusive questions • Date of birth • Do you have a girlfriend/boyfriend? • Have you ever had troubles with your roommates? • Examples of non-intrusive questions • Do you do any sport on campus? • Which courses are you taking at the moment? • How would you rate the quality of the education you are receiving?

  18. Study 1 • Manipulation: Profile automatically created vs. profile created by researcher (less control) • Control group “No question/field is required. With the answers you provide, a profile will be automatically created for you, with no intervention by the researcher, and published on a new CMU networking website, which will only be accessible by members of the CMU community, starting from the end of April. The data will not be used in any other way.” • Treatment group “No question/field is required. The answers you provide will be collected by the researcher, who will create a profile for you and publish it on a new CMU networking website, which will only be accessible by members of the CMU community, starting from the end of April. The data will not be used in any other way.”

  19. Study 1 • Dependent variables • Response rate (whether subject answered or not) • Admission rate (whether subject admitted to some behaviors) • Explanatory variables • Treatment • Intrusiveness • Demographics (age, gender)

  20. Study 1 • Hypothesis: Loss of control over publication should decrease willingness to disclose private information, and especially so for the most sensitive questions • It is not the publication of private information per se that disturbs people, but the fact that someone else will publish it for them • Confounding factors

  21. Study 1 • Participants: 29 subjects in control condition, 32 subjects in treatment condition • 30 males (17 in control condition), 28 females (15 in control condition), 3 missing • Average age: 21.8 in control group, 21 in treatment group (difference not significant)

  22. Study 1 Figure 1: Percentage of subjects answering each question in control and treatment condition

  23. Study 1 Table 1. RE Probit coefficients of panel regression of response rate on treatment with dummy for most intrusive questions, interaction and demographics * indicates significance at 10% level; ** indicates significance at 5% level
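As a rough analogue of the regression reported in Table 1 (not the original analysis code), the sketch below approximates the random-effects probit with a pooled probit and subject-clustered standard errors; the file and column names are assumptions about how the panel might be laid out.

```python
# Rough Python analogue of the Table 1 regression: response (0/1) on
# treatment, an intrusiveness dummy, their interaction, and demographics.
# The random-effects probit is approximated by a pooled probit with
# standard errors clustered by subject; names below are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

# One row per (subject, question): answered, treatment, intrusive, age,
# female, subject_id.
df = pd.read_csv("study1_panel.csv")

model = smf.probit("answered ~ treatment * intrusive + age + female", data=df)
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["subject_id"]})
print(result.summary())
```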

  24. Study 1 • Treatment has hypothesized effect on • 4 of the questions that were rated as highly intrusive (email, cheating at school, others cheating, informing instructor) • 1 moderately intrusive question (girlfriend) • Treatment did not push subjects to admit more: • The percentage of subjects answering “No” to questions about sensitive behaviors didn’t change significantly (10% level) between the control and the treatment conditions • However, possible confounding factor: trust in researcher

  25. Study 2 • Design • Similar to Study 1

  26. Study 2 • Manipulation: Profile automatically published vs. profile published with 50% probability (less control) • Control group “The information you provide will appear on a profile that will be automatically created for you. The profile will be published on a new CMU networking website, which will only be accessible by members of the CMU community, starting at the end of this semester. The data will not be used in any other way. NO QUESTION/FIELD REQUIRES AN ANSWER.” • Treatment group “The information you provide will appear on a profile that will be automatically created for you. Half of the profiles created for the participants will be randomly picked to be published on a new CMU networking website, which will only be accessible by members of the CMU community, starting at the end of this semester. The data will not be used in any other way. NO QUESTION/FIELD REQUIRES AN ANSWER.”

  27. Study 2 Figure 2: Percentage of subjects answering each question in control and treatment condition

  28. Study 2 Table 2. RE Probit coefficients of panel regression of response rate on treatment with dummy for most intrusive questions, interaction and demographics * indicates significance at 10% level; ** indicates significance at 5% level; *** indicates significance at 1% level

  29. Study 2 • Possible confounding factors • Study 2 took care of one of the possible confounding factors in Study 1. However… • Subjects may reveal less because they care less, since the probability of publication is lower • If that were the case, we should observe an effect on those types of questions that required effort (program, courses). No such effect was found

  30. Study 3 • Design • Subjects: CMU students recruited on campus, March 2010 • Completed online survey • Justification for the survey: study on ethical behaviors • Ten Yes/No questions that focused on sensitive behaviors (e.g. drug use, stealing) • Included demographics + privacy intrusive and non-intrusive questions • As rated by 49 subjects independently in a pre-study

  31. Study 3 • Manipulations • Condition 1 (only implicit control) “All answers are voluntary. By answering a question, you agree to give the researchers permission to publish your answer.” • Condition 2 (high explicit control) “All answers are voluntary. In order to give the researchers permission to publish your answer to a question, you will be asked to check the corresponding box in the following page.” • Condition 3 (medium control) “All answers are voluntary. In order to give the researchers permission to publish your answers to the questions, you will be asked to check a box in the following page.” • Condition 4 (same as Condition 2, but the default is that answers will be published) “All answers are voluntary. In order to prevent the researchers from publishing your answer to a question, you will be asked to check the corresponding box in the following page.” • Condition 5 (some control + extra demographics) “All answers are voluntary. In order to give the researchers permission to publish your answers to the questions, you will be asked to check a box in the following page. Please notice that the answers to the demographic questions that you provided in the previous page will NOT be published without your explicit agreement: you will be asked permission to publish those answers separately.”

  32. Study 3

  33. Study 3 Table 3. RE Probit coefficients of panel regression of response rate on treatment with dummy for most intrusive questions, interaction and demographics * indicates significance at 10% level; ** indicates significance at 5% level

  34. Study 3 • The coefficient on Treatment is always positive and significant: providing subjects with control over information publication increases their willingness to answer a question (results are similar if we only consider answers that subjects were willing to publish) • The coefficient on the interaction is only significant when comparing condition 1 with condition 2 • The negative coefficient on the interaction in condition 3 may be due to the very nature of the treatment: it makes publication of very sensitive information more salient, but does not allow subjects to block publication of specific answers • Adding a dummy variable for the provision of an email address, which should have made subjects feel more identifiable, doesn’t affect our results

  35. Summarizing the results • Our results suggest the following: • Control over publication leads to more revelation of private info • This effect is stronger for privacy intrusive questions

  36. Implications • People seem to care more about control over publication of private information than about control over access and use of that information • When someone other than themselves is responsible for the publication, or when the publication itself becomes uncertain (which reduces the probability of access/use by others), people refrain from disclosing • Results call into question OSNs’ arguments that privacy is protected by providing more control to members • Giving more control to users over information publication seems to generate higher willingness to disclose sensitive information

  37. Overview • From the economics to the behavioral economics of privacy • The illusion of control hypothesis • Soft paternalism and privacy nudges

  38. Nudging users towards privacy • Our research highlights cognitive and behavioral biases that make it difficult for users to make the “right” privacy (and security) decision • However, those results can also be used for “soft,” or asymmetric, paternalistic solutions: • Designing systems to “nudge” individuals, by anticipating (or even exploiting) the very fallacies and biases that research has uncovered; tweaking their incentives without diminishing users’ freedom (IEEE S&P 2009)

  39. Soft vs. strong paternalism vs. usability • Consider online social network users who post dates of birth online • Imagine that a study shows some risks associated with revealing DOBs (e.g., SSN predictions) • Strong paternalistic solution: ban public provision of dates of birth in online profiles • “Usability” solution: design a system to make it intuitive/easy to change DOB visibility settings • Soft paternalistic solution?

  40. Nudging privacy through soft paternalism: some examples • Saliency of information • Provide context to aid the user’s decision, such as visually representing how many other users (or types of users) may be able to access that information • Default settings • By default, DOBs not visible, unless settings are modified by user • Hyperbolic discounting • Immediately predict and show the SSN based on the information provided • … and so forth
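A toy illustration (not from the slides) of the default-settings nudge listed above: the date of birth starts out hidden, and widening its visibility requires an explicit user action.

```python
# Toy sketch of a privacy-protective default: DOB is hidden unless the user
# explicitly opts in to showing it. Class and field names are illustrative.
from dataclasses import dataclass

@dataclass
class ProfileVisibility:
    name: str = "everyone"
    date_of_birth: str = "hidden"   # nudge: hidden by default
    email: str = "friends-only"

    def show_dob_to(self, audience: str) -> None:
        """Widening DOB visibility requires a deliberate user action."""
        self.date_of_birth = audience

settings = ProfileVisibility()        # DOB not visible unless changed
settings.show_dob_to("friends-only")  # explicit opt-in, not the default
```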

  41. For more info Google: economics privacy Visit: http://www.heinz.cmu.edu/~acquisti/economics-privacy.htm Email: acquisti@andrew.cmu.edu

  42. Backup Slides

  43. Study 3 Descriptive statistics and qualitative results
