
Security & Morality: A Tale of User Deceit


Presentation Transcript


  1. Security & Morality: A Tale of User Deceit • Models of Trust on the Web • Edinburgh, UK, May 2006 • L. Jean Camp, C. McGrath, A. Genkina

  2. Security & Morality: A Tale of User Deceit? • Hypotheses about human trust behavior developed from social science • Compared with implicit assumptions in common technical mechanisms • Test computer-human trust behaviors • Conclude with guidance for trust design

  3. Design for Trust • Start with human trust behaviors • Trust is used for simplification • Trust encompasses discrete technical problems: privacy, integrity, data security • Trust embeds discrete policy problems: business behavior, customer service, quality of goods, privacy

  4. Human and Computer Trust • Trust is approached differently by different disciplines • Social studies of human behavior: a micro approach • experiments to evaluate how people extend trust • game theory • common assumption: information exposure == trust • Philosophy: a macro approach • trust is a need, so the default is to trust • trust is a tool for simplification • examines societies and cultural practices

  5. Experimental Definition of Trust • Coleman’s Three Part Test • trust enables something not otherwise possible • the individual who trusts is worse off if the trusted party acts in an untrustworthy manner • individuals who trust are better off if the trusted party acts in a trustworthy manner • there is no constraint placed on the trusted party • a time lag exists between the decision to trust and the outcome
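
Coleman’s conditions amount to a payoff structure with a time lag: the trustor moves first, and the unconstrained trusted party determines whether trusting leaves the trustor better or worse off than never trusting at all. A minimal sketch of that structure in Python (the payoff values are illustrative assumptions; the talk gives no numbers):

    BASELINE = 0  # trustor's payoff if trust is never extended

    def trust_outcome(trustee_is_trustworthy: bool) -> int:
        """Trustor's payoff after the time lag between the decision
        to trust and the trusted party's (unconstrained) action."""
        if trustee_is_trustworthy:
            return BASELINE + 10  # better off than not trusting
        return BASELINE - 10      # worse off than not trusting

    # Trusting enables an outcome not otherwise possible: without it the
    # trustor is stuck at BASELINE; with it, the result depends entirely
    # on the trusted party.
    assert trust_outcome(True) > BASELINE > trust_outcome(False)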

  6. Trust & Individuation • People interacting with a computer do not distinguish between computers as individuals but rather respond to their experience with “computers” • People begin too trusting • People learn to trust computers • first observed by Sproull in 1991, among computer scientists on the net • confirmed by later experiments • Computers are perceived as moral agents • People will continue to extend trust, so creating another source of trust does not defeat trusting behaviors

  7. Research on Humans Suggests... • Humans may not differentiate between machines • Humans become more trusting of ‘the network’ • Humans begin with too much trust in computers • Confirmed by philosophical macro observation • Confirmed by computer security incidents • e-mail-based scams, viruses & hoaxes • masquerade attacks

  8. Three Hypotheses • Do humans respond differently to human or computer “betrayals” in terms of forgiveness? • Do people interacting with a computer distinguish between computers as individuals, or respond to their experience with “computers” in general? • Does the tendency to differentiate between remote machines increase with computer experience?

  9. Security & Morality: A Tale of User Deceit • Hypotheses about human trust behavior developed from social science • Compared with implicit assumptions in common technical mechanisms • Test computer-human trust behaviors • Conclude with guidance for trust design

  10. H1: Response to Failure • Do humans respond differently to human or computer “betrayals” in terms of forgiveness? • Attacks that are viewed as failures are ignored or forgiven • Technical failures are seen as accidents rather than design decisions • May explain why people tolerate repeated security failures

  11. H2: Differentiation • When people interact with networked computers, they discriminate among distinct computers (hosts, websites), treating them as distinct entities, particularly in their readiness to extend trust and to secure themselves from possible harms • People become more trusting over time • People differentiate more, not less, with experience • Do people learn to differentiate, or to trust? • “educate the user” may not work

  12. Security & Morality: A Tale of User Deceit • Hypotheses about human trust behavior developed from social science • Compared with implicit assumptions in common technical mechanisms • Test computer-human trust behaviors • Conclude with guidance for trust design

  13. The Experiment • Developed three “life management” websites • Elephantmine.com • Reminders.name • MemoryMinder.us

  14. Initial Tests • What information would you share with each site? • Do you trust the site? • user-defined trust; no macro definition given • Rejected MemoryMinder.us • people dislike lime green? • The other two designs had similar evaluations

  15. Two “Betrayal” Types • One group faced a technical betrayal • another person’s data is displayed • “John Q. Wilson” • DoB, credit card number, social network data • One group faced a moral betrayal • a change in privacy policy is announced • collection of third-party information correlated with compiled data • a very common policy • eBay, Facebook, MySpace

  16. Three-Step Process • Users are introduced to the first site • sites presented in the same order • Users experience a betrayal • half the users face a technical failure • half face a privacy change • both sets of users experience the failure upon departure from the first site • Then users go to the second site

  17. Findings: Differentiation • Users respond to betrayal at the first site with a significant change in behavior with respect to the second site • users had on average seven years of experience with the Internet • computer experience was not at all significant • the second site was not seen as a “new” entity • Cannot support the hypothesis that users differentiate • users do not enter each transaction with a new calculation of risk

  18. Findings: Betrayal Type • Stronger reaction to the privacy change • Yet the technical failure indicated an inability to protect privacy

  19. Security & Morality: A Tale of User Deceit • Hypotheses about human trust behavior developed from social science • Compared with implicit assumptions in common technical mechanisms • Test computer-human trust behaviors • Conclude with guidance for trust design

  20. What To Conclude • Assuming the human will act like the computer has been a core design problem • Either remove assumptions about humans • Or design computer security with social science in mind

  21. Differentiation • The tendency to differentiate between remote machines decreases with computer experience • more use results in more lumping • so make better lumping • Explains the reuse of common logons/passwords • along with cognitive limits • “My Internet is down” • Need explicit DO NOT TRUST signals

  22. Observations • Users are bad security managers • PGP, P3P, passwords, … • Security should necessarily be a default • Surveys illustrate a continuing confusion of privacy & security • either educate all net users, or • build upon the connection between the moral (privacy) and the technical (security)

  23. Computer Security is Built for Machines • Passwords • humans are a bad source of entropy • SSL • two categories: secure and not secure • requiring per-site differentiation does not enable human differentiation • every site should include a unique graphic with the lock • users trust all machines with the lock • SSL-secured phishing has already occurred
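
The entropy point is easy to make concrete: human-chosen passwords cluster on a few favorites, so their empirical distribution carries far fewer bits than a uniform draw of the same size. A minimal sketch (the human-chosen sample below is invented for illustration):

    import math
    from collections import Counter

    def shannon_entropy(samples):
        """Shannon entropy, in bits, of the empirical distribution of samples."""
        counts = Counter(samples)
        total = len(samples)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    # Hypothetical human choices: heavily skewed toward a few favorites.
    human = ["password"] * 50 + ["123456"] * 30 + ["letmein"] * 15 + ["qwerty"] * 5

    # A uniform draw of the same size: 100 samples, all distinct.
    uniform = [f"pw{i}" for i in range(100)]

    print(f"human-chosen: {shannon_entropy(human):.2f} bits")   # ~1.65 bits
    print(f"uniform:      {shannon_entropy(uniform):.2f} bits") # log2(100), ~6.64 bits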

  24. PKI is Built for Machines • Better lumping, not demands for user differentiation • Different levels of key revocation are needed • falsified initial credential: all past transactions suspect • change in status: future transactions prohibited • Unrecognized hierarchy • messages are confusing • No domain • no alert when moving into IP address space not connected to DNS
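
The two revocation levels the slide distinguishes can be sketched as explicit reason codes with different consequences. This is a hypothetical model, not an existing PKI API (X.509 CRLs do carry reason codes, but the past/future-transaction semantics below follow the slide):

    from enum import Enum

    class RevocationReason(Enum):
        FALSIFIED_CREDENTIAL = 1  # the credential was never valid
        STATUS_CHANGE = 2         # the credential was valid until revocation

    def affected_transactions(reason: RevocationReason) -> str:
        """Map each revocation level to its consequence, per the slide."""
        if reason is RevocationReason.FALSIFIED_CREDENTIAL:
            return "all past transactions suspect"
        return "future transactions prohibited; past transactions stand"

    for reason in RevocationReason:
        print(reason.name, "->", affected_transactions(reason))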

  25. Building for Trust • Security technologies are not adopted • patching, PGP • Security technologies do not address user conceptions of trust • patching: a more secure machine, with regular updates to Microsoft? • PGP: signed email without confidentiality, to most people • Technologies linking security (competence) to privacy (beneficence) may prove more effective in trust building than security alone

  26. Example Project • Focused on individuals • computer-computer trust • computer-human trust • Explicit “do not trust” signals
