Security & Morality: A Tale of User Deceit

Models of Trust on the Web

Edinburgh, UK

May 2006

L. Jean Camp, C. McGrath, A. Genkina

Security & Morality: A Tale of User Deceit?
  • Hypotheses about human trust behavior developed from social science
  • Compared with implicit assumptions in common technical mechanisms
  • Test computer-human trust behaviors
  • Conclude with guidance for trust design

Design for Trust

  • Start with human trust behaviors
  • Trust
    • Used for simplification
    • Encompasses discrete technical problems
      • privacy, integrity, data security
    • Embeds discrete policy problems
      • business behavior, customer service, quality of goods, privacy

Human and Computer Trust

  • Trust is approached differently by different disciplines
  • Social Studies of Human Behavior
    • Studies based on a micro approach
    • Experiments to evaluate how people extend trust
    • Game theory
      • Common assumption: information exposure == trust (see the sketch after this list)
  • Philosophy
    • Macro approach
    • Trust is a need
      • high default to trust
    • Trust is a tool for simplification
    • Examine societies and cultural practices
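
The micro-level experiments mentioned above typically operationalize trust with some version of the investment game. A minimal sketch, not from the talk, with purely hypothetical payoff parameters:

# Illustrative sketch of the investment ("trust") game used in experimental
# economics (Berg et al., 1995). All payoff numbers here are hypothetical.

ENDOWMENT = 10   # what the trusting player starts with
MULTIPLIER = 3   # the amount sent grows in transit

def payoffs(sent: int, returned: int) -> tuple[int, int]:
    """Return (truster, trustee) payoffs after the truster sends `sent`
    and the trustee returns `returned` out of the multiplied amount."""
    pot = sent * MULTIPLIER
    assert 0 <= sent <= ENDOWMENT and 0 <= returned <= pot
    return ENDOWMENT - sent + returned, pot - returned

print(payoffs(sent=10, returned=15))  # (15, 15): trust honored, both gain
print(payoffs(sent=10, returned=0))   # (0, 30): trust betrayed, truster loses
print(payoffs(sent=0, returned=0))    # (10, 0): no trust extended, no gain

Note how the game anticipates the definition on the next slide: the truster is better off if trust is honored, worse off if it is betrayed, nothing constrains the trustee, and the outcome arrives only after the decision.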

Experimental Definition of Trust
  • Coleman’s Three Part Test
    • enables something not otherwise possible
      • individual who trusts is worse off if the trusted party acts in an untrustworthy manner
      • individuals who trust are better off if the trusted party acts in a trustworthy manner
    • there is no constraint placed on the trusted party
    • a time lag exists between a decision to trust and the outcome
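
Read as code, Coleman's test is a predicate over a candidate interaction. A hypothetical sketch; the Interaction fields are invented for illustration:

from dataclasses import dataclass

@dataclass
class Interaction:
    """A candidate act of trust; field names are invented for illustration."""
    payoff_if_honored: float    # truster's payoff if trust is honored
    payoff_if_betrayed: float   # truster's payoff if trust is betrayed
    baseline_payoff: float      # truster's payoff without extending trust
    trustee_constrained: bool   # is the trusted party forced to perform?
    delay: float                # time between the decision and the outcome

def is_trust(i: Interaction) -> bool:
    """Coleman's three-part test as stated on the slide above."""
    enables = (i.payoff_if_honored > i.baseline_payoff and   # better off if honored
               i.payoff_if_betrayed < i.baseline_payoff)     # worse off if betrayed
    return enables and not i.trustee_constrained and i.delay > 0

# The investment game from the earlier sketch passes all three conditions:
game = Interaction(payoff_if_honored=15, payoff_if_betrayed=0,
                   baseline_payoff=10, trustee_constrained=False, delay=1)
print(is_trust(game))  # True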

Trust & Individuation
  • People interacting with a computer do not distinguish between computers as individuals, but rather respond to their experience with "computers"
    • People begin too trusting
    • People learn to trust computers
      • first observed by Sproull among networked computer scientists in 1991
      • confirmed by later experiments
    • Computers are perceived as moral agents
  • People will continue to extend trust, so creating another source of trust does not defeat trusting behaviors

Research on Humans Suggests...
  • Humans may not differentiate between machines
  • Humans become more trusting of ‘the network’
  • Humans begin with too much trust for computers
    • Confirmed by philosophical macro observation
    • Confirmed by computer security incidents
      • E-mail-based scams, viruses & hoaxes
      • Masquerade attacks

Three Hypotheses
  • Do humans respond differently to human and computer "betrayals" in terms of forgiveness?
  • Do people interacting with a computer distinguish between computers as individuals, or respond to their experience with "computers"?
  • Does the tendency to differentiate between remote machines increase with computer experience?

H1: Response to Failure
  • Do humans respond differently to human or computer "betrayals" in terms of forgiveness?
    • Attacks that are viewed as failures are "ignored" or forgiven
    • Technical failures are seen as accidents rather than design decisions
      • May explain why people tolerate repeated security failures

H2: Differentiation
  • When people interact with networked computers, they discriminate among distinct computers (hosts, websites), treating them as distinct entities, particularly in their readiness to extend trust and to secure themselves from possible harm.
      • People become more trusting over time
      • People differentiate more not less with experience
      • Do people learn to differentiate or trust?
        • “educate the user” may not work

The Experiment
  • Developed three websites
    • “life management”
      • Elephantmine.com
      • Reminders.name
      • MemoryMinder.us

Initial Tests
  • What information would you share with each site?
  • Do you trust the site?
      • user-defined trust, no macro definition given
  • Rejected MemoryMinder.us
      • people dislike lime green?
  • Other two designs had similar evaluations

Two “Betrayal” Types
  • One group faced a technical betrayal
    • Another person’s data is displayed
    • “John Q. Wilson”
    • DoB, credit card number, social network data
  • One group faced a moral betrayal
    • Change in privacy policy announced
    • Collection of third party information correlated with compiled data
      • very common policy
      • eBay, Facebook, MySpace

Three Step Process
  • Users introduced to first site
    • Sites in the same order
  • Users experience betrayal
    • Half the users experience the technical failure
    • Half experience the privacy-policy change
    • Both sets of users experience the failure upon departure from the first site
  • Then users go to second site
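
The protocol, restated as an illustrative Python sketch; the helper names and the alternating assignment are assumptions rather than the study's actual implementation:

def observe(user: str, site: str) -> None:
    """Stub: record what the user is willing to share with `site`."""
    print(f"{user} evaluates {site}")

def technical_betrayal(user: str, site: str) -> None:
    print(f"{site} shows {user} another person's data (technical failure)")

def moral_betrayal(user: str, site: str) -> None:
    print(f"{site} announces third-party data correlation (policy change)")

def run_session(user: str, betray) -> None:
    first, second = "Elephantmine.com", "Reminders.name"  # same order for all
    observe(user, first)     # step 1: introduction to the first site
    betray(user, first)      # step 2: betrayal on departure from the first site
    observe(user, second)    # step 3: does the betrayal carry over?

# Between-subjects design: half the participants see each betrayal type.
for i, user in enumerate(["p1", "p2", "p3", "p4"]):
    run_session(user, technical_betrayal if i % 2 == 0 else moral_betrayal)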

Findings: Differentiation
  • Users respond to the first site's betrayal with a significant change in behavior with respect to the second site
    • users had on average seven years' experience with the Internet
    • computer experience was not at all significant
    • the second site was not seen as a "new" entity
  • Cannot support the hypothesis that users differentiate
    • users do not enter each transaction with a new calculation of risk
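
A conventional way to check such a finding is a contingency-table test of willingness to share at the first versus the second site. The counts below are invented for illustration and are not the study's data:

# Hypothetical contingency table (NOT the study's data): willingness to share
# a credit card number with site 1 (pre-betrayal) vs. site 2 (post-betrayal).
from scipy.stats import chi2_contingency

table = [[38, 12],   # site 1: willing, unwilling
         [17, 33]]   # site 2: willing, unwilling

chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
# A small p means behavior changed at the second site after a betrayal at the
# first; that is, users do NOT treat the second site as a new entity.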

Findings: Betrayal Type
  • Stronger reaction to privacy change
    • Yet technical failure indicated an inability to protect privacy

What To Conclude
  • Assuming that the human will act like the computer has been a core design problem
  • Either remove assumptions about humans
  • Or design computer security with social science in mind

Differentiation
  • The tendency to differentiate between remote machines decreases with computer experience
    • More use results in more lumping
      • Make better lumping
    • Explains common (reused) logins/passwords
      • along with cognitive limits
      • "My Internet is Down"
  • Need explicit DO NOT TRUST signals (sketched below)
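
An explicit DO NOT TRUST signal implies at least a tri-state indicator rather than the binary lock. A hypothetical sketch of such coarse but explicit lumping (the site lists are invented):

from enum import Enum

class TrustSignal(Enum):
    TRUST = "known-good: verified identity and history"
    UNKNOWN = "no claim either way"
    DO_NOT_TRUST = "explicit warning: known-bad"

VERIFIED_GOOD = {"bank.example"}
KNOWN_BAD = {"phish.example"}

def classify(site: str) -> TrustSignal:
    if site in KNOWN_BAD:
        return TrustSignal.DO_NOT_TRUST   # the missing explicit signal
    if site in VERIFIED_GOOD:
        return TrustSignal.TRUST
    return TrustSignal.UNKNOWN            # most of the web lands here

print(classify("phish.example").value)    # explicit warning: known-bad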

Observations
  • Users are bad security managers
    • PGP, P3P, passwords, …
  • Security should therefore be the default
  • Surveys illustrate a continuing confusion of privacy & security
    • either educate all net users, OR
    • build upon the connection between the moral (privacy) and the technical (security)

Computer security is built for machines
  • Passwords
    • Humans are a bad source of entropy
  • SSL
    • Two categories: secure and not secure
      • requiring per-site differentiation does not by itself enable human differentiation
      • every site should include a unique graphic with the lock (sketched after this list)
    • The implicit message: trust all machines with the lock
    • SSL-secured phishing has already occurred
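
The unique-graphic idea could be realized by deriving a stable visual badge from each site's certificate fingerprint, so no two lock indicators look alike. A hypothetical sketch, not the authors' design:

# Hypothetical sketch: derive a stable, site-specific badge from the
# certificate fingerprint so the lock looks different on every site.
import hashlib

def site_badge(cert_fingerprint: str) -> tuple[str, str]:
    digest = hashlib.sha256(cert_fingerprint.encode()).digest()
    color = "#{:02x}{:02x}{:02x}".format(*digest[:3])  # deterministic color
    glyphs = "★☀☂☘☾⚑♞♪⚓❄"
    glyph = glyphs[digest[3] % len(glyphs)]            # deterministic glyph
    return color, glyph

# Same certificate always yields the same badge; different sites almost
# surely differ, giving users something concrete to differentiate.
print(site_badge("AB:CD:EF:12"))
print(site_badge("12:34:56:78"))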

PKI is built for machines
  • Better lumping, not demands for user differentiation
    • Different levels of key revocation are needed (see the sketch after this list)
      • Falsified initial credential
        • All past transactions suspect
      • Change in status
        • Future transactions prohibited
      • Unrecognized hierarchy
        • Messages are confusing
      • No domain
        • No alert when moving to IP address space not connected to DNS
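
A hypothetical sketch of the four levels made explicit, instead of a single undifferentiated "revoked" state; the enum names and consequence strings are invented:

from enum import Enum, auto

class RevocationReason(Enum):
    FALSIFIED_CREDENTIAL = auto()    # the identity was never valid
    STATUS_CHANGE = auto()           # the key holder is no longer authorized
    UNRECOGNIZED_HIERARCHY = auto()  # the issuing chain is not trusted here
    NO_DOMAIN = auto()               # endpoint not anchored in DNS

# Each level gets its own distinct consequence, instead of one opaque error.
CONSEQUENCE = {
    RevocationReason.FALSIFIED_CREDENTIAL: "treat ALL past transactions as suspect",
    RevocationReason.STATUS_CHANGE: "prohibit future transactions only",
    RevocationReason.UNRECOGNIZED_HIERARCHY: "explain the unknown issuer plainly",
    RevocationReason.NO_DOMAIN: "alert: address space not connected to DNS",
}

print(CONSEQUENCE[RevocationReason.STATUS_CHANGE])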

Building for Trust
  • Security technologies are not adopted
    • patching, PGP
  • Security technologies do not address user conceptions of trust
    • Patching
      • is a machine with regular updates to Microsoft more secure?
    • PGP
      • to most people, signed email without confidentiality
  • Technologies linking security (competence) to privacy (beneficence) may prove more effective in trust building than security alone

Example Project
  • Focused on individuals
    • computer-computer trust
    • computer-human trust
  • Explicit “do not trust” signals
