Secure Interaction Design

Presentation Transcript


  1. Secure Interaction Design Cynthia Kuo

  2. Overview • Describe project on Wi-Fi access point configuration • Show mockups and design process for Google Safe Browsing • Talk about how you can design for security

  3. Overview • Describe project on Wi-Fi access point configuration • Show mockups and design process for Google Safe Browsing • Talk about how you can design for security

  4. Wi-Fi • Also known as 802.11 a/b/g • October 2006: 4 million units shipped each week

  5. Going Back a Few Years… Rough Estimate • Returns • ~30% return rate • Technical Support • 12 – 20 minutes / call • ~10% of sales → technical support call • $50 / hr technical support • 15-minute call = $12.50 • $10 materials → $2 profit / unit (assume 20%) • 1 call = profit from unit + 5 other units!
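
  The arithmetic behind the last bullet, written out as a short sketch. The dollar figures are the slide's own rough assumptions, and the variable names are illustrative:

      // Rough support-cost estimate using the slide's assumed numbers.
      const supportRatePerHour = 50;   // $50 / hr for technical support
      const callMinutes = 15;          // a typical 12 - 20 minute call
      const callCost = supportRatePerHour * (callMinutes / 60);      // $12.50
      const profitPerUnit = 2;         // ~20% margin on ~$10 materials
      const callRate = 0.10;           // ~10% of sales lead to a support call

      // One call erases this unit's profit plus the profit of ~5 more units.
      const extraUnitsWipedOut = (callCost - profitPerUnit) / profitPerUnit;  // ~5.25
      // Averaged over every unit sold, support eats most of the margin.
      const expectedSupportCostPerUnit = callRate * callCost;                 // $1.25

      console.log({ callCost, extraUnitsWipedOut, expectedSupportCostPerUnit });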

  6. Research Question • Why don’t users configure their wireless networks securely? • Cannot? • Choose not to? • Don’t know to?

  7. Traditional Solution • “Layer” different study techniques • Interviews: assess values, thought processes, and level of security knowledge • Surveys • Contextual inquiry: observe users • Usability study: evaluate features

  8. Why Not? • Evaluation of configuration process must be holistic • One user study method will not provide insight into entire process to pinpoint problems • Security is not a primary task • Takes a long time! • Number of qualified users may be small

  9. Designing a User Study • How do we evaluate a system where the end goal may be different for every user? • How can we ask about security concepts (e.g., encryption) if we don’t know whether users know what they are? • People know that they’re supposed to care about security. How do we design a study without social acceptability bias?

  10. Assumptions • Textbook study methods make assumptions that may not hold for security software

  11. Common Assumptions • Clear-cut criteria for success: good security is risk management; there are multiple ways to reach the end result; there is no “undo” for some security breaches • Familiarity with underlying concepts: a task list may unintentionally provide information • Tasks are primary goals: no one wants to “do” security • Users respond without bias: social acceptability biases responses • Kuo, Perrig, and Walker, ACM interactions, May + June 2006

  12. Configuration Process

  13. Evaluation Methodology: Evaluating the Whole Process • Research questions: What do people know about wireless security? What security issues do people care about? If users are aware of the security issues and care about them, are users able to configure the access points? • Target home user: uses laptop as primary computer; has broadband connectivity at home; uses wireless on a daily basis (5+ times/week) • Study design: interview (25 min), questionnaire (5 min), tasks (45 min), questionnaire (5 min), debriefing (10 min)

  14. Interview: Broadcasting?

  15. Questionnaire: Opinions & Concerns • Availability • Reliability • Connection speed • Ease of use • Open networks • Security • Privacy • Health

  16. Experimental Setup • Gradual revelation • User task: set up access point for home; explain motivation & understanding of possible consequences • Scenario: “Okay, let’s pretend you just received this 802.11 access point as a gift. You would like to set up and use a wireless network at home today. Just set up the access point as you would if you were at home.”

  17. Findings • Users are reasonably knowledgeable about wireless technologies • …but have difficulty translating that knowledge into security policies and feature configurations • Novice users perform significantly worse than expert users • Expanding market → novice users

  18. What Does that Mean for Products?

  19. Goal-Based Design • Can “level the playing field” between novice and expert users • Start from human goals, not technical features • Do not assume people are familiar with technical terms or particular technologies • Anticipate common error states • Minimize time & human effort required
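
  One way to picture “start from human goals, not technical features” for access-point setup is a small sketch like the one below. The goal names, settings object, and passphrase helper are hypothetical illustrations, not the study’s actual prototype:

      // Map a human goal to concrete access-point settings, so the user is
      // never asked about WPA2, SSIDs, or other protocol-level terms.
      interface AccessPointSettings {
        encryption: "none" | "wpa2-psk";
        passphrase?: string;
        broadcastNetworkName: boolean;
      }

      type Goal = "let-anyone-connect" | "family-only";

      function settingsForGoal(goal: Goal): AccessPointSettings {
        switch (goal) {
          case "let-anyone-connect":
            // The user explicitly chose an open network.
            return { encryption: "none", broadcastNetworkName: true };
          case "family-only":
            // Lock the network down and generate a passphrase for the user,
            // anticipating the common error of leaving encryption off.
            return {
              encryption: "wpa2-psk",
              passphrase: generatePassphrase(),
              broadcastNetworkName: true,
            };
        }
      }

      function generatePassphrase(): string {
        // Placeholder: a real product would generate a strong, memorable passphrase.
        return Math.random().toString(36).slice(2, 12);
      }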

  20. Prototype Design

  21. Results

  22. Lessons • More than one user study method may be needed to evaluate your problem • Watch out for assumptions in your user study methods • Adapt existing methods for your needs

  23. Overview • Describe project on Wi-Fi access point configuration • Show mockups and design process for Google Safe Browsing • Talk about how you can design for security

  24. Google Safe Browsing • Anti-phishing alert • Part of Google Toolbar for Firefox • http://www.google.com/tools/firefox/safebrowsing/index.html

  25. Maps Bubble • Warning bubble and icon used to appear trustworthy • Gray background to emphasize danger and to catch attention • Bubble attached to browser chrome to convey message origin • Active elements on page disabled
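
  A rough sketch of two of the mechanics listed above: graying out and disabling the page while showing a warning the page cannot control. This is illustrative browser-side TypeScript under assumed DOM access, not the actual Google Toolbar implementation, and showPhishingWarning is an invented name:

      // Disable interactive elements on the suspect page and show a warning.
      function showPhishingWarning(message: string): void {
        // Block clicks and keyboard focus targets on the page. pointer-events
        // stops clicks on links; the disabled attribute also covers form controls.
        document
          .querySelectorAll<HTMLElement>("a, button, input, select, textarea")
          .forEach((el) => {
            el.setAttribute("disabled", "");
            el.style.pointerEvents = "none";
          });

        // Gray out the page content to signal danger and catch attention.
        const overlay = document.createElement("div");
        overlay.style.cssText =
          "position:fixed;inset:0;background:rgba(128,128,128,0.6);z-index:2147483646";
        document.body.appendChild(overlay);

        // A real extension anchors the bubble to browser chrome (the toolbar),
        // so the page cannot spoof it; a fixed banner only approximates that here.
        const bubble = document.createElement("div");
        bubble.textContent = message;
        bubble.style.cssText =
          "position:fixed;top:0;left:0;right:0;padding:12px;background:#fff;" +
          "border-bottom:2px solid #c00;z-index:2147483647";
        document.body.appendChild(bubble);
      }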

  26. Lessons • Establish trustworthiness of message • Origin • Authority • Match intrusiveness to severity • No false positives • Recommend what actions to take • Provide a feeling of closure

  27. Overview • Describe project on Wi-Fi access point configuration • Show mockups and design process for Google Safe Browsing • Talk about how you can design for security

  28. Design for Security • Think like your user • Use personas • Stop thinking like yourself • Design for your personas • User test, user test, user test • Watch your users • Don’t always believe what they say

  29. Think Like Your User • Personas • A precise description of your user and what s/he wants to accomplish • Make up archetypical users • More specific is better! • Design for these users • You may have primary and secondary personas • Typically 3 - 12 personas per project • Cooper (1999)

  30. Example Persona Dan is a 46-year-old sales executive for a sports magazine. He has never heard of encryption, Diffie-Hellman, or EKE. Dan sent 38 emails from his Blackberry 8700c yesterday. He travels 50% of the time to meet with clients all over the East Coast. Using his IBM T41 laptop, he checks his email from different hotels (he prefers Wyndham) every night. Dan often needs to download sensitive documents that contain his company’s business strategies. After 10 hours of meetings during the day, Dan does not want to spend any time configuring anything. Dan likes to play basketball in his spare time.

  31. Stop Thinking Like Yourself • You are probably not the typical user • Your user does not think like you • Your user probably does not know as much as you do (about security in general and especially your product) • Your user is not dumb, but will almost always make mistakes

  32. Common Mistake #1: Thinking Like an Engineer • “The user might want to disable L2TP Passthrough.” • No! Dan doesn’t know what L2TP is - and he doesn’t ever want to.

  33. Common Mistake #2: Focusing on Tasks & Features, Not Goals • Users’ goals: not feel stupid; not make mistakes; get work done; have fun (or at least not be too bored) • False goals: save memory; run in a browser; safeguard data integrity; increase program-execution efficiency; use cool technology or features • Cooper, Alan. The Inmates Are Running the Asylum. Sams, 1999.

  34. Software Evaluation • Inexpensive, “discount” methods: low-fidelity prototyping, cognitive walkthrough, heuristic evaluation • Expensive: formal models (e.g., GOMS), formal experiments

  35. Discount Methods: Predictive? • In a lab study, 25 problems did occur and 29 problems could potentially occur • Heuristic evaluation predicted: experts 11 (44%) of the problems that did occur and 9 (31%) of the potential problems; system designers 4 (16%) and 7 (24%); non-experts 2 (8%) and 1 (3%) • Cognitive walkthrough predicted: experts 7 (28%) and 9 (31%); system designers 4 (16%) and 6 (21%); non-experts 2 (8%) and 2 (7%) • Desurvire, Kondziela, Atwood (1992)

  36. Common Mistake #3:Listening to One Person • “A customer said we should…” • 80% rule • Feature creep

  37. Lessons • Think like your user • Stop thinking like yourself • User test, user test, user test • Be careful about what information you use

  38. Thank you! • Questions? Comments? • cykuo@cmu.edu
