
A case study in UI design and evaluation for computer security



Presentation Transcript


  1. A case study in UI design and evaluation for computer security. Rob Reeder, January 30, 2008

  2. Memogate: A user interface scandal!!

  3. Overview • Task domain: Windows XP file permissions • Design of two user interfaces: native XP interface, Salmon • Evaluation: Which interface was better? • Analysis: Why was one better?

  4. Part 1: File permissions in Windows XP • File permissions task: Allow authorized users access to resources, deny unauthorized users access to resources • Resources: Files and folders • Users: People with accounts on the system • Access: 13 types, such as Read Data, Write Data, Execute, Delete

  5. Challenges for file permissions UI design • Potentially thousands of users – impossible to set permissions individually for each • Thirteen access types – hard for a person to remember them all

  6. Grouping to handle users • Administrators • Power Users • Everyone • Admin-defined

  7. A problematic user grouping [Diagram: users Xu, Ari, Miguel, Bill, Yasir, Cindy, and Zack divided between Group A and Group B]

  8. Precedence rules • No setting = Deny by default • Allow > No setting • Deny > Allow • (> means “takes precedence over”)
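The precedence rules above can be sketched as a small function. This is a simplified model of the rules as stated on the slide, not the actual NTFS ACL evaluation algorithm (which evaluates access-control entries in order):

```python
def effective_access(allow: set, deny: set, access_type: str) -> bool:
    """Resolve one access type under the slide's precedence rules:
    Deny > Allow > no setting (deny by default)."""
    if access_type in deny:    # Deny takes precedence over Allow
        return False
    if access_type in allow:   # Allow takes precedence over no setting
        return True
    return False               # No setting: deny by default

# A user allowed READ and WRITE via a group, then explicitly denied WRITE:
allow = {"READ", "WRITE"}
deny = {"WRITE"}
print(effective_access(allow, deny, "READ"))   # True: Allow > no setting
print(effective_access(allow, deny, "WRITE"))  # False: Deny > Allow
```

Note that the explicit deny wins even though a group grants WRITE; this is exactly the interaction that makes tasks like the Wesley task tricky.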

  9. Grouping to handle access types [Diagram: access types such as Execute grouped together]

  10. Moral • Setting file permissions is quite complicated • But a good interface design can help!

  11. The XP file permissions interface

  12. The Salmon interface

  13. Expandable Grid

  14. Example task: Wesley • Initial state • Wesley allowed READ & WRITE from a group • Final state • Wesley allowed READ, denied WRITE • What needs to be done • Deny Wesley WRITE

  15. What’s so hard? • Conceptually: Nothing! • Pragmatically: • User doesn’t know initial group membership • Not clear what changes need to be made • Checking work is hard

  16. Learning Wesley’s initial permissions: (1) Click “Advanced”; (2) Click “Effective Permissions”; (3) Select Wesley; (4) View Wesley’s Effective Permissions

  17. Learning Wesley’s group membership: (5) Bring up the Computer Management interface; (6) Click on “Users”; (7) Double-click Wesley; (8) Click “Member Of”; (9) Read Wesley’s group membership

  18. Changing Wesley’s permissions: (10) Click “Add…”; (11) Deny Write; (12) Click “Apply”

  19. Checking work: (13) Click “Advanced”; (14) Click “Effective Permissions”; (15) Select Wesley; (16) View Wesley’s Effective Permissions

  20. XP file permissions interface: Poor

  21. Part 2: Common security UI design problems • Poor feedback • Ambiguous labels • Violation of conventions • Hidden options • Omission errors

  22. Problem #1: Poor feedback: (1) Click “Advanced”; (2) Click “Effective Permissions”; (3) Select Wesley; (4) View Wesley’s Effective Permissions

  23. Salmon: immediate feedback

  24. Grid: consolidated feedback

  25. Problem #2: Labels (1/3). XP’s high-level permission labels: Full Control, Modify, Read & Execute, Read, Write, Special Permissions

  26. Problem #2: Labels (2/3). XP’s thirteen low-level access types: Full Control, Traverse Folder/Execute File, List Folder/Read Data, Read Attributes, Read Extended Attributes, Create Files/Write Data, Create Folders/Append Data, Write Attributes, Write Extended Attributes, Delete, Read Permissions, Change Permissions, Take Ownership

  27. Salmon: clearer labels

  28. Grid: fewer, clearer labels

  29. Problem #3: Violating interface conventions

  30. Problem #3: Violating interface conventions

  31. Salmon: better checkboxes

  32. Grid: direct manipulation

  33. Problem #4: Hidden options

  34. Problem #4: Hidden options: (1) Double-click entry; (2) Click “Advanced”; (3) Click “Delete” checkbox

  35. Salmon: All options visible

  36. Grid: Even more visibility

  37. Problem #5: Omission errors

  38. Salmon: Feedback helps prevent omission errors

  39. Grid: No omission errors

  40. FLOCK: Summary of design problems • Feedback poor • Labels ambiguous • Omission error potential • Convention violation • Keeping options visible

  41. Part 3: Evaluation of XP and Salmon • Conducted laboratory-based user studies • Formative and summative studies for Salmon • I’ll focus on summative evaluation

  42. Advice for user studies • Know what you’re measuring! • Maintain internal validity • Maintain external validity

  43. Common usable security metrics • Accuracy – with what probability do users correctly complete tasks? • Speed – how quickly can users complete tasks? • Security – how difficult is it for an attacker to break into the system? • Etc. – satisfaction, learnability, memorability

  44. Measure the right things! • Speed is often useless without accuracy (e.g., setting file permissions) • Accuracy may be useless without security (e.g., easy-to-remember passwords)

  45. Measurement instruments • Speed – Easy; use a stopwatch, time users • Accuracy – Harder; need unambiguous definitions of “success” and “failure” • Security – Very hard; may require serious math, or lots of hackers

  46. Internal validity • Internal validity: Making sure your results are due to the effect you are testing • Manipulate one variable (in our case, the interface: XP or Salmon) • Control or randomize other variables • Use the same experimenter • Experimenter reads directions from a script • Tasks presented in the same text to all users • Assign tasks in a different order for each user • Assign users randomly to one condition or the other
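The last two controls above, randomized task order and random condition assignment, might be scripted like this. This is a hypothetical sketch; the function names and seeds are illustrative, not from the study:

```python
import random

CONDITIONS = ("XP", "Salmon")

def assign_conditions(participants, seed=42):
    """Randomly assign participants to one interface condition,
    balanced so each condition receives half of the participants."""
    rng = random.Random(seed)
    pool = list(participants)
    rng.shuffle(pool)
    half = len(pool) // 2
    return {p: (CONDITIONS[0] if i < half else CONDITIONS[1])
            for i, p in enumerate(pool)}

def task_order(tasks, participant_id, seed=42):
    """Give every participant the same tasks, each in an independent,
    reproducible random order (seeded per participant)."""
    rng = random.Random(f"{seed}-{participant_id}")
    order = list(tasks)
    rng.shuffle(order)
    return order

assignment = assign_conditions([f"P{i}" for i in range(24)])
print(sum(1 for c in assignment.values() if c == "XP"))  # 12 per condition
```

Seeding per participant makes each ordering reproducible, which helps when re-running analyses or auditing the study protocol.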

  47. External validity • External validity: Making sure your experiment can be generalized to the real world • Choose real tasks • Sources of real tasks: • Web forums • Surveys • Your own experience • Choose real participants • We were testing novice or occasional file-permissions users with technical backgrounds (so CMU students & staff fit the bill)

  48. User study compared Salmon to XP • Seven permissions-setting tasks, I’ll discuss two: • Wesley • Jack • Metrics for comparison: • Accuracy (measured as deviations in users’ final permission bits from correct permission bits) • Speed (time to task completion) • Not security – left that to Microsoft
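The accuracy metric above, deviations of the user's final permission bits from the correct bits, might be computed like this. This is a sketch; the dict-of-booleans representation is an assumption, not the study's actual scoring instrument:

```python
def permission_deviations(final: dict, correct: dict) -> int:
    """Count permission bits in the user's final state that differ
    from the correct state; 0 deviations means a fully accurate task."""
    return sum(1 for bit, value in correct.items() if final.get(bit) != value)

# Wesley task: the correct outcome is READ allowed, WRITE denied
correct = {"READ": True, "WRITE": False}
print(permission_deviations({"READ": True, "WRITE": False}, correct))  # 0
print(permission_deviations({"READ": True, "WRITE": True}, correct))   # 1
```

Counting deviations rather than scoring pass/fail gives a graded measure, so a user who fixes one of two required bits scores better than one who fixes neither.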

  49. Study design • Between-participants comparison of interfaces • 12 participants per interface, 24 total • Participants were technical staff and students at Carnegie Mellon University • Participants were novice or occasional file permissions users

  50. Wesley and Jack tasks • Wesley task: Initial state: Wesley allowed READ & WRITE. Final state: Wesley allowed READ, denied WRITE. What needs to be done: Deny Wesley WRITE. • Jack task: Initial state: Jack allowed READ, WRITE, & ADMINISTRATE. Final state: Jack allowed READ, denied WRITE & ADMINISTRATE. What needs to be done: Deny Jack WRITE & ADMINISTRATE.
