
Usable Security – Are we nearly there yet?





Presentation Transcript


  1. Usable Security – Are we nearly there yet? M. Angela Sasse Head of Information Security Research Director, Research Institute in Science of Cyber Security University College London, UK

  2. History (Ancient) • The system must be substantially, if not mathematically, undecipherable; • The system must not require secrecy and can be stolen by the enemy without causing trouble; • It must be easy to communicate and remember the keys without requiring written notes, it must also be easy to change or modify the keys with different participants; • The system ought to be compatible with telegraph communication; • The system must be portable, and its use must not require more than one person; • Finally, regarding the circumstances in which such system is applied, it must be easy to use and must neither require stress of mind nor the knowledge of a long series of rules. Auguste Kerckhoffs, ‘La cryptographie militaire’, Journal des sciences militaires, vol. IX, pp. 5–38, Jan. 1883, pp. 161–191, Feb. 1883.

  3. History (Middle Ages) “It is essential that the human interface be designed for ease of use, so that users routinely and automatically apply the protection mechanisms correctly. Also, to the extent that the user’s mental image of his protection goals matches the mechanisms he must use, mistakes will be minimized.” J. H. Saltzer & M. D. Schroeder, ‘The protection of information in computer systems’ Proceedings of the IEEE, vol. 63, no. 9, pp. 1278-1308, Sept. 1975

  4. History (Recent) • Study on the escalating cost of password resets at BT • Too-high workload leads users to shortcut security mechanisms • Users don’t understand threats and risks • Also 1999: Whitten & Tygar, “Why Johnny can’t encrypt” Adams & Sasse, CACM 1999

  5. What Has Happened Over The Past Decade? • Lots, arguably: • ACM SOUPS (Symposium On Usable Privacy and Security) since 2004 • SHB (Security & Human Behaviour) since 2008 • Papers in CHI, CCS, Usenix, NSPW … • Books: Cranor & Garfinkel, Shostack, Lacey • University modules on usable security • US National Academy of Sciences Workshop on Usable Security and Privacy 2009

  6. And – is security more usable? • Exhibit 1: Authentication • Exhibit 2: Access Control • Exhibit 3: Encryption • Exhibit 4: CAPTCHAs

  7. Authentication • Lots of alternative authentication proposals • Mostly graphical; example: Passfaces • Very memorable • … until you have more than one Passfaces password (Everitt et al., CHI 2009) • Selection biases result in low guessing difficulty

  8. Passpoints Wiedenbeck et al. IJHCS 2005

  9. Draw-a-Secret & BDAS Yan et al.

  10. More ‘usable’ authentication ... • Authentication via Rorschach inkblot tests • Singing your password (Reynaud et al., NSPW 2007) • Thinking your password - free EEG thrown in (Thorpe et al., NSPW 2005) – now possible with Emotiv helmet? • More biometrics (some dubious, some useful) • Ringing up your friends in the middle of the night to provide you with previously entrusted re-set codes (Microsoft)

  11. Passwords are plaguing people more than ever before • 6-8 passwords per employee within organisation, despite single sign-on (SSO) (Inglesant & Sasse, CHI 2010) • Getting worse: • Longer passwords • Increasing number of self-service re-sets, with • New layers of credentials added (e.g. challenge questions) • Interaction on new devices

  12. Old security + new device = not usable, not secure • 50%+ of password entries now on touchscreens • entry time & errors 3–5× higher (Schaub et al., MUM 2012) • → severely reduced password space
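A back-of-the-envelope illustration (my own arithmetic, not from the slide) of why touchscreen-driven character choices shrink the effective password space: if users avoid symbols that require keyboard-layer switching and stick to lowercase letters, the search space an attacker faces collapses.

```python
def space(alphabet_size: int, length: int) -> int:
    """Number of possible passwords of the given length over an alphabet."""
    return alphabet_size ** length

# All 95 printable ASCII characters vs. lowercase-only, 8 characters each.
full = space(95, 8)
lower = space(26, 8)
print(f"full space:      {full:.3e}")
print(f"lowercase-only:  {lower:.3e}")
print(f"reduction:       {full // lower}x")
```

The reduction factor is in the tens of thousands, even before accounting for users also picking dictionary words.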

  13. The Great Authentication Fatigue • More authentication than before, culminating in The Great Authentication Fatigue (Sasse et al., Procs HCII 2014) • Illustrated by NIST study: • 20+ authentications/day • 10% failure rate (with ensuing recovery activity) • Significant impact on individual and organisational productivity • Not just time spent on security task: cost of disruption
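The productivity arithmetic above can be sketched as follows; the per-login and per-recovery times are assumed figures chosen for illustration, not numbers from the NIST study:

```python
def daily_auth_cost(logins_per_day=20, failure_rate=0.10,
                    seconds_per_login=10, seconds_per_recovery=120):
    """Seconds per employee per day spent on authentication alone."""
    routine = logins_per_day * seconds_per_login
    recovery = logins_per_day * failure_rate * seconds_per_recovery
    return routine + recovery  # excludes the (larger) cost of disruption

print(daily_auth_cost())  # → 440.0 seconds/day under these assumed numbers
```

Note that the model deliberately omits the cost of task disruption, which the slide identifies as the bigger drain.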

  14. Authentication ‘Wall of Disruption’

  15. Employees’ coping strategies • Batching and planning of activities to limit the number of logins • Storing passwords or writing them down

  16. Impact on productivity – long-term • User opt out of services, return devices • Improves their productivity, but often reduces organizational productivity (example: email) • Organization has less control over alternatives • Stifling innovation: new opportunities that would require changes in security • Staff leaving organization to be more productive/creative elsewhere

  17. Impact on security • User errors - even when they are trying to be secure • Coping strategies create vulnerabilities • Non-compliance/workarounds to get tasks done • ‘Noise' created by habitual non-compliance makes malicious behavior harder to detect • Lack of appreciation of/respect for security creates a dysfunctional security culture

  18. When breaking rules becomes the norm • People forget • A low signal-to-noise ratio, which makes hostile activity harder to detect

  19. High cost – and patchy security

  20. Password leak 1…

  21. … and Password Leak 2!

  22. Comp8 zombie – and why it could get worse • Comp8 (complex 8-character) password standard – usable for 1-2 passwords with frequent use, but dictionary + history checks & expiry create an impossible workload • What is the security risk? It can be managed better without burdening users • Why it could get worse: on the basis of an mTurk study, Shay et al. (CHI 2014) argue that 12-15 character passwords are “usable”

  23. Security: “users should make the effort” “An hour from each of 180 million online users (in the US) is worth approximately $2.5 billion. A major error in security thinking has been to treat as free a resource that is actually extremely valuable.” C. Herley, More Is Not the Answer, IEEE S&P Jan/Feb 2014
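Herley's figure is simple arithmetic worth making explicit: $2.5 billion spread over 180 million user-hours values each hour at roughly the average US hourly wage.

```python
users = 180_000_000   # US online users (Herley's figure)
total_value = 2.5e9   # dollar value of one hour from each user
per_hour = total_value / users
print(f"${per_hour:.2f} per user-hour")  # → $13.89 per user-hour
```

Any security mechanism that burns minutes of user time at scale is therefore spending real money, whether or not it appears in anyone's budget.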

  24. “Technology should be smarter than this!” • Move from explicit to implicit authentication: • Proximity sensors to detect user presence • Behavioural biometrics: zero-effort, one-step, two-factor authentication • Exploit modality of interaction: use video-based authentication in video, audio in audio, etc. • Web fingerprinting can identify users – why not use it for good?

  25. Digital Natives are getting restless

  26. … and elephants are getting together

  27. Research green shoots: Pico • Cambridge University research project headed up by Frank Stajano • Aim: “To liberate computer users from the inconvenience and insecurity of passwords.” • Design directive: “You won't have to remember any secrets to authenticate.” • Method: moving from something you know (passwords) to something you have (wearable cryptographic technology) • See https://www.cl.cam.ac.uk/~fms27/pico/

  28. Exhibit 2: Access Control • Access control settings – RBAC, Sharepoint etc. • Widespread circumvention via emailing, password sharing, Dropbox (Bartsch & Sasse, ECIS 2013) • Over-entitlements: access reviews by managers – battle ground in many organisations • Green shoots: user self-reviews

  29. Exhibit 3: Encryption • Special Agent Johnny still can’t encrypt (Clark et al., USENIX 2011) • PKI-based solutions could fix many problems (e.g. phishing) but are too difficult for users, developers, and too expensive • Green shoots: Simply Secure foundation aiming to create usable encryption tools and blueprint for development process

  30. Exhibit 4: CAPTCHAs – making humans prove they are not bots • Not a particularly effective security measure • Not usable: failure rate around 40% - so customers go elsewhere • “CAPTCHAs waste 17 years of human effort every day” • (Pogue, Scientific American March 2012)

  31. Our non-compliance studies • Financial institution (8 interviews) • Technology company (9 interviews) • US government agency (24 interviews) • Utility company (118 interviews + 1200 survey responses) • Telco (98 interviews + 600 survey responses) • UK defence (6 interviews with auditors) • Mechanisms: authentication, access control, USB, encryption, tokens/badges

  32. Stop obsessing about the UI – focus on economics! • Design failures are deeper than the UI – mis-alignment of goals and risk perceptions • Must account for the cost of security – accept there is a limited budget, and work with it • Need to focus on, and fit with, user goals and tasks • Fit least-disruptive mechanism – automate if possible

  33. The Compliance Budget (Beautement et al., NSPW 2008) [Diagram: perceived individual cost of compliance plotted against organisational cost of compliance; the compliance threshold is the point at which perceived individual cost exceeds perceived benefit]
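The Compliance Budget idea can be rendered as a toy simulation. This is my own simplification of Beautement et al.'s model, with made-up cost units: each security task draws on a finite budget of perceived effort, and once the threshold is exhausted the user switches to workarounds.

```python
def simulate(task_costs, threshold):
    """Return 'comply'/'workaround' per task under a finite compliance budget."""
    spent = 0.0
    outcomes = []
    for cost in task_costs:
        if spent + cost <= threshold:
            spent += cost
            outcomes.append("comply")
        else:
            outcomes.append("workaround")
    return outcomes

# Same tasks, smaller budget: compliance stops sooner.
print(simulate([1, 1, 1, 1], threshold=3))
print(simulate([1, 1, 1, 1], threshold=1.5))
```

The design point the model captures is that non-compliance is not random: it is a predictable response once cumulative perceived cost passes the threshold, which is why reducing per-task friction (or automating tasks away) buys back compliance.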

  34. Security: wake up and smell the coffee “… security must make its way in an extremely competitive environment. Not only are there no un-claimed pools of user effort to be had, it is difficult to preserve existing pools from incursions. It is hard to reserve time, effort, screen real-estate or techniques for security when each of them is a valuable and monetizable resource.” C. Herley: More Is Not the Answer, IEEE Security & Privacy Jan/Feb 2014

  35. Conclusions • Are we nearly there yet? No – if anything, things have become worse • Need to minimise the workload and friction of security in use, and model/predict it during the design stage • Radical thought: give security a budget (say, 3%), and ask them to use it wisely

  36. Work in progress • Transforming existing deployments • Workload and friction audit, database • Use ‘Shadow Security’ practices as starting point for re-design (Kirlappos et al., 2014) • Transform security habits (Pfleeger et al. 2014) • During development • Use cases with personas and workload gauges (Sentire, Porter et al. RE 2014) • Assess enrolment tasks with NASA TLX with small user groups • Develop personas with specific demand thresholds

  37. Final appeal: End obstacle security

  38. PAS and engage and work with users, instead of patronising them

  39. Questions? http://xkcd.com/538/
