
Usability and Security



  1. Usability and Security

  2. Overview • Limits of Usability for Security • 10 basic principles (Ka-Ping Yee) • Psychological foundations • Key Management • Bad Usability – or why Johnny can't encrypt • User Intentions • Authority Reduction • Testing UIs

  3. Limits of Usability • Usability cannot answer security questions which offer no real alternatives • Usability cannot reduce a system's ability to do damage • Pushing security problems onto users is NOT usability!

  4. Intention vs. Effect (diagram) • Intention of user X: read THIS attachment, be responsive, do the work ("Do you really want to read this mail from your boss?") • Action: double-click the attachment • Effect: the program executes and issues system calls with user X's full rights over all of user X's objects in the operating system

  5. Basic Principles for Secure Interfaces • Path of Least Resistance: Match the most comfortable way to do tasks with the least granting of authority. • Active Authorization: Grant authority to others in accordance with user actions indicating consent. • Revocability: Offer the user ways to reduce others' authority to access the user's resources. • Visibility: Maintain accurate awareness of others' authority as relevant to user decisions. • Self-Awareness: Maintain accurate awareness of the user's own authority to access resources. • Trusted Path: Protect the user's channels to agents that manipulate authority on the user's behalf. • Expressiveness: Enable the user to express safe security policies in terms that fit the user's task. • Relevant Boundaries: Draw distinctions among objects and actions along boundaries relevant to the task. • Identifiability: Present objects and actions using distinguishable, truthful appearances. • Foresight: Indicate clearly the consequences of decisions that the user is expected to make. From Ka-Ping Yee, see resources

  6. A Trusted Path Problem • Is this really a system dialog? • Do colors allow separating system messages from foreign images? • Does separation by screen location work? • How do we know whether the user or a program triggered an action?

  7. Click-Jacking (diagram) • The application overlays a system dialog: the visible, harmless text ("Do you want to …") hides an install-program or overwrite-file dialog underneath • The Cancel/OK buttons the user clicks actually belong to the hidden system dialog

  8. Psychological Foundations I • Confirmation Bias: Looking for evidence that confirms our assumptions. Ignoring counter-evidence and warnings. • Blind Spot Bias: Ignoring our own tendencies and prejudices. • Projection Bias: Assuming that everybody thinks like we do. Causes big surprises, e.g. when a "hacker" uses our interfaces in a totally different way. • Personality Type Bias: Only about 7% of the population are technical "geeks", yet it is exactly this group that builds applications and interfaces for the general population. (Due to technical evolution, even geeks behave like "normal" people in most circumstances.) • Pattern Recognition Bias: People tend to focus on recognizing simple patterns instead of doing complex analysis, especially when under pressure. • Subjective Validation Bias: Hypochondriacs believe that every illness they hear of is one they have. Something strange with your machine? It can only be a virus… After Gutmann, see resources

  9. Psychological Foundations II • Rationalisation: We invent seemingly rational explanations, explaining things away, even when we should recognize that something critical and abnormal is going on. • Emotional Bias: Emotions like depression or optimism dominate decisions and can override logical considerations. • Inattentional Blindness: Not seeing things slightly outside our focus of attention, like car drivers not seeing motorbike riders. This makes "Simon Says" a bad strategy. • Zero-Risk Bias in security experts: By accepting only 100% secure solutions (which are hardly ever possible), they tend to ignore security mechanisms which would work in most cases and are much simpler to implement. (Certificates vs. local history) After Gutmann, see resources

  10. Social Foundations • "Forced" trust within groups (exchange of passwords) • Social cohesion through sharing secrets • Traditions, expectations etc. (politeness)

  11. "Simon Says" The "Personal Message" is supposed to guarantee that the message is from VISA. It could still be a man-in-the-middle (MITM) attack, but I doubt that most users will understand the importance of this field – or notice if it is missing!

  12. Key Management and Identities • Keeping Keys and Petnames constant • Offer only Petnames • Distribute Keys automatically • Use Proxies for key handling Remember: some security is better than none!

  13. Names and Keys (diagram) • Keys: unforgeable • Nicknames: owner-defined • Petnames: user-defined. But machines can help us and keep the petname/key relation constant!
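A petname table can be sketched in a few lines. This is an illustrative sketch only (the class and method names are made up, not from any real library): the unforgeable key fingerprint is the lookup key, the user-assigned petname is what the UI ever displays, and anything without a petname is shown with a loud warning label instead of a forgeable name.

```python
# Hypothetical petname table: fingerprint (unforgeable) -> petname (user-defined).
class PetnameTable:
    def __init__(self):
        self._by_fingerprint = {}

    def assign(self, fingerprint, petname):
        """The user binds a memorable local name to an unforgeable key."""
        self._by_fingerprint[fingerprint] = petname

    def display_name(self, fingerprint):
        """Only ever show the petname; unknown keys get a warning label."""
        if fingerprint in self._by_fingerprint:
            return self._by_fingerprint[fingerprint]
        return "UNKNOWN KEY " + fingerprint[:8]

table = PetnameTable()
table.assign("a3f29c01deadbeef", "Alice (work)")
print(table.display_name("a3f29c01deadbeef"))  # Alice (work)
print(table.display_name("ffff000011112222"))  # UNKNOWN KEY ffff0000
```

Because the mapping lives only on the user's machine, a remote attacker cannot forge the petname, which is exactly the "machines keep the relation constant" point above.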

  14. TrustBar shows a known site

  15. Why Johnny Can't Encrypt • Most mail is sent unencrypted and unsigned • Few client certificates are registered • Many websites use unknown or self-signed certificates • Users do not understand the concepts behind Certificate Authorities • Certificate exchange before communication starts does not work • PKI with Certificate Authorities does not work for users: what is a name? Renewal? Revocation? Key management? • Most programs do not help with security ("Verified by Visa" example) • Most programs lie to the user about their security and privacy (lock icons, "delete" that does not really delete, etc.) From Simson Garfinkel's thesis on usable security, which contains lots of empirical tests and data on the above issues.

  16. Usability Guidelines that can help Johnny: good security now • Create Keys When Needed: Ensure that cryptographic protocols that can use keys will have access to keys, even if those keys were not signed by the private key of a well-known Certificate Authority. • Key Continuity Management: Use digital certificates that are self-signed or signed by unknown CAs for some purpose that furthers secure usability, rather than ignoring them entirely. This, in turn, makes possible the use of automatically created self-signed certificates from individuals or organizations that are unable or unwilling to obtain certificates from well-known Certification Authorities. • Track Received Keys: Make it possible for the user to know if this is the first time that a key has been received, if the key has been used just a few times, or if it is used frequently. • Track Recipients: Ensure that cryptographically protected email can be appropriately processed by the intended recipient. • Migrate and Backup Keys: Prevent users from losing their valuable secret keys. From: [S. Garfinkel, Thesis]
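The "Track Received Keys" guideline is essentially trust-on-first-use with continuity checking. A minimal sketch, with invented names (this is not Garfinkel's actual implementation): remember the key on first contact, count how often it has been seen since, and warn loudly if a known sender suddenly shows up with a different key.

```python
# Illustrative key-continuity store: sender -> (fingerprint, times_seen).
class KeyContinuityStore:
    def __init__(self):
        self._keys = {}

    def observe(self, sender, fingerprint):
        """Record one sighting of a key and classify it for the user."""
        if sender not in self._keys:
            self._keys[sender] = (fingerprint, 1)
            return "first contact: remember key"
        known, count = self._keys[sender]
        if known != fingerprint:          # continuity broken -> warn
            return "WARNING: key changed for " + sender
        self._keys[sender] = (known, count + 1)
        return "key seen %d times" % (count + 1)

store = KeyContinuityStore()
print(store.observe("bob@example.org", "k1"))  # first contact: remember key
print(store.observe("bob@example.org", "k1"))  # key seen 2 times
print(store.observe("bob@example.org", "k2"))  # WARNING: key changed ...
```

Note what the user is told: not "is this certificate valid?", but "have I seen this key before?", a question most users can actually act on.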

  17. Security Mediation: Local vs. Foreign Local mediation: • Petnames allow the use of local (well-known) names • Local object names are well known too • Communication history is known locally as well • The history of a key • Spatial order of local objects • Private actions to cause some action • Least-privilege principle used when commands are executed • Getting to know somebody via friends Foreign mediation: • Certificates which relate keys with non-unique names • Certificate Authorities whose root certificates establish huge sets of trust relations • External judgements on sites via color coding or trust statements • Signed code to download software • Programs which define the user interface and the names used (window titles etc.)

  18. A Proxy should automatically handle keys • By automatically signing messages, keys are distributed • Available keys are used for encryption as well • If a key changes, the user is notified • Can be used to implement peer-to-peer communication, with Bob signing several keys and sending them to his communication partners. [S. Garfinkel, Thesis]
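The proxy's outgoing path can be sketched as follows. All names and the message format here are assumptions for illustration, and no real cryptography is performed: every message is signed (so the sender's key spreads as a side effect), and encryption happens opportunistically whenever a key for the recipient has already been collected.

```python
# Sketch of the outgoing side of a key-handling proxy (no real crypto).
def proxy_outgoing(message, recipient, own_key, known_keys):
    """known_keys maps recipient addresses to previously received public keys."""
    out = {"body": message,
           "signature": "signed-with-" + own_key}   # always sign: distributes key
    if recipient in known_keys:                      # opportunistic encryption
        out["encrypted_for"] = known_keys[recipient]
    return out

known = {"alice@example.org": "alice-pubkey"}
print(proxy_outgoing("hi", "alice@example.org", "my-key", known))
print(proxy_outgoing("hi", "stranger@example.org", "my-key", known))
```

The point of the design is that the user never manages keys explicitly: signing, collecting, and opportunistic encryption all happen as side effects of ordinary mailing.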

  19. Capturing User Intentions

  20. (diagram: user intentions → program → effects on objects) • Are the necessary actions available? • Is the path to actions and objects safe from impersonation? • Can the user understand the actions and their consequences? • Can users authorize actions granting minimum rights in a convenient way? • Can users transfer access granting minimum rights? • Does the program really do what the user wants?

  21. (diagram) • Intention: work on file "X" • Act of designation: the user drags the X file icon onto the program icon, i.e. selects the object and combines it with the authority to work on it • Action: the desktop starts the program with the X capability • Effect: the program reads/writes through the X capability – ONLY file X can be modified!
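The drag-and-drop slide above describes authority by designation, and the mechanism can be sketched in a few lines. This is a toy model with invented names (an in-memory dict stands in for the file system): the program receives only a capability for the designated file and has no way to reach anything else.

```python
# Toy capability model: `store` stands in for the file system.
class FileCapability:
    """Wraps exactly one path; the holder can only touch this file."""
    def __init__(self, path, store):
        self._path, self._store = path, store

    def read(self):
        return self._store[self._path]

    def write(self, data):
        self._store[self._path] = data

def run_editor(cap):
    # The program gets ONLY the capability, never the whole file system.
    cap.write(cap.read().upper())

files = {"/home/x/report.txt": "draft", "/home/x/secret.txt": "hidden"}
run_editor(FileCapability("/home/x/report.txt", files))  # act of designation
print(files["/home/x/report.txt"])  # DRAFT
print(files["/home/x/secret.txt"])  # hidden – untouched
```

The single user gesture (drag icon onto program) both names the object and grants the authority, so least privilege costs the user no extra dialogs.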

  22. Bad Usability gives bad security

  23. Ambient Authority and Usability • Only when a system is able to reduce authority properly can usability increase security • A system without much ambient authority NEEDS good usability to transfer power from users to actors in a CONVENIENT and safe way!

  24. Authority Reduction Architecture (diagram) • Per-user static access rights (ACLs) • Secure desktop: designation of object and action by the user (trusted path + authority by designation) • Transformation of names into capabilities and creation of a power box per application (an authority container for the application, with a dialog option) • Granular distribution of authority (capabilities) to program modules • Granular delegation of authority to single objects. We need to narrow authority down from the global rights matrix (ACLs or access control matrix) of a user's rights to the minimum authority necessary to execute a function. Test: try to find out how many rights you REALLY need to copy a file!
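The "copy a file" test at the end of the slide has a short answer that can be made concrete. In this hypothetical sketch (names invented, a dict stands in for the file system), the power box hands the copy routine exactly two capabilities, read-the-source and write-the-target, instead of the user's whole rights matrix:

```python
# Hypothetical powerbox: narrow a user's full authority to what one task needs.
def powerbox_for_copy(files, src, dst):
    """Return the minimal authority needed to copy src to dst: two capabilities."""
    return {
        "read_src": lambda: files[src],
        "write_dst": lambda data: files.__setitem__(dst, data),
    }

def copy_file(caps):
    # The copy routine sees two closures; no other file is reachable from here.
    caps["write_dst"](caps["read_src"]())

files = {"/a": "payload"}
copy_file(powerbox_for_copy(files, "/a", "/b"))
print(files["/b"])  # payload
```

A compromised copy routine could at worst corrupt the one target file, not the user's entire home directory, which is the point of narrowing ambient authority.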

  25. Testing the UI

  26. (diagram) • The program is the System Under Test (SUT) • Test materials cover the terms used, the actions to perform, etc. • Test results answer two questions: Did the user understand and manage? Does the program do what it says?

  27. (diagram) The test cycle: • Design the UI according to user-centered design principles • Create hypotheses about the understanding of terms and actions • Create test materials for terms and actions ("What does term Y mean?", "Please use the program to perform the following task…") • Perform empirical tests • Analyze the test results and change the design accordingly

  28. Resources • [Gutm] Peter Gutmann, Security Usability Fundamentals, draft 2008, http://www.cs.auckland.ac.nz/~pgut001/pubs/usability.pdf • [KS] W. Kriha, R. Schmitz, Usability und Security, in KES 03/2007, journal of the Bundesamt für Sicherheit in der Informationstechnik, http://www.bsi.bund.de/literat/forumkes/kes0307.pdf • [GM] S. Garfinkel, R. Miller, Johnny 2: A User Test of Key Continuity Management with S/MIME and Outlook Express, presented at the Symposium on Usable Privacy and Security (SOUPS 2005), July 6-8, 2005, Pittsburgh, PA, online at http://www.simson.net/cv/pubs.php • [Sti1] M. Stiegler, An Introduction to Petname Systems, http://www.skyhunter.com/marcs/petnames/IntroPetNames.html • [Sti2] M. Stiegler, The Skynet Virus: Why it is unstoppable; How to stop it (video), http://www.skyhunter.com/marcs/skynet.wmv • [VISA] Verified by Visa concept, see http://usa.visa.com/personal/security/visa_security_program/vbv/how_it_works.html • [WT] A. Whitten, J. D. Tygar, Why Johnny Can't Encrypt, chapter 34 in [CG] (originally in Proc. 8th USENIX Security Symposium, Washington, D.C., 1999) • [Yee02a] K. P. Yee, User Interaction Design for Secure Systems, UC Berkeley Tech Report CSD-02-1184, 2002, online at http://www.sims.berkeley.edu/~ping/sid/uidss-may-28.pdf
