
EE515/IS523 Think Like an Adversary Lecture 7 Usability /Software Failures

This lecture discusses the concepts of identification, authentication, and authorization, and the building blocks of authentication. It also explores various authentication mechanisms, their evaluation, and common problems with passwords. Additionally, alternative authentication methods such as mnemonic passwords, password keeper software, biometrics, graphical passwords, and browser-based mutual authentication are examined.


Presentation Transcript


  1. EE515/IS523 Think Like an Adversary, Lecture 7: Usability/Software Failures. Yongdae Kim

  2. Recap • http://security101.kr • E-mail policy • Include [ee515] or [is523] in the subject of your e-mail • Student Survey • http://bit.ly/SiK9M3 • Student Presentation • Send me an email. • Pre-proposal meeting: today after class

  3. Definitions • Identification - a claim about identity • Who or what I am (global or local) • Authentication - confirming that claims are true • I am who I say I am • I have a valid credential • Authorization - granting permission based on a valid claim • Now that I have been validated, I am allowed to access certain resources or take certain actions • Access control system - a system that authenticates users and gives them access to resources based on their authorizations • Includes or relies upon an authentication mechanism • May include the ability to grant coarse- or fine-grained authorizations, and to revoke or delegate authorizations • Also includes an interface for policy configuration and management
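To make these definitions concrete, here is a minimal sketch (Python, with an invented in-memory user store and role table) of keeping authentication and authorization as separate steps. None of the names or data come from the slides; this is illustration only.

```python
import hmac
import hashlib

# Hypothetical in-memory user store: identity -> (salt, password hash, roles)
USERS = {
    "alice": (b"salt1", hashlib.sha256(b"salt1" + b"correct horse").hexdigest(), {"admin"}),
    "bob":   (b"salt2", hashlib.sha256(b"salt2" + b"hunter2").hexdigest(), {"reader"}),
}
PERMISSIONS = {"admin": {"read", "write", "configure"}, "reader": {"read"}}

def authenticate(identity: str, password: str) -> bool:
    """Confirm the claim 'I am <identity>' using something the user knows."""
    record = USERS.get(identity)
    if record is None:
        return False
    salt, stored_hash, _ = record
    claimed = hashlib.sha256(salt + password.encode()).hexdigest()
    return hmac.compare_digest(claimed, stored_hash)  # constant-time comparison

def authorize(identity: str, action: str) -> bool:
    """Grant or deny an action based on the authenticated identity's roles."""
    _, _, roles = USERS[identity]
    return any(action in PERMISSIONS[role] for role in roles)

if authenticate("alice", "correct horse") and authorize("alice", "write"):
    print("access granted")
```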

  4. Building blocks of authentication • Factors • Something you know (or recognize) • Something you have • Something you are • Two factors are better than one • Especially two factors from different categories • What are some examples of each of these factors? • What are some examples of two-factor authentication?
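As one concrete "something you have" factor, the sketch below computes an RFC 6238 time-based one-time password (TOTP), the mechanism behind common authenticator apps and some hardware tokens. The base32 secret is a made-up test value, not anything from the lecture.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, digits: int = 6, period: int = 30) -> str:
    """RFC 6238 TOTP: HMAC-SHA1 over the current 30-second time counter."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // period
    msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226)
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

# Example only: a base32 test secret, not a real credential.
print(totp("JBSWY3DPEHPK3PXP"))
```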

  5. Authentication mechanisms • Text-based passwords • Graphical passwords • Hardware tokens • Public key crypto protocols • Biometrics

  6. Evaluation • Accessibility • Memorability • Security • Cost • Environmental considerations

  7. Typical password advice

  8. Typical password advice • Pick a hard-to-guess password • Don’t use it anywhere else • Change it often • Don’t write it down. So what do you do when every web site you visit asks for a password?

  9. Bank = b3aYZ Amazon = aa66x! Phonebill = p$2$ta1

  10. Problems with Passwords • Selection • Difficult to think of a good password • Passwords people think of first are easy to guess • Memorability • Easy to forget passwords that aren’t frequently used • Difficult to remember “secure” passwords with a mix of upper & lower case letters, numbers, and special characters • Reuse • Too many passwords to remember • A previously used password is memorable • Sharing • Often unintentional through reuse • Systems aren’t designed to support the way people work together and share information

  11. Mnemonic Passwords • Phrase: Four score and seven years ago, our Fathers • First letter of each word (with punctuation): Fsasya,oF • Substitute numbers for words or similar-looking letters: 4sa7ya,oF • Substitute symbols for words or similar-looking letters: 4s&7ya,oF Source: Cynthia Kuo, SOUPS 2006
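A short script can mimic the transformation shown above: take the first letter of each word, keep the punctuation, and substitute digits or symbols for whole words. The substitution table below is an illustrative guess, not Kuo's exact rule set.

```python
import re

# Hypothetical substitution table for illustration (not an official rule set).
SUBS = {"four": "4", "seven": "7", "and": "&", "to": "2", "for": "4"}

def mnemonic(phrase: str) -> str:
    """First character of each word, keeping trailing punctuation, with word substitutions."""
    out = []
    for token in phrase.split():
        word = re.sub(r"[^\w]", "", token)            # word without punctuation
        trailing = token[len(word):]                  # keep trailing punctuation like ','
        out.append(SUBS.get(word.lower(), word[:1]))  # substitute whole word or take 1st letter
        out.append(trailing)
    return "".join(out)

print(mnemonic("Four score and seven years ago, our Fathers"))  # -> 4s&7ya,oF
```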

  12. The Promise? • Phrases help users incorporate different character classes in passwords • Easier to think of character-for-word substitutions • Virtually infinite number of phrases • Dictionaries do not contain mnemonics Source: Cynthia Kuo, SOUPS 2006

  13. Mnemonic password evaluation • Mnemonic passwords are not a panacea for password creation • No comprehensive dictionary today • May become more vulnerable in future • Many people start to use them • Attackers incentivized to build dictionaries • Publicly available phrases should be avoided! Source: Cynthia Kuo, SOUPS 2006

  14. Password keeper software • Runs on a PC or handheld device • You only need to remember one password
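A rough sketch of what such software does internally, assuming the third-party cryptography package is installed: derive a key from the single master password and encrypt the per-site passwords with it. The iteration count and stored entries are arbitrary illustrations.

```python
import base64
import hashlib
import json
import os
from cryptography.fernet import Fernet  # assumes `pip install cryptography`

def vault_key(master_password: str, salt: bytes) -> bytes:
    """Derive a Fernet key from the one master password (PBKDF2-HMAC-SHA256)."""
    raw = hashlib.pbkdf2_hmac("sha256", master_password.encode(), salt, 600_000)
    return base64.urlsafe_b64encode(raw)

salt = os.urandom(16)
key = vault_key("the-one-password-you-remember", salt)

# Encrypt the per-site passwords; only the master password is needed to get them back.
vault = {"bank": "b3aYZ", "amazon": "aa66x!", "phonebill": "p$2$ta1"}
token = Fernet(key).encrypt(json.dumps(vault).encode())

# Later: re-derive the key from the same master password and salt, then decrypt.
restored = json.loads(Fernet(vault_key("the-one-password-you-remember", salt)).decrypt(token))
print(restored["bank"])
```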

  15. Single sign-on • Login once to get access to all your passwords

  16. Biometrics

  17. Fingerprint Spoofing • Devices • Microsoft Fingerprint Reader • APC Biometric Security device • Success! • Very soft piece of wax flattened against hard surface • Press the finger to be molded for 5 minutes • Transfer wax to freezer for 10-15 minutes • Firmly press modeling material into cast • Press against the fingerprint reader • Replicated several times

  18. Retina/Iris Scan • Retinal Scan • Must be close to camera (IR) • Scanning can be invasive • Not user-friendly • Expensive • Iris Scan • Late to the game • Requires advanced technology to properly capture the iris • Users do not have to consent to have their identity tested

  19. Graphical passwords

  20. “Forgotten password” mechanism • Email password or magic URL to address on file • Challenge questions • Why not make this the normal way to access infrequently used sites?
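A possible standard-library sketch of the "magic URL" idea: the server signs the account and an expiry time, mails the resulting link, and later verifies the signature and freshness. The URL format, field names, and 15-minute lifetime are assumptions, not part of the slide.

```python
import hashlib
import hmac
import time
from urllib.parse import quote

SERVER_SECRET = b"rotate-me-regularly"   # hypothetical server-side secret
LIFETIME = 15 * 60                       # links expire after 15 minutes (arbitrary choice)

def make_login_link(email: str) -> str:
    """Build a signed, time-limited login URL to email to the address on file."""
    expires = str(int(time.time()) + LIFETIME)
    payload = f"{email}|{expires}".encode()
    sig = hmac.new(SERVER_SECRET, payload, hashlib.sha256).hexdigest()
    return f"https://example.com/login?user={quote(email)}&exp={expires}&sig={sig}"

def verify(email: str, expires: str, sig: str) -> bool:
    """Check the signature and that the link has not expired."""
    payload = f"{email}|{expires}".encode()
    expected = hmac.new(SERVER_SECRET, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected) and int(expires) > time.time()

print(make_login_link("user@example.com"))
```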

  21. Convenient SecurID 1 • What problems does this approach solve? • What problems does it create? Source: http://worsethanfailure.com/Articles/Security_by_Oblivity.aspx

  22. Convenient SecurID 2 • What problems does this approach solve? • What problems does it create? Previously available at: http://fob.webhop.net/

  23. Browser-based mutual authentication • Chris Drake’s “Magic Bullet” proposal • http://lists.w3.org/Archives/Public/public-usable-authentication/2007Mar/0004.html • User gets an ID, password (or alternative), image, and hotspot at enrollment • Before the user is allowed to log in, they are asked to confirm the URL and SSL cert and click buttons • Then the login box appears and the user enters their username and password (or alternative) • Server displays a set of images, including the user’s image (or, if the user entered an incorrect password, a random set of images appears) • User finds their image and clicks on the hotspot • Image manipulation can help prevent replay attacks • What problems does this solve? • What problems doesn’t it solve? • What kind of testing is needed?
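A server-side sketch of the image-and-hotspot step in the proposal, using invented data structures: the user's enrolled image is mixed into the grid only when the password is correct, and the click is then checked against the enrolled hotspot. This is one interpretation of the described flow, not Drake's specification.

```python
import random

# Hypothetical enrollment records: user -> (password, image id, hotspot rectangle)
ENROLLED = {"alice": ("s3cret", "img_alice", (120, 80, 160, 120))}  # (x1, y1, x2, y2)
DECOY_IMAGES = [f"decoy_{i:03d}" for i in range(100)]

def image_challenge(user: str, password: str, n: int = 9) -> list[str]:
    """Return the image grid; the real image appears only on a correct password."""
    images = random.sample(DECOY_IMAGES, n)
    record = ENROLLED.get(user)
    if record and password == record[0]:        # plaintext check kept short for the sketch
        images[random.randrange(n)] = record[1]  # slot the user's image into the grid
    return images

def check_click(user: str, clicked_image: str, x: int, y: int) -> bool:
    """Second step: the user must click their own image inside the enrolled hotspot."""
    _, image_id, (x1, y1, x2, y2) = ENROLLED[user]
    return clicked_image == image_id and x1 <= x <= x2 and y1 <= y <= y2
```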

  24. Phishing

  25. Spear Phishing (Targeted Phishing) • Personalized mail for a (small) group of targeted users • Employees, Facebook friends, Alumni, eCommerce Customers • These groups can be obtained through identity theft! • Content of the email is personalized. • Different from Viagra phishing/spam • Combined with other attacks • Zero-day vulnerability: no patch available yet • Rootkit: operates at or below the OS kernel, very difficult to detect with AV software • Key logger: further obtains IDs/passwords • APT (Advanced Persistent Threat): long-term surveillance

  26. Examples of Spear Phishing

  27. Good Phishing example

  28. Policy and Usability

  29. Cost of Reading Policy Cranor et al. • TR = p × R × n • p is the population of all Internet users • R is the average time to read one policy • n is the average number of unique sites Internet users visit annually • p = 221 million Americans online (Nielsen, May 2008) • R = avg time to read a policy = # words in policy / reading rate • To estimate words per policy: • Measured the policy length of the 75 most visited websites • Reflects policies people are most likely to visit • Reading rate = 250 WPM • Mid estimate: 2,514 words / 250 WPM = 10 minutes

  30. n = number of unique sites per year • Nielsen estimates Americans visit 185 unique sites per month • That doesn’t quite scale by 12, so assume 1,462 unique sites per year • TR = p × R × n = 221 million × 10 minutes × 1,462 sites • R × n ≈ 244 hours per year per person
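The estimate is easy to reproduce; the snippet below simply redoes the slide's arithmetic with the stated figures.

```python
# Reproduce the back-of-the-envelope estimate from the slides.
p = 221_000_000            # Americans online (Nielsen, May 2008)
words_per_policy = 2514    # mid estimate of policy length
reading_rate = 250         # words per minute
n = 1462                   # unique sites visited per person per year

R = words_per_policy / reading_rate      # minutes to read one policy (~10)
hours_per_person = R * n / 60            # ~244-245 hours per person per year
national_hours = p * R * n / 60          # total hours across all users

print(f"R = {R:.1f} min/policy, per person = {hours_per_person:.0f} h/yr, "
      f"national = {national_hours / 1e9:.1f} billion h/yr")
```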

  31. P3P: Platform for Privacy Preferences • A framework for automated privacy discussions • Web sites disclose their privacy practices in standard machine-readable formats • Web browsers automatically retrieve P3P privacy policies and compare them to users’ privacy preferences • Sites and browsers can then negotiate about privacy terms

  32. Why Johnny Can’t Encrypt: A Usability Evaluation of PGP 5.0. Alma Whitten and J.D. Tygar, USENIX Security ’99. Presented by Yongdae Kim. Some of the slides borrowed from Jeremy Hyland.

  33. Defining Usable Security Software • Security software is usable if the people who are expected to use it: • are reliably made aware of the security tasks they need to perform. • are able to figure out how to successfully perform those tasks • don't make dangerous errors • are sufficiently comfortable with the interface to continue using it.

  34. Why is usable security hard? • The unmotivated users • “Security is usually a secondary goal” • Policy Abstraction • Programmers understand the representation but normal users have no background knowledge. • The lack of feedback • We can’t predict every situation. • The proverbial “barn door” • Need to focus on error prevention. • The weakest link • Attacker only needs to find one vulnerability

  35. Why Johnny can’t encrypt? • PGP 5.0 • Pretty Good Privacy • Software for encrypting and signing data • Plug-in provides “easy” use with email clients • Modern GUI, well designed by most standards • Usability evaluation following their definition: “If an average user of email feels the need for privacy and authentication, and acquires PGP with that purpose in mind, will PGP's current design allow that person to realize what needs to be done, figure out how to do it, and avoid dangerous errors, without becoming so frustrated that he or she decides to give up on using PGP after all?”

  36. Usability Evaluation Methods • Cognitive walkthrough • Mentally step through the software as if we were a new user. Attempt to identify the usability pitfalls. • Focus on interface learnability. • Results

  37. Cognitive Walkthrough Results • Irreversible actions • Need to prevent costly errors • Consistency • Status message: “Encoding”?!? • Too much information • More unneeded confusion • Show the basic information; make more advanced information available only when needed.

  38. User Test • PGP 5.0 with Eudora • 12 participants, all with at least some college education and none with advanced knowledge of encryption • Participants were given a scenario with tasks to complete within 90 min • Tasks built on each other • Participants could ask some questions through email

  39. User Test Results • 3 users accidentally sent the message in clear text • 7 users encrypted with their own public key (rather than the recipient’s), and only 2 of the 7 figured out how to correct the problem • Only 2 users were able to decrypt without problems • Only 1 user figured out how to deal with RSA keys correctly • A total of 3 users were able to successfully complete the basic process of sending and receiving encrypted emails • One user was not able to encrypt at all

  40. Conclusion • Reminder: “If an average user of email feels the need for privacy and authentication, and acquires PGP with that purpose in mind, will PGP's current design allow that person to realize what needs to be done, figure out how to do it, and avoid dangerous errors, without becoming so frustrated that he or she decides to give up on using PGP after all?” • Is this a failure in the design of the PGP 5.0 interface, or is it a function of the problem of traditional usable design vs. design for usable secure systems? • What other issues are there? What similar security issues arise elsewhere? What do we learn from this paper?

  41. Analysis of an Electronic Voting System. Tadayoshi Kohno, Adam Stubblefield, Aviel D. Rubin, Dan S. Wallach. February 27, 2004. Presented by: Aldo Villanueva

  42. Outline • Palm Beach Fiasco • Introducing DRE • History of Diebold • Vulnerabilities of Diebold DRE • Summary

  43. Palm Beach Ballot Fiasco

  44. Palm Beach Ballot Fiasco

  45. DRE “Direct Recording Electronic” • Eliminates paper ballots from the voting process. • Process: • The voter arrives at the polling place and proves they are allowed to vote there. • They get a token (PIN or smartcard). • They enter the token into the voting terminal and vote for their candidate. • The DRE system presents the voter’s selections and gives a final chance to make changes.

  46. History • 1995: I-Mark Systems • 1997: Global Election Systems acquired I-Mark • 2002: Diebold acquired GES and changed the name to Diebold Election Systems • 2006: Diebold removed its name from the voting machines for “strategic” reasons • 2007: Diebold changed its name to "Premier Election Solutions"
