
Walt Disney World’s Biometric System



Presentation Transcript


  1. Walt Disney World’s Biometric System providing “Usability Magic”

  2. Diplomacy Disclaimer Disclosure

  3. Usability of Biometric Systems, May 23, 2007
  http://www.noblis.org/BiometricIdentificationClusterGroup.asp

  These tests were performed for the Department of Homeland Security in accordance with section 303 of the Border Security Act, codified as 8 U.S.C. 1732. Specific hardware and software products identified in this report were used in order to perform the evaluations described in this document. In no case does such identification imply recommendation or endorsement by the National Institute of Standards and Technology, nor does it imply that the products and equipment identified are necessarily the best available for the purpose.

  Mary Theofanos, NIST, 301 975-5889, mary.theofanos@nist.gov

  4. What are our key operational issues?
  • “Auto-enrollment” – enrollment must occur like verification; no time for multiple presentations
  • Maximum device intuitiveness – no opportunity for training or instruction (potential language barriers)
  • We define throughput as the average transaction time for all transactions, from insertion of ticket to time of next insertion, including failures to acquire, repeat attempts, etc.
  • Device technology must be acceptable to the user population
  We felt we had set “the bar” for ourselves internally...
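The throughput definition above can be computed directly from a turnstile log. A minimal sketch, assuming only a list of ticket-insertion timestamps (the data and function name are illustrative, not Disney's actual tooling):

```python
# Sketch: throughput as defined above -- the average time from one ticket
# insertion to the next, so failures to acquire and repeat attempts are
# automatically counted rather than discarded.

def average_transaction_time(insertion_times):
    """insertion_times: ticket-insertion timestamps in seconds, in order."""
    if len(insertion_times) < 2:
        raise ValueError("need at least two insertions to measure a transaction")
    gaps = [b - a for a, b in zip(insertion_times, insertion_times[1:])]
    return sum(gaps) / len(gaps)

# Hypothetical log of five guests; the slow 30 s gap (a failure to acquire
# followed by repeat attempts) is included in the average.
times = [0.0, 11.0, 23.5, 53.5, 65.0]
print(average_transaction_time(times))  # 16.25
```

Measuring insertion-to-insertion, rather than per-attempt, is what makes this a true operational throughput number.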

  5. Early Throughput Time History
  [Chart: seconds per guest at early trials; Aug-95: 30.4, Mar-96: 19.0, May-96: 16.0, Nov-96: 14.7, Jan-97: 11.5. Reference line: current non-biometric turnstile time (5.5).]
  Throughput improvements: around 9.0 seconds is where we had hoped to get to…

  6. WDW Hand Geometry System with trained Users: EER ≈ 9%
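For context, the equal error rate (EER) quoted above is the operating point where the false accept rate (FAR) equals the false reject rate (FRR). A rough sketch of how an EER is read off a set of matcher scores, with invented genuine and impostor score lists (not WDW data):

```python
# Sweep a decision threshold over invented matcher scores and take the
# point where FAR (impostors scoring >= threshold) and FRR (genuine users
# scoring < threshold) are closest; report their average as the EER.

def eer(genuine, impostor):
    best = None
    for t in sorted(set(genuine) | set(impostor)):
        far = sum(s >= t for s in impostor) / len(impostor)
        frr = sum(s < t for s in genuine) / len(genuine)
        if best is None or abs(far - frr) < best[0]:
            best = (abs(far - frr), (far + frr) / 2)
    return best[1]

genuine = [0.9, 0.85, 0.8, 0.75, 0.7, 0.6, 0.4]    # hypothetical scores
impostor = [0.5, 0.45, 0.4, 0.3, 0.2, 0.1, 0.05]
print(eer(genuine, impostor))
```

With these toy lists the FAR and FRR cross at one error in seven on each side, i.e. an EER of about 14%; a real evaluation would interpolate over far larger score sets.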

  7. Testing in 2000
  Our testing provided us with incomplete results due to a lack of comprehensive interface standards at the time; however, we did find out what we needed to know…

  8. Transaction Time (Seconds)

  System               Mean  Median  Minimum  Time includes entry of PIN?
  Face                   15      14       10  Excluded
  Fingerprint-Optical     9       8        2  Excluded
  Fingerprint-Chip       19      15        9  Excluded
  Hand                   10       8        4  Included
  Iris                   12      10        4  Included
  Vein                   18      16       11  Included
  Voice                  12      11       10  Excluded

  Ref: Tony Mansfield et al., “Biometric Product Testing”, Version 1.0, March 19, 2001, Centre for Mathematics and Scientific Computing (CMSC), National Physical Laboratory, U.K.

  We still thought that finger-scanning technology offered the fastest potential transaction times, and this publication confirmed our instincts…
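Summary statistics like those in the table come straight from raw per-transaction times. A sketch with the standard library; the sample times are invented (chosen so they happen to match the Fingerprint-Optical row), not the NPL data:

```python
# Reproduce mean / median / minimum transaction time from a raw list of
# per-transaction times (seconds). The sample is hypothetical.
import statistics

times = [2, 5, 7, 8, 8, 9, 10, 14, 18]
print(statistics.mean(times))    # 9
print(statistics.median(times))  # 8
print(min(times))                # 2
```

Reporting the median alongside the mean matters for throughput work: a few slow outliers (the “95th percentile person” of slide 11) pull the mean up without moving the median.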

  9. What have we done? “Magic Your Way!”

  10. “Magic Your Way!” The environment is the same; however now…. We require biometric verification of ALL guests age 10 or older. This population still includes elderly, foreigners, adolescents, disabled (and we still allow “opting-out”). Our guests must still be able to present their biometric feature while wearing sunglasses, hats, suntan lotion, and jewelry. In addition, they are still carrying children, tote bags, food, strollers, etc. Our guests are still often sweating, tired, anxious, and confused.

  11. Comments on Doddington’s Zoo: …The animals in the zoo are mainly relevant to system performance in terms of accuracy. I propose that there are other animals that are relevant mainly to throughput. Allow me to introduce the Biometric Circus McGurkus (apologies to Dr. Seuss):
  • The Flummox – the animal that Darwin might eventually take care of, but that hasn’t happened yet, so you will have to deal with them. Also known as the “throughput killer” or “95th percentile person”
  • The Juggling Jot – the animal with its offspring in a pouch, food and water reserves in some adapted appendage, and quite often showing signs of excitement and stress leading to repeated presentation attempts
  • The Wink-hooded Hoodwink – the crafty, sneaky animal with the self-proclaimed superior intellect that is going to test the system by switching fingers, simultaneously proclaiming innocence and ignorance

  12. Holistic System View
  Ref: Mary Theofanos, “Usability of Biometric Systems”, May 23, 2007, National Institute of Standards and Technology (NIST), USA
  [Timeline diagram: the participant approaches and the system starts capture (the primary opportunity, signaled or unsignaled); the participant presents and the attempt starts; the system makes a capture attempt; the attempt ends and the system ends capture; capture thresholding follows, after which an unacceptable attempt causes capture to be repeated (a secondary opportunity) and an acceptable attempt leads to the next attempt.]
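The capture loop in this holistic view can be sketched as a simple routine: within one opportunity, capture attempts repeat until one passes thresholding or the presentations run out. The quality scores and threshold below are invented for illustration only:

```python
# Loose sketch of the NIST "holistic system view" capture loop: each
# presentation is a capture attempt; thresholding decides whether capture
# is repeated (unacceptable attempt) or the transaction moves on.

def run_opportunity(attempt_qualities, threshold=0.6):
    """Return (accepted, attempts_used) for one capture opportunity."""
    for n, quality in enumerate(attempt_qualities, start=1):
        if quality >= threshold:   # acceptable attempt: proceed
            return True, n
        # unacceptable attempt: capture repeated (secondary opportunity)
    return False, len(attempt_qualities)   # failure to acquire

print(run_opportunity([0.3, 0.5, 0.8]))  # (True, 3): accepted on 3rd attempt
```

Counting attempts per opportunity, not just final outcomes, is what connects this model to throughput: every repeated capture adds time between ticket insertions.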

  13. Ref: Mary Theofanos, “Usability of Biometric Systems”, May 23, 2007, National Institute of Standards and Technology (NIST), USA
  [Diagram detail: the participant approaches, the system starts capture (opportunity, signaled or unsignaled), the participant presents, and the attempt starts.]
  Affordance – I know what to do! Invitance – Let me at it!

  14. So, we held the vendor responsible for all of the technical development of the sensor package itself, and we took ownership of the sensor enclosure and ergonomic interface elements:
  • Enclosure mock-ups would be created for visual acceptability
  • Ergonomic interface mock-ups would be created and tested to compare effectiveness
  • Testing requirements would be established, and milestones would be set in the schedule that allowed for “go/no-go” checkpoints
  So, let’s take a look at some examples of the intensity of the efforts that were involved…

  15. In May of 2005, the vendor delivered ten (10) alpha prototypes that were retrofitted into commercially available optical TIR scanners We used these to conduct a Phase I - Technology test, as well as to study the human factors of what people would do when instructions were minimal

  16. We also surveyed our test users, with some interesting findings: Q1: It would be easy to figure out how to use this device without any instruction. > 70%

  17. We also surveyed our test users, with some interesting findings: Q2: I would not be concerned about privacy issues when using this device. > 54%

  18. We also surveyed our test users, with some interesting findings: Q4: What did you like about this device? (Responses: Easy, Fast, Other)

  19. We also surveyed our test users, with some interesting findings: Q5: What did you not like about this device? (Responses: Hygiene, Security/privacy, Nothing (no improvement needed), Other)

  20. In Sept. of 2005, we received the first beta test unit (it was nicknamed OCCAM: One CAMera), and it supposedly came with full sunlight capabilities…

  21. Early enclosure mock-ups, made from foam-core, to match what we might end up with for a package…

  22. Early ergonomic interface mock-ups that led to the final design…

  23. We did some “quiet”, early testing in Nov. 2005 with guests and a mocked-up enclosure and some ergonomic interface mock-ups to see if we could learn enough about what worked and what didn’t…

  24. Even during the brief OCCAM Beta Test, we were able to simply gesture and have untrained users successfully get authorized at a much faster pace…

  25. OCCAM Beta Test results from Nov. 16-18, 2005
  • Presentations by >800 guests (one turnstile, 3 partial days)
  • Various sunlight intensities, one hour of nighttime use
  • Tested 4 types of user interface surface: The Post, The Tall, The X, None
  • Tested 3 levels of yaw at 0, 15, and 30 degrees
  • Tested 3 levels of pitch at 0, 5, and 10 degrees

  26. Categorizing sources of finger-scanning rejection
  We divided rejections into 7 categories of placement errors, where the cause was discernible:
  • Fingertip presented
  • Finger not flat
  • Side of finger presented
  • Finger motion during capture
  • Early lift-off of finger during capture
  • Finger rotated over 15 degrees from axis
  • Finger too far forward
  A 10-degree forward pitch appeared to be the best angle for reducing finger placement problems. Degree of yaw seemed to make no difference (15 degrees was decided upon to…)
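Tallying rejections by placement-error category is straightforward once each rejected presentation is labeled. A sketch with invented log entries (labels abbreviated from the list above; not WDW's actual data):

```python
# Count how often each placement-error category appears in a rejection log,
# most frequent first. The log entries below are made up for illustration.
from collections import Counter

rejections = [
    "fingertip presented", "finger not flat", "fingertip presented",
    "early lift-off", "finger motion", "fingertip presented",
]
counts = Counter(rejections)
for category, n in counts.most_common():
    print(f"{category}: {n}")
```

Ranking categories this way is what lets an interface change (pitch, yaw, a physical guide) be judged against the specific errors it was meant to reduce.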

  27. OCCAM Beta Test II results from Dec. 14, 2005
  • Presentations by >550 guests (one turnstile, 1 partial day)
  • Tested top 2 types of user interface surface (one modified; to be called the hockey puck):
  • Ramp @ 0 degrees yaw (total placement error = 4.9%)
  • Ramp @ 25 degrees yaw (total placement error = 5.86%)
  • Hockey Puck @ 25 degrees yaw (total placement error = 4.59%)
  We had a winner!
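The total placement error figures above are simply placement-error rejections divided by presentations. A sketch with invented counts, chosen only so they land near the reported hockey-puck figure:

```python
# Total placement error rate, in percent, from raw counts. The counts
# below are hypothetical (~25 placement errors in ~545 presentations),
# picked to illustrate how a figure like 4.59% is derived.

def placement_error_rate(errors, presentations):
    return 100.0 * errors / presentations

print(round(placement_error_rate(25, 545), 2))  # 4.59
```

With only a few hundred presentations per condition, differences this small (4.59% vs 4.9%) are suggestive rather than conclusive, which is why the test was run across multiple days and conditions.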

  28. So we ended up with an ergonomic design that looked something like this… Novel enough to file for a patent…

  29. Current distance from platen surface to surface of metal = 2.2 mm (0.0863”)
  • Suggest a small arc be added to prevent the finger from sliding too far forward
  • Suggest the arc be ~2.8 mm high
  • Top of arc would be 5.0 mm from the platen surface
  • Arc would come up to the edge of the platen

  30. Preliminary data from internal Beta tests with untrained staff as users: EER < 3%

  31. So what was the end result of the project?
  • Still a one-to-one verification
  • Verification is still anonymous; we do not know the identity of the passholder at the turnstile
  • Our enrollment is still transparent: a single feature presentation with truly minimal operator intervention or feedback
  • We are now operating at approx. 11.5 seconds average per transaction, which was our goal…and our accuracy is in the 5-6% FAR range, with a total rejection rate of 11-12% and well below 1.0% FTA

  32. Don’t ask me – ask them… 49,000,000 x 70% = 34,000,000. In our 1st full year of new sensor operation, we’ve processed over twice the number of people that were processed in the previous 10 years of operations! (Each sensor has had well over 200,000 touches so far…)

  33. What have we done since implementation? Put them in at Hong Kong Disneyland! But what’s wrong with this picture?

  34. In conclusion, our lessons learned:
  • There is still no magic-bullet device for all applications, but we did build a better “mousetrap”
  • Implementing an enterprise-sized biometric system upgrade with a proprietary data interface is both costly and painful to accomplish
  • A change in biometric technology does not solve all your problems; it exchanges them for other problems

  35. Questions?
