
Collecting Multimodal Biometric Data


Presentation Transcript


  1. Collecting Multimodal Biometric Data. Ross J. Micheals, Image Group (Charlie Wilson, Manager), Information Access Division (Martin Herman, Chief), National Institute of Standards and Technology. International Meeting of Biometrics Experts, 23 March 2004

  2. Challenges The United States government has no multimodal database of face, fingerprint, and iris images suitable for evaluation.

  3. Multimodal Biometrics • Initial motivation: Collect an iris image database • Data collections have substantial fixed costs • Additional sensors are relatively less expensive • Extension of original goal: Collect a multimodal biometric database

  4. Iris Recognition • Iris images are an ICAO (International Civil Aviation Organization) approved biometric • Large market expansion anticipated in early 2005 at expiration of iris recognition concept patent • Iris recognition systems have been deployed internationally and are in operation today

  5. Multimodal Biometrics • There are inherent correlations among different biometric modalities • NIST Face Recognition Vendor Test • Young females (face vs. fingerprints) • Chinese (face vs. iris?) • More data is an opportunity to discover additional relationships • Multimodal data is being collected right now, every day (US-VISIT)

  6. MBARK: Multimodal Biometric Accuracy Research Kiosk • MBARK is an externally deployable, multimodal biometric acquisition and information system • NIST as the maintainer, synchronizer, and gatekeeper • Two major purposes: • To collect biometric data • To obtain data about collecting biometrics • Multi-agency project • Department of Homeland Security (S&T, TSA) • Intelligence Technology Innovation Center (ITIC) • Department of State

  7. MBARK: Multimodal Biometric Accuracy Research Kiosk • Current goal for one MBARK session • Eighteen face images (two sets of nine each) • Forty fingerprints (two sets on two sensors) • Four iris images (two sets of two each) • Most of the data will be sequestered for use in future evaluations • Small portions of the data will be released for scientific and research purposes
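The per-session targets above can be captured in a small record. A minimal Python sketch follows; the class and field names are hypothetical illustrations, not MBARK's actual data schema:

```python
from dataclasses import dataclass

# Hypothetical record of the per-session acquisition targets listed above;
# the names are illustrative only, not MBARK's actual schema.
@dataclass
class SessionTargets:
    face_images: int = 18    # two sets of nine each
    fingerprints: int = 40   # two sets on two sensors
    iris_images: int = 4     # two sets of two each

    def total_samples(self) -> int:
        # Total biometric samples expected from one kiosk session.
        return self.face_images + self.fingerprints + self.iris_images
```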

  8. Aside: Privacy • Rule of thumb: "Would we want to be in the database?" • Suppose we release face, fingerprint, and iris images of a subject in the database • Critical to ensure that multiple modalities could not be synchronized outside of NIST and Privacy Act protection • Conclusion: Release one and only one modality per subject externally
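The "one and only one modality per subject" release policy can be made mechanical. A hedged sketch, where the hash-based assignment is purely illustrative (the slide does not say how the modality is actually chosen):

```python
import hashlib

MODALITIES = ("face", "fingerprint", "iris")

def released_modality(subject_id: str) -> str:
    """Deterministically pick exactly one modality to release for a subject,
    so the same subject can never be linked across two public datasets."""
    digest = hashlib.sha256(subject_id.encode("utf-8")).digest()
    return MODALITIES[digest[0] % len(MODALITIES)]
```

Because the choice is a pure function of the internal subject identifier, the assignment stays stable across releases without any extra bookkeeping state.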

  9. Research & Operational Needs • Data collections should address a real operational need or a specific research question • Data collected to evaluate a deployed system would be an operational motivation • The design of MBARK reflects a mixture of operational and research needs • MBARK • Face: Operational and research • Fingerprint & Iris: Operational

  10. MBARK: Face • Nine color cameras • Five megapixels per image • Olympus C5050Z • Some reliability problems • Operational • Multiple images • Research • Multiple images (FRVT 2002) • Texture-based • Image-based 3D

  11. MBARK: Fingerprint • Optical slap scanners • Smiths-Heimann LS2 • CrossMatch ID500 • Operational • Ohio WebCheck • Sensor comparisons

  12. MBARK: Iris • Oki IrisPass-WG • Near-infrared illumination • Grayscale iris images • Two irises in one sitting • User does not need to manipulate the camera • Primarily an operationally driven component

  13. MBARK: Registration • Identification of subjects returning later • Using a well-studied model (US-VISIT) as an aid to identify subjects on return visits • Single fingerprint scanner • CrossMatch Verifier 300

  14. Open Systems • NIST evaluations typically emphasize open systems • Ensures interoperability among components • Prevents deployments from being locked into any particular vendor • Requires component evaluations • Example: Face Recognition Vendor Test (FRVT) and Fingerprint Vendor Technology Evaluation (FpVTE) compared algorithms over a set of common images

  15. System vs. Component Evaluation • The iris-recognition market is system oriented • I.e., what you buy is meant to be used in an end-to-end system, rather than as an interoperable component • How does this affect image-based evaluations? • Hypothetical example: • "MH Electrics," iris camera manufacturer • "EyeRidian," iris recognition software

  16. [Diagram: an iris recognition system built from the MH 5000 iris camera, its control software, and the EyeRidian v1.2 iris recognition algorithm]

  17. For "high" quality images, recognition is 99.99%. But suppose only 70% of all data is high quality. [Diagram: the EyeRidian v1.2 algorithm appearing twice inside the iris recognition system, as both the image-quality module and the matcher]
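The arithmetic behind this caveat is simple: the headline 99.99% applies only to the 70% of images that are high quality, so the system-level rate also depends on how the matcher fares on the remaining 30%. A sketch, where the 90% low-quality rate is an assumed placeholder, not a number from the slides:

```python
def effective_match_rate(high_q_fraction: float,
                         high_q_rate: float,
                         low_q_rate: float) -> float:
    # Weighted average of the match rate over the two quality strata.
    return high_q_fraction * high_q_rate + (1.0 - high_q_fraction) * low_q_rate

# 99.99% on the 70% of high-quality images; the 90% rate on the
# remaining 30% is an assumption for illustration only.
rate = effective_match_rate(0.70, 0.9999, 0.90)
```

Even a modest drop on the low-quality stratum pulls the overall rate well below the headline figure, which is exactly why the next slides ask what happens to the other 30%.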

  18. [Diagram only: the same iris recognition system, with the EyeRidian v1.2 image-quality and recognition modules]

  19. What about the 30% of images that are not "high" quality? How might other algorithms do on these images? If there are no images of sufficient quality, the sensor reports a failure to acquire (FTA). FTA data is usually not available for image-based evaluations. [Diagram: the full system, with the MH 5000 iris camera, MH control software, image-quality module, and EyeRidian v1.2 recognition algorithm]
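The FTA behavior described above can be sketched as sensor control logic. The function names and the shape of the quality gate are hypothetical; the point is only that rejected attempts never produce an image at all:

```python
from typing import Callable, Optional, Sequence

def acquire(candidates: Sequence[bytes],
            quality: Callable[[bytes], float],
            threshold: float) -> Optional[list]:
    """Return the images that pass the vendor's quality gate, or None on a
    failure to acquire (FTA). Images rejected here never reach the image
    database, so a later image-based evaluation never sees them."""
    passing = [img for img in candidates if quality(img) >= threshold]
    return passing or None  # empty list -> FTA
```

This is why image-based evaluations of such a system are biased toward the images the vendor's own quality module chose to keep.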

  20. [Diagram only: the MH 5000 iris camera, MH control software, image-quality module, and EyeRidian v1.2 recognition algorithm]

  21. Conclusion • In component testing, be aware of the internals of each component and how evaluations might be affected • For some modalities, we can reduce bias by using a mix of sensors • Example: many fingerprint scanners, each with different control logic • For other modalities, testing components requires more sensitivity • The degree to which such bias can be minimized depends on the state of the market and vendor support

  22. Questions?
