
BENCHMARKING FINGERPRINT ALGORITHMS



  1. BENCHMARKING FINGERPRINT ALGORITHMS Dr. Jim Wayman, Director US National Biometric Test Center San Jose State University email: biomet@email.sjsu.edu

  2. CONSTRUCT MODEL CONDUCT EXPERIMENTS TO DERIVE MODEL PARAMETERS ERROR ANALYSIS SMALL SAMPLE SIZE GENERALIZABILITY OF SAMPLE POPULATION MAKING SCIENTIFIC PREDICTIONS

  3. COMPETING DESIGN REQUIREMENTS • THROUGHPUT RATE • NUMBER OF FALSE MATCHES • PROBABILITY OF FALSE NON-MATCH • HARDWARE COSTS

  4. 5 INTER-DEPENDENT OPERATIONAL PARAMETERS • HARDWARE COMPARISON RATE • PENETRATION RATE • BIN ERROR RATE • FALSE MATCH RATE • FALSE NON-MATCH RATE

  5. HARDWARE MATCH RATE • NUMBER OF COMPARISONS PER SECOND • 8,000 TO 300,000+ AVAILABLE • SEVERAL DOLLARS PER MATCH PER SECOND

  6. PENETRATION RATE • PERCENTAGE OF THE DATABASE THAT WILL BE COMPARED TO AVERAGE INPUT SAMPLE • “BINNING” BASED ON ENDOGENOUS MEASURES • “FILTERING” BASED ON EXOGENOUS MEASURES
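The penetration-rate idea above can be sketched numerically. Assuming (my assumption, not stated on the slide) that an input print is compared only against the database prints placed in its own bin, the expected penetration rate is the sum of the squared bin proportions. The pattern-class bins and proportions below are purely illustrative:

```python
# Hypothetical bin proportions for one endogenous measure
# (e.g., pattern class); the values are illustrative, not real data.
bins = {"left_loop": 0.32, "right_loop": 0.32, "whorl": 0.30, "arch": 0.06}

# An input falling in bin i is compared against the fraction p_i of the
# database in that bin, so the expected penetration is sum of p_i**2.
penetration = sum(p * p for p in bins.values())
print(f"penetration rate: {penetration:.3f}")
```

Under this sketch, finer binning lowers penetration (fewer comparisons per search) but, as the next slide notes, raises the chance that matching prints land in different bins.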

  7. BIN ERROR RATE • MATCHING PRINTS PLACED IN DIFFERENT BINS • BIN ERRORS LEAD DIRECTLY TO FALSE NON-MATCHES • PENETRATION AND BIN ERROR RATE TRADE-OFF

  8. FALSE MATCH RATE • PROBABILITY THAT TWO COMPARED PRINTS WILL BE INCORRECTLY FOUND TO MATCH

  9. FALSE NON-MATCH RATE • PROBABILITY THAT TWO COMPARED PRINTS WILL BE INCORRECTLY FOUND NOT TO MATCH • COMPETES WITH FALSE MATCH RATE

  10. SYSTEM ERROR RATES • M INDEPENDENT PRINTS • FIRST-ORDER APPROXIMATIONS • ERROR BOUNDS
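The error-bounds point above can be illustrated with a generic binomial confidence interval on an observed error rate. This is a textbook normal-approximation sketch, not the speaker's analysis, and it assumes independent comparison trials, which the small-sample-size caveat on slide 2 warns may not hold:

```python
import math

def error_rate_ci(errors: int, trials: int, z: float = 1.96):
    """Normal-approximation 95% confidence interval for an error rate.
    Assumes independent, identically distributed trials."""
    p = errors / trials
    half = z * math.sqrt(p * (1 - p) / trials)
    return max(0.0, p - half), min(1.0, p + half)

# With few errors observed, the interval is wide relative to the estimate:
lo, hi = error_rate_ci(errors=5, trials=1000)
print(f"observed 0.005, 95% CI ~ [{lo:.4f}, {hi:.4f}]")
```

The width of this interval is what makes predictions from small test sets hard to generalize to an operational population.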

  11. SYSTEM THROUGHPUT • THROUGHPUT AND ERROR RATES LINKED TO PENETRATION • DOMINATED BY HUMAN FACTORS FOR SMALL-SCALE SYSTEMS

  12. SYSTEM EQUATIONS • FALSE NON-MATCH • FALSE MATCH • THROUGHPUT
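The slide names the three system equations without writing them out. A commonly used first-order form, assumed here rather than taken from the talk, combines the five operational parameters of slide 4 like this:

```python
def system_rates(fnmr, fmr, bin_err, penetration, db_size, hw_rate):
    """First-order approximations (assumed forms, not necessarily the
    speaker's exact equations). A false non-match occurs if the input is
    mis-binned or fails its comparison; false matches accumulate over the
    ~P*N prints actually compared; search time is comparisons / hw rate."""
    sys_fnm = 1 - (1 - bin_err) * (1 - fnmr)       # ~ bin_err + fnmr
    comparisons = penetration * db_size
    sys_fm = 1 - (1 - fmr) ** comparisons          # ~ P*N*fmr when fmr small
    seconds_per_search = comparisons / hw_rate
    return sys_fnm, sys_fm, seconds_per_search

# Illustrative numbers only: 100,000-print database, 30% penetration,
# 40,000 comparisons/s hardware.
fnm, fm, t = system_rates(fnmr=0.05, fmr=1e-6, bin_err=0.01,
                          penetration=0.30, db_size=100_000, hw_rate=40_000)
print(f"system FNM {fnm:.4f}, system FM {fm:.4f}, {t:.2f} s/search")
```

The sketch makes the competing requirements of slide 3 concrete: lowering penetration cuts both the false-match accumulation and the search time, but only at the cost of more bin errors feeding the false non-match rate.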

  13. TESTING DATABASE • 4,080 “TRAINING” PRINTS • BEST QUALITY POSSIBLE • 80 IDENTIFIED “PRACTICE” PRINTS • 4,128 “TEST” PRINTS • BEST QUALITY EXPECTED IN OPERATION • 3,276 MATCH ONE OR MORE “TRAINING” PRINTS

  14. ANALYSIS • BINNING OF “TRAINING” PRINTS • BINNING OF “TEST” PRINTS • MATCHING RESULTS

  15. SMALL-SCALE VENDORS • SELF-SELECTED CATEGORY BASED ON ABILITY TO PERFORM 17 MILLION COMPARISONS • VENDORS SUPPLY COMPILED CODE • SCANNER SPECIFIC ALGORITHMS LIMIT USEFULNESS OF RESULTS

  16. SMALL-SCALE RESULTS

  17. CONCLUSIONS • SYSTEM MODELS ARE UNDERSTOOD • CONFIDENCE INTERVALS ARE BECOMING UNDERSTOOD • TESTING FROM CANNED DATABASES MAY NOT ALWAYS PRODUCE REASONABLE PERFORMANCE ESTIMATIONS
