
The Role of Human Factors in Forensic Science




  1. The Role of Human Factors in Forensic Science

  2. Outline • Why Human Factors in Forensic Science • Crime Lab Operations: Efficiency and Consistency • Proficiency Testing • Reports and Interpretation by Lawyers & Jurors • Future Work

  3. Human Factors in Forensic Science Tasks of the forensic practitioner: • Prepares forensic evidence for analysis • Analyzes evidence according to procedures • Files a report on results • Communicates results of analyses to the legal community (judges, attorneys), jurors, and the public.

  4. Addressing Human Factors • People in any profession are influenced by their surroundings: cognitive, perceptual, organizational, social, and cultural factors. • We need systems that reduce the impact of biases that such influences may unconsciously inspire, especially when they affect conclusions. • Forensic scientists must communicate their findings accurately and clearly to diverse audiences, without understating or overstating the strength of the evidence.

  5. CSAFE Work • CSAFE researchers have conducted studies in these areas that are having significant impacts on forensic laboratory operations and on procedures for communicating results. • CSAFE teams have produced research on methods and best practices to: • Reduce the risk of error • Minimize the effects of cognitive bias • Improve proficiency testing • Evaluate the impact of human factors • Improve communication of forensic results.

  6. CSAFE Work • CSAFE researchers have conducted studies in these areas that are having significant impacts on forensic laboratory operations and are informing procedures for communicating results. • Projects are organized around three areas: • Crime Lab Operations: Efficiency & Consistency • Proficiency Testing: Enhance methods for assessment and feedback to examiners and laboratories on skills and performance • Communication of Findings: Reports and Interpretation

  7. CSAFE Work Our work would not have been possible without genuine collaboration with our partners: • Collaborative Testing Services (Proficiency exams) • Houston Forensic Science Center (HFSC) • Virginia Department of Forensic Sciences (VADFS) • Allegheny County Office of the Medical Examiner (ACOME)

  8. Crime Lab Operations: Efficiency & Consistency • How are data collected? • What processes are in place? • Can the processes be made more efficient? Projects: • LPE data, processing, decision-making at HFSC (Murrie et al.) • CSAFE-sponsored Quality Associate at HFSC (Maddisen Newman) • Objective criteria informative of evidentiary value (Kafadar)

  9. Project T: Forensic Processing and Human Factors at Crime Laboratories Daniel Murrie, Sharon Kelley, Brett Gardner, Karen Kafadar, & Lucy Guarnera University of Virginia Brandon Garrett Duke University

  10. Project T: Background • Examining case processing to gauge the basic reliability of latent print examination is a crucial first step in understanding and improving: • Statistical foundations for fingerprint evidence (which should allow for better measurements of effective laboratory performance) • Effects of altering case management procedures • Assessment of examiner-specific error rates • No prior work has examined case flow within crime laboratories generally, nor within latent print examination, specifically. • Primary goal: Collaborate with crime laboratories to study case processing and the influence of human factors within latent print examination. • This includes a variety of applications and sub-projects.

  11. Project T: Sub-projects • Laboratory case processing • Collection of data about case flow to gather baseline information (e.g., frequency of verification, consultation, and conflict resolution) and assess effects of examiner differences and changes to case management procedures (e.g., blind proficiency testing and blind verification) • Evidence submission forms • Evaluation of forms to document types of contextual information that have potential to bias processing of, and conclusions about, forensic science evidence • Cognitive bias/pattern evidence audit • Development of system for cognitive bias audits of pattern evidence through collaboration with Midwest Innocence Project • Forensic scientist surveys • Survey of forensic scientists to better understand types of information that they deem task-relevant, and how they conceptualize error in their work

  12. Project T: Sub-project Updates • Laboratory case processing • Development of Project MM • HFSC survey of pattern analysts • Blind Quality Control program data • Collaborations with two additional laboratories in VA and WA • Evidence submission forms • Forensic Science International publication • One in six forms (16.5%) request information that appears to have a high potential for bias without any discernible relevance to latent print comparison • Cognitive bias/pattern evidence audit • Coding of MIP records is ongoing • Forensic scientist surveys • Symposium entitled “The Psychology of Forensic Science: Cognitive Bias and Contextual Effects in Forensic Science Analyses” • Science & Justice publication; another manuscript under review

  13. Project T: Lab Case Processing HFSC survey of pattern analysts—highlights: • Latent print & firearms examiners • Demographics, personality, values, beliefs regarding forensic science, perceptions of coworkers • Analysts had over 12 years of work experience, and about 3.6 years of experience at HFSC • LPEs typically complete about 20 cases/month whereas firearms examiners most often complete 7 cases/month • LPEs testify once a year or less whereas firearms examiners typically testify more than twice yearly • LPEs endorsed a stronger tendency to minimize false positive errors than did firearms examiners • All but one firearms examiner endorsed a balanced approach to minimizing false positives/negatives • Estimates of false positive error rates ranged from 1-in-100 to 1-in-1 billion among LPEs, and from 1-in-100 to 1-in-1 quadrillion among firearms examiners

  14. Project T: Plans for Year 5 • Continue current data collection procedures to examine case processing outcomes after implementation of blind proficiency testing and blind verification • Compile, analyze, and interpret results from our HFSC survey of pattern evidence analysts evaluating how individual differences (e.g., personality, beliefs, values) among forensic analysts contribute to discrepancies in casework processing and performance in blind proficiency testing • Replicate and extend our work in latent print case processing to firearms case processing within HFSC • Replicate and extend our work with HFSC in additional crime laboratories (e.g., King County Crime Lab, Virginia Dept. of Forensic Science) • Finalize coding of MIP cases and share procedures and general principles so that cognitive bias audits can be carried out by other teams

  15. Project MM: Crime Lab Proficiency Testing and Quality Management Daniel Murrie, Sharon Kelley, Brett Gardner, & Karen Kafadar University of Virginia Brandon Garrett Duke University Maddisen Neuman, Alicia Rairden, & Preshious Rearden Houston Forensic Science Center

  16. Project MM: Primary Goals • Bring empirical research into the crime lab, beginning with a long-term collaboration with HFSC • Provide a model to increase the role of academic research in practicing forensic laboratories • Administration of blind quality control tests • Assessment of electronic data to improve case processing and reliability • Use of quantitative metrics to improve accuracy and reliability • Promote improvements within HFSC and among crime laboratories generally by: • Collecting data to inform the science and design further studies in quality control • Producing published research that examines critical issues such as error rates and the use of quality metrics in casework • Assessing the effects of process changes on the accuracy, reliability, and efficiency of crime labs • Inspiring other laboratories to seek out reciprocal relationships with like-minded researchers

  17. Project MM: Position & Timeline • Position (Quality/Research Associate, supported by CSAFE and working full-time at HFSC): maintaining a large-scale blind quality control program; facilitating data collection for studies examining human factors topics within pattern evidence disciplines; collaborating on quality metrics to supplement the blind quality control program; collecting electronic case processing records in pattern disciplines, primarily latent print comparison; serving as primary point of contact between CSAFE researchers and laboratory personnel • Timeline: Approved by NIST (July 2018) • Blinding in Forensic Proficiency Testing and Casework workshop (Nov. 2018) • Hired a Quality/Research Associate (Nov. 2018) • CSAFE visit to HFSC (Dec. 2018): orientation meeting discussing CSAFE goals and future studies/data collection • Duke University workshop sponsored by CSAFE (March 2019) • Florida International University Symposium presentation (May 2019)

  18. Project MM: Projects • Collection of Case Processing Data • Expand initial evaluations of latent print case processing by exploring case flow after the implementation of blind quality control procedures • Implementation of Print Quality Metrics • Use blind quality control fingerprint images to objectively evaluate the quality of all prints and explore whether quality differences are associated with important laboratory outcomes • Survey of Pattern Analysts • Survey analysts on a range of topics, including standard demographics, personality, attitudes and beliefs regarding error in forensic science, values, and perceptions of coworkers and current workload • Analysis of Blind Quality Control Program • Evaluate novel data resulting from a large blind testing program in the United States

  19. Project MM: Example: Blind Quality Control • First blind latent case entered November 2017 • 153 blind cases have been entered since then • Average time from request to report date is 22 days • Quality staff rate all blind prints: NLOV/very little ridge detail: 20.4%; some ridge detail: 40.9%; good ridge detail: 35.5% (*AFIS Negative: AFIS does not return a matching exemplar in the candidate list)

  20. Project MM: Plans for Year 5 • Continue maintenance and expansion of HFSC quality systems • Compile HFSC case processing data from the 2018 and 2019 calendar years for analysis • Incorporate print quality metrics into blind quality control testing procedures • Disseminate findings describing the collaboration, and studies resulting from the collaboration, via conference presentations, workshop attendance, and manuscript preparation

  21. Project MM: Proficiency Testing: Assessment and Feedback

  22. Project V: Latent Print Proficiency Testing Daniel Murrie, Sharon Kelley, & Brett Gardner University of Virginia Brandon Garrett Duke University

  23. Project V: Background • Proficiency Test performance may not reflect true competence or proficiency in the field • Without such metrics, both internal oversight (e.g., by lab managers) and external oversight (e.g., by accrediting bodies) may become more complicated and less meaningful • Primary goal: To better understand (and eventually improve) the process and validity of current proficiency testing efforts • Methodology: Evaluate the current state of proficiency testing via subjective and objective methods

  24. Project V: Method • Surveyed latent print examiners after completing Collaborative Testing Services proficiency tests: perceptions of test items, test-taking procedures • We examined: • survey responses • test performance • print quality metrics

  25. Project V: Key Findings • 2017 Test: Only 3% (14 of 438) of examiners gave an erroneous response; examiners described the test as fairly easy and were highly confident in their answers; examiners perceived more difficult test items as more similar to routine casework; test items perceived as more challenging and more similar to casework contained lower-quality prints; quality metrics suggest that virtually all included prints were very high quality • 2019 Test: Survey results received in April 2019; test results & print quality metrics TBD; 198 examiners responded to the survey; we seek to understand respondents’ characteristics, test-taking procedures, and perceptions of test items (difficulty, clarity, similarity to casework)

  26. Project V: Preliminary Results of 2019 Survey • Examiners: Avg. 12 years of experience (range 1 to 36 years); 64% testify between 1 and 5 times yearly; 51% had been questioned about proficiency testing; 37% had been questioned about error rates • Procedure: Most labs (78%) require no proficiency testing beyond CTS; examiners completed testing in an average of 9.5 hours (median = 6 hrs, range 1-50 hrs); 66% of tests had been verified by another examiner

  27. Project V: Preliminary Results • Perceptions of test items (0 = extremely easy/poor clarity; 10 = extremely difficult/high clarity): Avg. print difficulty: 3.5; Avg. latent clarity: 6.8; Avg. known clarity: 8.0 • Similarity to casework: Examiners who thought the test was more difficult also thought it was more representative of typical casework, r(169) = .27, p < .001
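The correlation reported above, r(169) = .27, is a Pearson coefficient whose degrees of freedom (df = n − 2) imply 171 paired ratings; its significance test uses t = r·√((n − 2)/(1 − r²)). A minimal sketch of both computations (the ratings below are hypothetical illustrations, not survey data):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def t_statistic(r, n):
    """t statistic for testing r != 0, with df = n - 2."""
    return r * math.sqrt((n - 2) / (1 - r ** 2))

# Hypothetical ratings on the slide's 0-10 scales:
difficulty = [3, 5, 2, 6, 4, 7, 3, 5]
similarity = [4, 6, 3, 7, 4, 8, 2, 6]
r = pearson_r(difficulty, similarity)
t = t_statistic(r, len(difficulty))
```

With the reported values, t_statistic(0.27, 171) ≈ 3.6, comfortably above the two-tailed .001 critical value, consistent with the slide's p < .001.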

  28. Project V: Plans for Year 5 • Collect test performance and quality metric data from second CTS study • Interpret findings and disseminate results • Conference presentations • Manuscript preparation • Provide additional feedback and recommendations to CTS based on results • Design another survey for inclusion in the Spring 2020 CTS proficiency test • Expand upon our examinations of the content (e.g., perceived item difficulty) and process of proficiency testing (e.g., examiner effort during testing vs. routine casework)

  29. Project V: VADFS Latent Print Analysis: Examiner or LatentSleuth? • 113 prints judged as low to medium-low quality • Each case: Examiner time vs. LatentSleuth • LatentSleuth requires Examiner set-up time • LS examiner did not know the Examiner’s determination • Does the LS tool reduce time spent examining a case? Linda Jackson (Director), Jessica Davis, Sabrina Cillissen

  30. Project V: Time difference (LatentSleuth − Examiner), minutes. Stem-and-leaf display (stem = tens of minutes, leaf = units); mean difference ≈ 5 min (SE 2 min):
  Low outliers: -115, -92
  -5 | 97
  -4 | 30
  -3 |
  -2 | 6530
  -1 | 988643333300
  -0 | 888888655555544433322222
   0 | 1113344445555566666788899
   1 | 000012222233335555799
   2 | 0022225
   3 | 1237
   4 | 077
   5 | 7
   6 | 03
   7 | 0
   8 | 8
  High outliers: 130, 248 (2+ to 4+ hours longer??)
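In a stem-and-leaf display like the one above, each stem is the tens digit and each leaf a units digit, so the row `3 | 1237` records differences of 31, 32, 33, and 37 minutes. A minimal sketch of building such a display for non-negative values (the input times are hypothetical; the slide's plot also splits out negative stems and low/high outliers):

```python
def stem_and_leaf(values):
    """Return stem-and-leaf rows ('stem | leaves') for non-negative
    integer values, with stem = tens digit and leaf = units digit."""
    rows = {}
    for v in sorted(values):
        rows.setdefault(v // 10, []).append(v % 10)
    return ["%2d | %s" % (stem, "".join(str(leaf) for leaf in leaves))
            for stem, leaves in sorted(rows.items())]

# Hypothetical time differences in minutes:
print("\n".join(stem_and_leaf([1, 3, 12, 15, 15, 27, 31, 48])))
```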

  31. Project V • LS time ≈ 4.6 × (Examiner time)^0.60 • Case time > 45 min: LatentSleuth helps • Study those outlier cases
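The fitted power law implies the 45-minute threshold directly: setting 4.6·t^0.60 = t gives t = 4.6^(1/0.4) ≈ 45 minutes, beyond which the model predicts LatentSleuth is faster than the examiner. A sketch of fitting such a model by ordinary least squares on log-transformed times and solving for the break-even point (the fitting method and data are illustrative assumptions, not the study's actual procedure):

```python
import math

def fit_power_law(x, y):
    """Fit y = a * x^b by ordinary least squares on (log x, log y)."""
    lx = [math.log(v) for v in x]
    ly = [math.log(v) for v in y]
    n = len(x)
    mx, my = sum(lx) / n, sum(ly) / n
    b = (sum((u - mx) * (v - my) for u, v in zip(lx, ly))
         / sum((u - mx) ** 2 for u in lx))
    a = math.exp(my - b * mx)
    return a, b

def break_even(a, b):
    """Time t where a * t^b = t (model crosses the identity line),
    valid for b < 1."""
    return a ** (1.0 / (1.0 - b))

# With the slide's estimates a = 4.6, b = 0.60:
t_star = break_even(4.6, 0.60)  # roughly 45 minutes
```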

  32. Project BB: Blind Proficiency Testing: Designing a methodology for forensic laboratories Robin Mejia CMU Bill Eddy CMU Maria Cuellar University of Pennsylvania Other Participants: Many

  33. Project BB: Background • Blinding is not the norm in forensic laboratories, in casework or in proficiency testing • Research has long shown that blind proficiency tests can provide different results and information than open proficiency tests (for example, Lamotte et al., 1977) • National Academy of Sciences report (1992): laboratories should routinely engage in blind proficiency testing • 2016 National Commission on Forensic Science recommendation to the Attorney General: “require all DOJ FSSPs [forensic science service providers] to seek proficiency testing programs that provide sufficiently rigorous samples that are representative of the challenges of forensic casework.” • Blind proficiency testing is still not standard in forensic labs (Koehler, 2013), even though it has been successfully implemented and proven beneficial in other fields

  34. Project BB: Highlights - Implementation • Implement blind proficiency testing at the Allegheny County Office of the Medical Examiner (ACOME) • Start with toxicology: blood vials to test for blood alcohol concentration (HFSC) • The samples: purchased vials from Research Triangle International (RTI); selected a set of blood alcohol concentrations that are representative of real-life samples; HFSC agreed to provide its test samples to ACOME, so the laboratories collaborate to share the costs of blind proficiency testing • Submission of samples: partner with Pittsburgh Police to submit the samples as if they were real casework (cf. HFSC & Houston Police Dept.); the Quality Manager at ACOME is organizing the blind tests • Challenges that remain: finalize the police partnership; create realistic-looking cases

  35. Project BB: Highlights - Workshop • Nov. 1-2, 2018: Workshop at the Allegheny County Office of the Medical Examiner • Target audience: forensic laboratory directors and quality managers, researchers • Key finding: laboratories want to implement blind proficiency testing but need support to make it happen • HFSC is leading the way (by a decade) • Challenges: logistics, culture • Identified issues highlighted on the following slides

  36. Project BB: Highlights

  37. Project BB: Issues - Logistics • Creating realistic samples • “The handwriting’s too neat and there’s no spelling mistakes. There’s no way a cop could have submitted it.” • Most labs do not have blinding in casework, so to blind a proficiency test, they must build an entire case. • Paying for them • Blind testing will augment existing open testing, not replace it. • Houston estimated its blind testing program costs about as much as open testing. • Reporting • Local level – not releasing the case • Dealing with CODIS, AFIS, etc. • Reporting at a community level (will come back to this) • Coordinating between labs that are implementing blind proficiency testing

  38. Project BB: Issues - Logistics • Laboratory Information Management Systems: • “None of them can handle sequential unmasking” • “They’re all inadequate” • “The thing you learn as a lab director is that the LIMS is more powerful than you are” • Houston selected JusticeTrax 5 because it was the only one that could create a field called “proficiency” and blind it to everyone but quality

  39. Project BB: Issues - Culture/History • Many forensic examiners have historically claimed error rates of zero. It is a major culture change for examiners and managers to admit that errors happen and to focus on identifying the problems that cause errors. • Forensic analyses occur in an adversarial context. Proficiency test results are discoverable and can be used to challenge examiners in court. • Laboratory staff report into a law enforcement chain of command; buy-in from senior management is needed (and can require significant education). • Small labs will find it more challenging to implement blind testing, as they lack the staff and financial resources of a laboratory such as HFSC.

  40. Project BB: Progress • Invited 7 labs to the workshop; all sent representatives: 8 quality managers, 4 laboratory directors, a chief medical examiner, and a director of education • Lab sizes ranged from a single laboratory (< 50 employees) to a 7-lab system with over 200 employees • Virginia had just distributed its first 3 blind test samples • Kentucky: 6 blind drug cases ready to go; change in command • Others in progress (e.g., Miami-Dade), maybe more • Strong desire to collaborate: sharing of SOPs and actual blind tests; request for coordination by CSAFE • Inter-lab studies, aggregation of results (take care not to claim this is a substitute for error rate studies) • Repository for best practices, actual blind tests, etc. • Publications and setting standards

  41. Project BB: Next Steps • Support laboratories implementing blind proficiency testing • Survey laboratory quality managers about information policies and LIMS systems • Analysis of blind proficiency test results from HFSC and others • Creation of a network of laboratories pursuing blind testing • Publication of best practices by writing group formed at fall meeting

  42. Communication of Findings: Reports and Interpretations

  43. Project E: Analysis of Forensic Testimony and Reports • Simon A. Cole • University of California, Irvine • Doctoral students • Alyse Bertenthal, Matt Barno, Valerie King University of California, Irvine

  44. Project E: Background/Problem • CSAFE “ . . . works to build a statistically sound and scientifically solid foundation for the analysis and interpretation of forensic evidence . . .” • To what extent do we have “statistically sound . . . interpretation of forensic evidence” now? • Can provide a baseline for CSAFE’s efforts

  45. Project E: Technical approach

  46. Project E: Data sets used

  47. Project E: Results

  48. Project E: Plans for Year 5 • Revise AAFS paper for publication. • Continue collecting data on laboratories’ SOPs. • Finalize laboratory directors’ survey. • Begin data cleaning on National Registry of Exonerations forensic data and then move on to the analysis. • Presentation at annual meeting of the Society for Social Studies of Science, September 2019. • Presentation at University of Houston Law Center symposium “Forensic Science Ten Years after the National Research Council Report,” September 2019.

  49. Project I: Evaluating Lay Perceptions of Forensic Evidence and Forensic Statistics William C. Thompson University of California, Irvine Faculty Researchers: Hal Stern, Nicholas Scurich, Simon Cole Post-Docs: Naomi Kaplan Damary, Niki Osborne Graduate Students: Rebecca Grady, Eric Lai Visiting Scholars: Gianni Ribeiro University of Queensland International Collaborators: Alex Biedermann, Franco Taroni, Joelle Vuille University of Lausanne

  50. Project I: Problem • How best to communicate forensic science to lay individuals (e.g., judges, lawyers, jurors)? • How to assure correct understanding of probative value? • How to minimize misunderstandings, fallacious thinking and cognitive errors?
