
Technology Assisted Review: Trick or Treat? Ralph Losey , Esq., Jackson Lewis



Presentation Transcript


  1. Technology Assisted Review: Trick or Treat? Ralph Losey, Esq., Jackson Lewis

  2. Ralph Losey, Esq. • Partner, National e-Discovery Counsel, Jackson Lewis • Adjunct Professor of Law, University of Florida • Active member, The Sedona Conference • Author of numerous books and law review articles on e-discovery • Founder, Electronic Discovery Best Practices (EDBP.com) • Lawyer, writer, predictive coding search designer, and trainer behind the e-Discovery Team blog (e-discoveryteam.com) • Co-founder, with son Adam Losey, of IT-Lex.org, a non-profit educational organization for law students and young lawyers

  3. Discussion Overview • What is Technology Assisted Review (TAR), aka Computer Assisted Review (CAR)? • Document Evaluation • Putting TAR into Practice • Conclusion

  4. What is Technology Assisted Review?

  5. Why Discuss Alternative Document Review Solutions? • Document review is routinely the most expensive part of the discovery process; saving time and reducing costs will result in satisfied clients • The evolution: Traditional/Linear Paper-Based Document Review → Online Review → Technology Assisted Review

  6. Bobbing for Apples: Defining an effective search • Information retrieval effectiveness can be evaluated with metrics • Precision: fraction of relevant documents within retrieved results, a measure of exactness • Recall: fraction of retrieved relevant documents within the total relevant documents, a measure of completeness • F-Measure: harmonic mean of precision and recall • [Diagram: all documents, divided into "hot" (relevant) and "not" (irrelevant)]

  7. Bobbing for Apples: Defining an effective search • Scenario 1: Perfect recall, low precision • [Diagram: the retrieved set captures every relevant document but also sweeps in many irrelevant ones]

  8. Bobbing for Apples: Defining an effective search • Scenario 2: Low recall, perfect precision • [Diagram: every retrieved document is relevant, but most relevant documents are missed]

  9. Bobbing for Apples: Defining an effective search • Scenario 3: Arguably good recall and precision • [Diagram: the retrieved set captures most relevant documents with few irrelevant ones]
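
To make the three metrics on the preceding slides concrete, here is a minimal sketch in Python; the document IDs and counts are invented for illustration and do not come from the presentation:

```python
# Precision, recall, and F-measure for one retrieval run.
# Hypothetical document IDs, for illustration only.

relevant = {"doc01", "doc02", "doc03", "doc04"}            # the truly "hot" documents
retrieved = {"doc02", "doc03", "doc04", "doc07", "doc09"}  # what the search returned

true_positives = len(relevant & retrieved)                 # 3 documents

precision = true_positives / len(retrieved)                # exactness: 3/5 = 0.60
recall = true_positives / len(relevant)                    # completeness: 3/4 = 0.75
f_measure = 2 * precision * recall / (precision + recall)  # harmonic mean ≈ 0.67

print(f"precision={precision:.2f} recall={recall:.2f} F={f_measure:.2f}")
```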

  10. Key Word Search • Key word searches are used throughout discovery • However, they are not particularly effective • Blair and Maron study: lawyers believed their keyword searches had retrieved 75% of the relevant documents, when only about 20% had actually been retrieved • It is very difficult to craft a key word search that isn't under-inclusive or over-inclusive • Key word search should be viewed as one component of a hybrid multimodal search strategy • Go fish!
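
A toy sketch of the under-inclusiveness problem the slide describes (the documents and the search term are invented): a responsive document that uses a synonym never matches the keyword.

```python
# Hypothetical illustration of an under-inclusive keyword search:
# two documents discuss the same conduct, but only one uses the search term.
documents = {
    "doc_a": "We should shred the files before the audit.",
    "doc_b": "Destroy all records relating to the Q3 numbers.",
    "doc_c": "Lunch on Friday?",
}

hits = {doc_id for doc_id, text in documents.items() if "shred" in text.lower()}
print(hits)  # {'doc_a'} -- doc_b is responsive but says "destroy", so it is missed
```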

  11. Where are we?

  12. What Is Technology Assisted Review (TAR)?

  13. Classification Effectiveness • Any binary classification can be summarized in a 2x2 table • Test on a sample of n documents for which we know the answer:

                    Relevant    Not Relevant
    Retrieved          A             B
    Not Retrieved      D             E

  • A + B + D + E = n

  14. Classification Effectiveness • Recall = A / (A + D) • Proportion of interesting stuff that the classifier actually found • High recall of interest to both producing and receiving party

  15. Classification Effectiveness • Precision = A / (A + B) • High precision of particular interest to producing party: cost reduction!
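
A minimal sketch of the recall and precision formulas from these slides, using the cell labels from the 2x2 table above; the counts are hypothetical:

```python
# Cells of the 2x2 table (hypothetical counts from a labeled sample):
A = 80    # retrieved and relevant (true positives)
B = 20    # retrieved but not relevant (false positives)
D = 40    # relevant but not retrieved (false negatives)
E = 860   # neither retrieved nor relevant (true negatives)

n = A + B + D + E        # sample size: 1000
recall = A / (A + D)     # 80/120 ≈ 0.67: share of relevant documents found
precision = A / (A + B)  # 80/100 = 0.80: share of retrieved documents that are relevant

print(f"n={n} recall={recall:.2f} precision={precision:.2f}")
```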

  16. Sampling and Quality Control • How precise were you in culling documents out from your bag of 10,000? • Want to know effectiveness without manually reviewing everything, so: • Randomly sample the documents • Manually classify the sample • Estimate effectiveness on the full set based on the sample • Sampling is well-understood • Common in expert testimony in a range of disciplines • Example: sample size = 370 (Confidence Interval: 5; Confidence Level: 95%); if 300 of the 370 sampled documents were correctly culled, estimated precision is 81%
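
A sketch of where the slide's sample size of 370 comes from: the standard formula for estimating a proportion at a 95% confidence level with a ±5% confidence interval, plus a finite-population correction for a collection of 10,000 documents. Only the numbers (370, 300, 81%) appear on the slide; the code itself is an assumption about how they were derived.

```python
import math

def sample_size(population: int, interval: float = 0.05, z: float = 1.96) -> int:
    """Sample size for estimating a proportion (worst case p = 0.5)
    at 95% confidence (z = 1.96), with finite-population correction."""
    n0 = (z ** 2) * 0.25 / interval ** 2               # ~384 for an infinite population
    return math.ceil(n0 / (1 + (n0 - 1) / population))

n = sample_size(10_000)       # 370, matching the slide
precision = 300 / n           # 300 of the 370 sampled documents correctly culled
print(n, f"{precision:.0%}")  # 370 81%
```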

  17. TREC 2011 • Annual event examining document review methods • "[T]he results show that the technology-assisted review efforts of several participants achieve recall scores that are about as high as might reasonably be measured using current evaluation methodologies. These efforts require human review of only a fraction of the entire collection, with the consequence that they are far more cost-effective than manual review." (Overview of the TREC 2011 Legal Track)

  18. Putting TAR into Practice

  19. TAR or CAR? A Multimodal Process Must… have… humans!

  20. The Judiciary's Stance • Da Silva Moore v. Publicis Groupe • Court okayed parties' agreement to use TAR; parties disputed implementation protocol (3.3 million documents) • Kleen Products v. Packaging Corp. of Am. • Plaintiffs abandoned arguments in favor of TAR and moved forward with Boolean search • Global Aerospace Inc. v. Landow Aviation, L.P. • Court blessed defendant's use of TAR over plaintiff's objections (2 million documents) • In re Actos (Pioglitazone) Products Liability Litigation • Court affirmatively approved the use of TAR for review and production • EORHB, Inc., et al v. HOA Holdings, LLC • Court ordered the parties to use TAR and share a common e-discovery provider

  21. TAR/CAR: Tricks & Treats • Must address risks associated with seed set disclosure • Must have nuanced expert judgment of experienced attorneys • Must have validation and QC steps to ensure accuracy • TAR can reduce time spent on review and administration • TAR can reduce the number of documents reviewed, depending on the solution and strategy • TAR can increase accuracy and consistency of category decisions (vs. unaided human review) • TAR can identify the most important documents more quickly

  22. TAR Accuracy • [Quotation from U.S. Magistrate Judge Andrew Peck in Da Silva Moore]

  23. Conclusion

  24. Parting Thoughts • Automated review technology helps lawyers focus on resolution, not discovery, through available metrics • TAR complements human review, but will not replace the need for skillful human analysis and advocacy • Search adequacy is defined in terms of reasonableness, not whether all relevant documents were found • TAR can be a treat, but only when implemented correctly • Reconsider, but do not abandon, the role of: • Concept search • Keyword search • Attorney review

  25. Q & A
