
Tips & Tricks for Developing or Improving Your Auditing Function



  1. Tips & Tricks for Developing or Improving Your Auditing Function Leslie M. Howes, MPH, CIP Director, Office of Human Research Administration Harvard School of Public Health Harvard Medical School & Dental School

  2. Today’s Outline • Audit Basics • Key Elements • Audit Process • Preparation • Onsite Engagement • Report & Resolution • Follow-Up • Metrics & Trends • Questions

  3. Auditing Basics One working definition… A systematic and independent examination of study-related activities and regulatory documentation to evaluate protocol adherence and compliance with regulation, institutional policy, ethical guidance, and applicable guidelines Audit types • Not for cause (“Routine”) • For cause (“Directed”)

  4. Creating a Strong Audit Function: Key Elements

  5. Key Elements • Understand your “client” • Define Scope and Authority • Obtain Institutional Support • Obtain Research Community Buy-in • Establish Goals/Focus • Develop SOPs and Templates

  6. Who’s Your Client • Institutional culture • Principal Investigators’ involvement in their studies • Organizational/Departmental structure/hierarchy • Who’s likely to take advantage • Research portfolio • FDA vs non-FDA regulated • Biomedical vs Socio-behavioral • Full Board vs Expedited review • Study populations • Domestic vs International

  7. Scope & Authority • Auditing (for cause/not for cause) • Investigators only • IRB only • Both Investigators and IRB • Auditing + education • Auditing + education + human research support • IRB submission assistance • External audit prep • Study consultation • On-call research coordinator, etc.

  8. Scope & Authority, cont. • Reportable Information, Noncompliance, Suspensions & Terminations • Identify • Refer/Report to IRB (with recommendation) • Make official determination • Reprimand Investigators • Conduct further investigation • Assist PI in reporting Tip: If the IRB is responsible for making official determinations, develop an escalation plan; be transparent with your research community

  9. Institutional Support • Identify Key Constituents • Institutional Official, VP of Research, Department Chairs/Chiefs, IRB Director/Chairs • Convince the leadership with data • Specific “business plan” for your audit function • If available, your own audit findings • Relevant FDA Warning Letters, OHRP Determination Letters • Borrow “data” from other/neighboring institutions • Less is more • Don’t ask for “too much” in terms of resources (in the beginning)

  10. Research Community Buy-In • Provide Incentives • Human research training credit • IRB Submission Assistance support • Not for cause audit “free pass” for defined time period • Pilot different approaches • Voluntary • 360° Review • Seek feedback from current “clients” and share with the research community • Share relevant “common findings” with investigators

  11. Establish Goals • S.M.A.R.T. • Specific • Measurable • Attainable • Realistic • Time-related • Flexible/Creative • Anything is better than nothing • Regular adjustment • Maximizing efficiency and effectiveness Consider the following goals: which is more realistic? A. Offer 3 departments the opportunity for an onsite review vs B. Conduct 30 onsite reviews

  12. SOPs and Templates • Develop Standard Operating Procedures • Easy to create and update • User-friendly • Develop templates • Audit report • Standard communications, e.g., letters/emails • Compile Common Observations

  13. An Overview of the Audit Process

  14. Preparation Phase

  15. Preparation • Identify Protocol (not for cause) • Notify Investigator et al., as appropriate • When to communicate, if at all (unannounced visit) • What to communicate (What, Who, How) • How to communicate, e.g., email, phone, website • Schedule Date/Time • All at once vs separate components, e.g., interview, onsite review • How much time – for investigators and support staff • Secure private space

  16. Preparation, cont. • Review IRB File, etc. • Protocol • Funding source • Study tools, e.g., survey • Consent documents (all versions) • External/Ancillary reviews, if any • Modifications to date • Reportable Information to date, e.g., AEs, Safety Reports, etc. • IRB determinations and concerns, if any • Directed/For-cause/Area of focus, e.g., participant payment • Prepare Tools, e.g., notes vs audit worksheet/checklist

  17. Onsite Engagement Phase

  18. Onsite Engagement Not For Cause Audit Tip: Ease into the review; develop rapport with the PI/study team. This will set the stage for an educational, not adversarial approach. • Introduction • Set the tone • QA/I Program (Scope, Authority) • Overall process, purpose • Type of audit, e.g., for-cause vs not-for-cause • Questions • Discussion vs Interview • PI to share challenges with study implementation, IRB

  19. Onsite Engagement, cont. • Orientation to organization system • Ensure access to all materials • Review of Regulatory Documentation • Document observations • Make photocopies, when necessary • Review of Participant files • File selection – how to determine sample size? • Be flexible – you may need to expand the sample

  20. How to Determine Sample Size • Consider the goal of the QA/I Program • 100% Compliance vs “Sampling” of Noncompliance • May depend on… • Specific institutional concerns • Audit trigger, e.g., Routine vs For-Cause audit; General vs Directed/Focused audit

  21. Sample Size, cont. • 100% Compliance – no sample plan needed • “Sampling” of Noncompliance – must identify a representative sample • Random – selection process occurs at random, with equal likelihood that one protocol would be identified vs another; lacks any pattern • E.g., http://www.random.org/sequences/ • Online tool that generates a sequence of numbers that can be matched to a protocol number (see the sketch below)
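
A minimal sketch of this random-selection step in Python, with the standard library's random.sample standing in for the random.org sequence generator; the participant IDs and the sample-size rule (10% of total, floor of 5) are illustrative assumptions, not part of the original slides.

    import random

    # Hypothetical participant file IDs enrolled on the audited protocol
    # (in practice, pulled from the enrollment log or eIRB system).
    participant_files = [f"SUBJ-{n:03d}" for n in range(1, 61)]

    # Assumed sample size: the "10% of total sample" industry standard noted
    # on the next slide, with an arbitrary floor of 5 files.
    sample_size = max(5, round(0.10 * len(participant_files)))

    # random.sample draws without replacement, so each file has an equal
    # likelihood of selection and the choice lacks any pattern.
    selected = random.sample(participant_files, sample_size)
    print(sorted(selected))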

  22. Sample Size, cont. • Non-Random – You decide! • Informed by… • Institutional Environment/Policy • Prior Noncompliance/Trends • Other Industry Standards, e.g., 10% of total sample • Key stakeholders, e.g., IO, IRB, or investigators Tip: Whatever you choose, be transparent with the research community about your selection process

  23. Onsite Engagement, cont. • Exit Interview • What is the PI/study team doing well • What does the PI/study team need to work on • Summary of noncompliance • Explore “root cause”

  24. Report & Resolution Phase

  25. Report & Resolution • Compile Report • Provide in a timely manner, e.g., 5 business days • Recommended elements • Introduction (who, what, when) • Observations • Provide examples, frequencies • Corrective Actions • Best Practice Recommendations • Regulatory citations • Summary, e.g., “…regulatory document well organized…/…could use improvement…” TIP: Be concise, keep it factual, make it easy for investigators to understand what you found and how to correct it, use tables & bullets (avoid narrative)

  26. Report & Resolution • Create Report Template • Create common observations • Improves efficiency and consistency • Facilitates tracking and trending • Consider including • General Observations • Best Practice Recommendations • Corrective Actions • Regulatory citations TIP: Consider your institutional research portfolio, e.g., SBER, IND/IDE, etc., and seek input from stakeholders, e.g., IRB

  27. Report Distribution: A Case Study

  28. Report Distribution Example • Not for Cause/Routine, no potential serious or continuing noncompliance observed: PI; key study staff • For Cause/Directed, potential serious or continuing noncompliance observed: PI; key study staff; IRB; Institutional Official; Others, e.g., VP of Research, department chairs, Associate Dean of Students TIP: Share aggregate findings from not for cause audits with key stakeholders
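
As a rough illustration of this routing logic, a short Python sketch; the function name and recipient strings are hypothetical, and an actual distribution list would follow your institution's escalation plan.

    def report_recipients(potential_serious_or_continuing: bool) -> list[str]:
        """Illustrative report distribution list based on the audit finding."""
        recipients = ["PI", "key study staff"]
        if potential_serious_or_continuing:
            # Escalate along the for-cause/directed path.
            recipients += [
                "IRB",
                "Institutional Official",
                "Others, e.g., VP of Research, department chairs, Associate Dean of Students",
            ]
        return recipients

    # Routine audit, no potential serious or continuing noncompliance observed:
    print(report_recipients(False))
    # Potential serious or continuing noncompliance observed:
    print(report_recipients(True))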

  29. Follow-Up Phase

  30. Follow-Up Options • Obtain confirmation that corrective actions have been implemented • QA/I staff conduct 2nd, 3rd, etc., audit • PI attestation, e.g., formal letter, eIRB system verification • Additional audits of investigator/department • QA/I staff provide support to PI to implement corrective actions/best practice recommendations • Education • General education offerings • Focused in-service for PI and study team • PI evaluation of audit TIP: Consider your resources; conduct a risk-benefit analysis

  31. Metrics & Trends

  32. Audit Metrics & Trends • Collect data • Not-for-cause/for-cause audits • Common observations • Post-audit evaluation data • Basic demographics/characteristics • Analyze data • Track frequencies • Identify trends, e.g., isolated, pattern, systemic • Inform general education offerings • Share findings with key stakeholders
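
A minimal sketch of the tracking-and-trending step in Python; the observation codes and the half-of-audits threshold are assumptions for illustration, with real data coming from your audit report template or database.

    from collections import Counter

    # Hypothetical common-observation codes recorded across four audits.
    audit_findings = [
        ["consent-version", "delegation-log"],   # audit 1
        ["consent-version"],                     # audit 2
        ["ae-reporting", "consent-version"],     # audit 3
        ["delegation-log"],                      # audit 4
    ]

    # Track frequencies of each observation across all audits.
    counts = Counter(obs for audit in audit_findings for obs in audit)

    # Assumed rule of thumb: an observation in half or more of the audits
    # suggests a pattern/systemic issue rather than an isolated finding.
    threshold = len(audit_findings) / 2
    for obs, n in counts.most_common():
        label = "pattern/systemic" if n >= threshold else "isolated"
        print(f"{obs}: {n} audits ({label})")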

  33. Questions?
