Evaluation of Phase I Demonstrations of the Pharmacy Quality Alliance

AHRQ Annual Conference September 16, 2009 Laura Pizzi, PharmD, MPH Associate Professor and Director, Doris N. Grandon Center for Health Economics and Outcomes Research Jefferson School of Population Health Philadelphia, Pennsylvania USA Telephone: (215) 955-1159



Presentation Transcript


  1. AHRQ Annual Conference September 16, 2009 Laura Pizzi, PharmD, MPH Associate Professor and Director, Doris N. Grandon Center for Health Economics and Outcomes Research Jefferson School of Population Health Philadelphia, Pennsylvania USA Telephone: (215) 955-1159 Email: laura.pizzi@jefferson.edu Note: The timelines presented herein are dependent on OMB approval. Evaluation of Phase I Demonstrations of the Pharmacy Quality Alliance

  2. Introduction • Evaluation of PQA Phase I Demonstrations: • Provides independent external evaluation • Facilitates within- and across-site comparisons • Core evaluation team members: • Joyce McMahon, PhD - Project Director (CNA) • Elizabeth Schaefer, MA - Associate Project Director (CNA) • Laura Pizzi, PharmD, MPH - Associate Project Director (Jefferson) • Erin Thomson, MPH - Research Analyst (Jefferson) • AHRQ Project Officer: Linda Greenberg, PhD

  3. Key Evaluation Questions • What is the most efficacious way to collect and aggregate data on the 15 pharmacy performance measures and from the consumer questionnaire about pharmacy services? • What challenges, issues, and technical problems were encountered in creating and populating the template reports? How were they resolved? • How could the demonstration sites have strengthened their measurement efforts? • How well were the reports understood by users (i.e., staff pharmacists, pharmacy managers)? • How can the PQA report template and the reporting process be improved (e.g., user-friendliness, comprehension, ability to act)? • What are the operational costs and non-financial burdens encountered in collecting the data, generating reports, and using the performance data?

  4. Early Focus • Establish lines of communication (AHRQ, PQA, demonstration sites) • Monthly teleconferences between AHRQ, CNA, Jefferson, and PQA • Monthly teleconferences with project leaders at each of the 5 sites to monitor progress • Primary data requirements • Instruments: Interview Guide and Pharmacist Survey • Both require OMB approval

  5. Evaluation Components • Verification of findings via secondary data analysis (data already collected during the demonstration itself): • Performance reports (claims data provided by the sites) • Pharmacy consumer survey (survey data provided by Avatar) • New information gathered via primary data: • Monthly teleconferences • Qualitative on-site interviews of demonstration project staff • Quantitative paper survey of field pharmacists

  6. Interviews of Demonstration Staff • With the help of site Project Leaders, CNA team has identified six (6) interviewees per site, representing specific job functions: • Demonstration project leadership • Pharmacy operations / management • Analytics management responsible for oversight of performance report analyses • Analytics staff assigned to complete the performance reports • Information technology (IT) staff responsible for developing and/or coordinating Inter- or Intranet components of the project • Senior management (executive leadership)

  7. Interviews of Demonstration Staff, continued • Conducted in person by a two-person interview team from Jefferson • Duration of each interview will be approximately 1 hour • Total duration of each site visit will be 2 business days • Approximate timeline: • Mid-February to mid-May 2009: CNA team worked with Project Leaders to identify interviewees • September 2009: Site visits scheduled • October 2009: Site visits conducted

  8. Survey of Field Pharmacists • Pharmacist sample currently being obtained with the help of site Project Leaders • Sampling parameters • Inclusion criterion: pharmacists are required to have participated in the PQA Phase I Demonstration project as recipients of the performance report • Sample size: 100 field pharmacists per site, except for sites where fewer than 100 pharmacists participated • Survey will require approximately 30 minutes to complete
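The per-site sampling rule above (draw 100 field pharmacists per site, or all eligible pharmacists where fewer than 100 participated) can be sketched as follows. This is a minimal illustration, not the evaluation team's actual procedure; the function name, the site/pharmacist identifiers, and the use of a seeded random draw are all assumptions for the example.

```python
import random

def sample_pharmacists(participants_by_site, target=100, seed=42):
    """Illustrative sampling sketch (hypothetical helper, not the
    evaluation team's code): draw up to `target` pharmacists per site.

    `participants_by_site` maps a site name to the list of pharmacist IDs
    who meet the inclusion criterion (participated in the PQA Phase I
    Demonstration as recipients of the performance report). Sites with
    `target` or fewer eligible pharmacists contribute everyone.
    """
    rng = random.Random(seed)  # seeded so the draw is reproducible
    sample = {}
    for site, ids in participants_by_site.items():
        if len(ids) <= target:
            sample[site] = list(ids)          # census of a small site
        else:
            sample[site] = rng.sample(ids, target)  # simple random sample
    return sample
```

For example, a site with 250 eligible pharmacists would contribute a random 100, while a site with 60 would contribute all 60.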

  9. Survey of Field Pharmacists: Timeline • Mid-July to October 2009: CNA team works with demonstration sites to identify the sample. Also sends sampled pharmacists a formal letter* to: • Inform them that they have been selected as participants for the AHRQ-funded PQA evaluation • Explain the purpose of the survey and the anticipated time required for completion • Assure them of confidentiality and encourage participation • Mid-October: CNA team sends the survey by mail to sampled pharmacists. The mailing will include: • A cover letter* to remind pharmacists of the purpose of the survey and anticipated time requirements, and assure confidentiality • The survey instrument, for completion • A postage-paid envelope for returning the completed survey • Late October: CNA team sends a reminder letter* prompting pharmacists to complete and return the survey *Letters will be sent on AHRQ letterhead and signed by the Project Officer, with a reinforcement message sent by email to the pharmacists by their Demonstration Project Leader(s)

  10. Evaluation Constructs: Primary Data

  11. Evaluation Constructs: Primary Data (continued)

  12. Evaluation Constructs: Primary Data (continued)

  13. Evaluation Constructs: Primary Data (continued)

  14. Pilot Test of Evaluation Tools • Goal of pilot testing was to obtain input for refining the interview guide and pharmacist survey in terms of both content and process • Pilot test was completed February 2009 (prior to OMB submission) • Conducted with pharmacy staff from the Jefferson Health System (JHS) • JHS delivers both inpatient and outpatient pharmacy services through approximately 80 full-time pharmacists, with care delivered at two hospitals (totaling approximately 800 beds) on the main campus in Center City, Philadelphia • JHS operates 3 outpatient pharmacies, which are open to the public and are operationally similar to typical retail pharmacies

  15. Pilot Test of Evaluation Tools, continued • Participants from the pilot site were provided with the training materials developed by one of the demonstration sites, plus a mock performance report based on an actual de-identified pilot site report • There were 2 pilot test cohorts: • Cohort 1 (5 individuals) participated in testing the demonstration staff interview guide: • Analytics/IT management and staff • Pharmacy department management • Clinical pharmacy staff • Quality improvement • Senior management (executive leadership) • Cohort 2 (5 practicing pharmacists) participated in testing the pharmacist survey

  16. Tasks Following Data Collection • Analyze all information • Primary data collected: interviews and pharmacist survey • Secondary data acquired: consumer survey and claims data • Information collected via teleconferences with PQA and sites • Prepare case studies • Concentrate on both within-case and cross-case studies • Cross-case studies will focus on generalizability of findings • Share findings with AHRQ, PQA, and demonstration sites • Publication • Primary manuscript focused on evaluation findings (pooled data) • Submitted to major health policy or quality journal