
Tracking, Assessment and Evaluation Program


Presentation Transcript


  1. Tracking, Assessment and Evaluation Program
  Kathryn Nearing, PhD
  Jeffrey Proctor, MBA
  Lead External Program Evaluators
  The Evaluation Center, School of Education and Human Development

  2. Governance
  Tracking, Assessment & Evaluation
  Nearing and Proctor

  3. Specific Aims relevant to TEC
  Research Environment:
  • Catalyze quality and process improvement and ascertain key program impacts through responsive tracking, assessment and evaluation. (SA4)
  Resources and Services:
  • Develop and implement a Program Income System and an evaluative process for cost-effective allocation of resources that supports the planning, conduct, analysis, and dissemination of research results. (SA1)
  • Optimize the infrastructure for implementing and tracking research and support services. (SA2)

  4. ETCD: Four over-arching questions aligned with component themes
  • What is the evidence that the ETCD pillar effectively recruits and supports the retention of a diverse workforce? (recruitment and retention, inspire academic persistence)
  • How effectively does the ETCD pillar address the workforce development needs of diverse investigative communities, locally and nationally? (build future workforce, forge collaborations)
  • Are trainees and scholars adequately prepared for productive careers in clinical and translational research? (expand the ETCD portfolio, enhance efficiency)
  • What is the evidence that we are promoting a culture of effective mentorship? (mentorship)

  5. Leveraging Investment of First 5 Years (a few examples)
  • Needs assessments and workflow analyses informed the priorities that will serve as the focus of QPIP initiatives
  • Existing processes and instruments will be reused (e.g., pilot grant tracking expanded to include CSU pilot projects)
  • Defined indicators, metrics, and data points, along with feasibility studies regarding their collection, will inform utilization of robust, centralized data systems/tools
  • Refinement of existing, and development of new, ETCD programs based on evaluation

  6. EAC Critiques
  • Avoid duplication of effort with QPIP
  • Enhance evaluation of ETCD programs (invest more resources in this area)
  • Actively contribute to the establishment of national metrics
  • Move toward a more outcome/impact-oriented evaluation; include an assessment of CCTSI’s impact on health status indicators (utilizing “big data”)

  7. Approach: Research Environment
  • Leverage evaluation findings from the first grant cycle to identify priorities
  • Key priority: increase the efficiency of study start-up (scientific review, regulatory approval, and enrollment of study subjects)
  • Leverage the enhanced data infrastructure to assess the impact of QPIP initiatives (a sketch of one such start-up metric follows this slide)
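As a rough illustration of that last point: the minimal Python sketch below computes a study start-up cycle time (scientific review to first enrollment) from tracking records. The field names, study IDs, and dates are invented for this example and are not the CCTSI data model.

from datetime import date
from statistics import median

# Hypothetical study records; field names are illustrative, not the real tracking schema.
studies = [
    {"id": "S001", "scientific_review": date(2013, 1, 10),
     "regulatory_approval": date(2013, 3, 2), "first_enrollment": date(2013, 4, 15)},
    {"id": "S002", "scientific_review": date(2013, 2, 5),
     "regulatory_approval": date(2013, 2, 28), "first_enrollment": date(2013, 5, 1)},
]

def startup_days(study):
    """Days elapsed from scientific review to enrollment of the first subject."""
    return (study["first_enrollment"] - study["scientific_review"]).days

cycle_times = [startup_days(s) for s in studies]
print("Median start-up time:", median(cycle_times), "days")

Tracking this metric before and after a QPIP initiative is one simple way the enhanced data infrastructure could show whether start-up efficiency actually improved.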

  8. Approach: Research Environment
  Figure: TEC role in catalyzing quality and process improvement, and assessing the results, to inform data-driven decision making. The figure shows a cycle with the steps: report findings to EC & QPIP; EC & QPIP discuss implications; implement QPIP initiatives; collect and analyze data; reevaluate to determine the impact of changes. The intended outcome is improved quality, cost effectiveness, efficiency, innovation, and safety.

  9. Approach: Resources and Services
  • Leverage the enhanced data infrastructure, as well as a formal needs assessment process, to collect relevant metric-level data systematically
  • Utilize a balanced scorecard approach to assess value and inform strategic allocation of limited resources (a numeric sketch of such a roll-up follows the figure on the next slide)

  10. Approach: Resources and Services Figure: Balanced Scorecard Approach
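To make the balanced scorecard idea concrete, the minimal Python sketch below rolls metric scores for a single resource/service up into a weighted composite. The perspectives, scores, and weights are illustrative assumptions, not the scorecard the CCTSI actually uses.

# Hypothetical metric scores (0-100) for one resource/service, by scorecard perspective.
scores = {
    "stakeholder_value": 82,
    "internal_process": 74,
    "financial_stewardship": 68,
    "learning_and_growth": 90,
}

# Illustrative weights; a real scorecard would set these through governance and strategy.
weights = {
    "stakeholder_value": 0.35,
    "internal_process": 0.25,
    "financial_stewardship": 0.25,
    "learning_and_growth": 0.15,
}

composite = sum(scores[k] * weights[k] for k in scores)
print(f"Composite value score: {composite:.1f} / 100")

Comparing composite scores across resources and services, alongside the underlying perspective-level detail, is what would inform the strategic allocation decisions described above.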

  11. Approach: ETCD
  • Strategic planning
    • Comprehensive examination of data, instruments, and key findings
    • SOW developed collaboratively with the steering committee based on established priorities
  • Implementation of the Scope of Work
    • Revised all instrumentation to support implementation
    • Progress is reviewed monthly at steering committee meetings

  12. ETCD: Four over-arching questions aligned with component themes
  • What is the evidence that the ETCD pillar effectively recruits and supports the retention of a diverse workforce? (recruitment and retention, inspire academic persistence)
  • How effectively does the ETCD pillar address the workforce development needs of diverse investigative communities, locally and nationally? (build future workforce, forge collaborations)
  • Are trainees and scholars adequately prepared for productive careers in clinical and translational research? (expand the ETCD portfolio, enhance efficiency)
  • What is the evidence that we are promoting a culture of effective mentorship? (mentorship)

  13. National CTSA Consortium Involvement
  • Evaluation Key Function Committee
  • Mentored to Independent Investigator Workgroup
    • Provided data related to the national survey effort
    • Collaborating to interview former KL2 scholars to address key evaluation questions
  • Team Science Affinity Workgroup
  • Presented two papers as part of CTSA panels at the AEA (workflow analysis, team science)
  • Pipeline Framework manuscript in development

  14. Assessing CCTSI Impact (One Strategy)
  • Leverage longitudinal pilot grant tracking (e.g., with community engagement awardees)
  • Identify projects that meet specific criteria:
    • Adopted/adapted an evidence-informed strategy
    • Implementation and dissemination occurred in a discrete (well-defined) geographic area/population
    • Targeted very specific (granular) health indicators
  • Examine trends in those indicators in relation to implementation, comparing historically and with the state (a sketch of this comparison follows this slide)
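One way to operationalize that trend comparison, sketched in minimal Python: contrast the pre/post change in a granular health indicator for the project's target area with the statewide change over the same period. The years, rates, and implementation year below are invented for illustration and are not CCTSI or Colorado data.

from statistics import mean

# Hypothetical annual rates (per 100,000) for the target area and the state.
years = [2008, 2009, 2010, 2011, 2012, 2013]
target_area = [41.0, 40.5, 40.8, 38.2, 36.9, 35.5]
statewide = [39.0, 38.8, 38.7, 38.5, 38.3, 38.1]
implementation_year = 2011  # assumed start of the pilot project's intervention

def pre_post_change(series):
    """Mean rate after implementation minus mean rate before it."""
    pre = [v for y, v in zip(years, series) if y < implementation_year]
    post = [v for y, v in zip(years, series) if y >= implementation_year]
    return mean(post) - mean(pre)

# Difference-in-differences style contrast: target-area change net of the statewide trend.
net_change = pre_post_change(target_area) - pre_post_change(statewide)
print("Target-area change:", round(pre_post_change(target_area), 2))
print("Statewide change:", round(pre_post_change(statewide), 2))
print("Net change vs. state:", round(net_change, 2))

A real analysis would of course need more years of data and attention to confounding, but this is the shape of the historical and state comparison the slide describes.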

  15. Questions for EAC
  • Are responses to critiques deemed adequate?
  • Do you have additional thoughts about how to assess the impact of the CCTSI on community-level health status indicators?
  • Would you be willing to stay connected as we move such efforts forward (in between EAC meetings)?
