
Clinical and Translational Science Awards (CTSA)

CTSA Evaluation Approach

Institute for Clinical & Translational Research (ICTR)

https://ictr.wisc.edu

University of Wisconsin, Madison (UW-Madison)

www.wisc.edu

Marshfield Clinic Research Foundation (MCRF)

http://marshfieldclinic.org/research/pages/index.aspx

August 20, 2008

This document is confidential and is intended solely for the use and information of the client to whom it is addressed.


Institute’s Resources and Organization for Evaluation

  • Evaluation Team organizationally located in the ICTR Administrative Core and ICTR Client Services Center (ICSC)

  • D. Paul Moberg, PhD, Assistant Director, Tracking & Evaluation, ICTR/Madison (18% time)

  • Jan Hogle, PhD, Evaluation Researcher, ICTR/Madison (100% time)

  • Jennifer Bufford, Evaluation Coordinator, ICTR/Marshfield (30% time)

  • To be hired: Evaluation Research Specialist, ICTR/Madison (100% time)

  • This is a reduction from the proposed 3.35 FTE, reflecting NIH budget constraints


Overview of UW-ICTR’s Evaluation Goals

  • Collaborate with national and local stakeholders to

    • conduct self-evaluation of ICTR

    • track trainees and activities

  • Incorporate an approach that is

    • utilization-focused (intended uses by intended users), following a logic model: http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html

    • participatory (ICTR stakeholders; Evaluation Working Group)

    • methodologically flexible (quantitative and qualitative; no experimental design)

  • Apply the evaluation process and findings to

    • priority-setting

    • program accountability

    • continuous quality improvement efforts


Objectives of ICTR Evaluation

In order to achieve the goals, ICTR Evaluation:

  • Develops and implements ICTR’s cross-component evaluation plan and provides support for managing and analyzing central ICTR databases.

  • Provides evaluation consultation services to ICTR’s 25+ components, as well as to collaborating institutions, as time and funding allow.

  • Interfaces with national CTSA evaluation activities; participates in CTSA Consortium-sponsored collaboration.


Approach to CTSA Evaluation matches the CDC’s Framework for Program Evaluation in Public Health (http://www.cdc.gov/eval/framework.htm). MMWR 1999;48(No. RR-11).


ICTR Evaluation Office Year 1 Activities: the proposal said… and we accomplished (1):

  • Partially staffed Evaluation Office (current budget issues)

  • Obtained stakeholder input and developed consensus on roles and responsibilities (Evaluation Working Group)

  • Developed common understandings of each component’s goals & objectives via meetings on Component Tracking Tables

  • Developed definitions for evaluation-related terms and other concepts for ICTR-wide use


ICTR Evaluation Office Year 1 Activities: the proposal said… and we accomplished (2):

  • Developed central ICTR databases & tracking systems collaboratively with IT resources & ICTR components (Member DB; Request for Consult DBs; DBs for Investigators, Pubs, Grants; APR data tracking system; Resource Tracking Systems in individual components); a schema sketch follows this list

  • Interpreted APR requirements; set data collection mechanisms and trouble-shooting systems in place collaboratively with ICTR Administration

  • Began to refine, prioritize, and develop cross-component evaluation plans (“what would a successful institute look like?”)
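
The central databases listed above lend themselves to a shared relational layout. Below is a minimal sketch in Python/SQLite of what such a tracking schema might look like; the table names, columns, and the ctsa_benefit field are illustrative assumptions, not the actual ICTR schema.

    import sqlite3

    # Illustrative schema only; the actual ICTR databases and field names
    # are not described in this presentation.
    conn = sqlite3.connect("ictr_tracking.db")
    conn.executescript("""
    CREATE TABLE IF NOT EXISTS members (
        member_id  INTEGER PRIMARY KEY,
        name       TEXT NOT NULL,
        department TEXT,
        campus     TEXT CHECK (campus IN ('UW-Madison', 'MCRF')),
        joined_on  DATE
    );
    CREATE TABLE IF NOT EXISTS consult_requests (
        request_id   INTEGER PRIMARY KEY,
        member_id    INTEGER REFERENCES members(member_id),
        component    TEXT,   -- which of the 25+ ICTR components was consulted
        requested_on DATE,
        closed_on    DATE
    );
    CREATE TABLE IF NOT EXISTS publications (
        pub_id       INTEGER PRIMARY KEY,
        member_id    INTEGER REFERENCES members(member_id),
        title        TEXT,
        year         INTEGER,
        ctsa_benefit TEXT    -- how the work benefited from CTSA/ICTR resources
    );
    """)
    conn.commit()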


ICTR Evaluation Office Year 1 Activities: additional accomplishments (3):

  • Assisted with creation and evaluation plan design for the ICTR Client Services Center (ICSC)

  • Collaboratively developed guidelines for Case Studies Collection (qualitative descriptive summaries) to tell the story of translational research at UW-Madison and Marshfield (MCRF)

  • Co-led development of Resource Tracking System (RTS) with Biostatistics & Bioinformatics Core (BBI) and other ICTR components


ICTR Evaluation Office Year 1 Activities: additional accomplishments (4):

  • Participated in National CTSA Consortium calls, the Evaluation Steering Committee meeting, the Wiki, and Working Groups

  • Collaborated with ICTR Admin on refinements to Member Database

  • Began planning for Annual Member Survey and Key Informant Interviews (analysis in progress)

  • Collaborating with Marshfield on tracking & evaluation coordination


Summary of Evaluation Metrics (1)

Long term:

  • Improvement in key health indicators [SHOW – Survey of the Health of Wisconsin]

Medium term:

  • “Silo removal” so that multidisciplinary & translational approach becomes the norm for health sciences research

  • Cadre of researchers reflects more closely the gender, racial & ethnic diversity of the US population

Short term:

  • Reduction in time from IRB submission to approval (a computation sketch follows this list)

  • Reduction in number of IRB deferrals and modifications

  • Reduction in number of protocols withdrawn by the IRB for quality issues

  • Increase in satisfaction of users and of IRB staff and committee members
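
As one example of operationalizing these indicators, the first short-term metric (time from IRB submission to approval) could be computed as a median turnaround and compared year over year. A minimal sketch, assuming submission and approval dates can be pulled from the IRB’s records; the sample data here are invented.

    from datetime import date
    from statistics import median

    # Hypothetical (submitted, approved) date pairs; real values would come
    # from the IRB's own tracking system.
    protocols = [
        (date(2008, 1, 7), date(2008, 3, 3)),
        (date(2008, 2, 14), date(2008, 4, 1)),
        (date(2008, 5, 2), date(2008, 6, 9)),
    ]

    def median_turnaround_days(records):
        """Median number of days from IRB submission to approval."""
        return median((approved - submitted).days for submitted, approved in records)

    print(median_turnaround_days(protocols))  # 47 for the sample data above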


Summary of Evaluation Metrics (2)

Short term (cont’d):

  • # and types of Members in the Web Portal Member Database (800+ members)

  • # and descriptors of investigators/mentors/scholars reported via APR whose research has benefited significantly from CTSA resources (n=300+)

  • # publications based on research that benefits from CTSA/ICTR resources, annually (an aggregation sketch follows this list)

  • # and $ grants representing research that benefits significantly from ICTR resources, annually

  • # and $ of pilot grants awarded annually (2 rounds awarded in April & June 2008)

  • % of grants obtained, based on research that benefits from ICTR resources, which are Type 2 translational

  • Feedback from ICTR members on services provided via Annual Member Survey

  • Qualitative assessments: Key Informant Interviews, Case Studies Collection, ICTR Client Services Center

  • Database analysis: Members, Request for Consults, Resource Tracking Systems
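
Most of the counts above (# members, # publications per year, # and $ of grants) reduce to simple aggregations over the tracking databases. A hedged sketch against the illustrative schema shown earlier; how “benefited significantly” is flagged remains an open definition (see the APR challenge on the final slide), so the non-empty ctsa_benefit filter below is only a placeholder criterion.

    import sqlite3

    conn = sqlite3.connect("ictr_tracking.db")  # illustrative DB from the earlier sketch

    # Annual count of publications whose research is flagged as benefiting
    # from CTSA/ICTR resources (placeholder criterion: non-empty ctsa_benefit).
    rows = conn.execute("""
        SELECT year, COUNT(*) AS n_pubs
        FROM publications
        WHERE ctsa_benefit IS NOT NULL AND ctsa_benefit <> ''
        GROUP BY year
        ORDER BY year
    """).fetchall()

    for year, n_pubs in rows:
        print(f"{year}: {n_pubs} publications")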


ICTR Evaluation: Year 2 Proposed Work Plan (1)

  • Evaluation Working Group: developing cross-component metrics

  • Operationalize measures & develop strategies for evaluating ICTR goals and specific aims

  • Implement Annual Member Survey preceded by key informant interviews

  • Begin to assemble Case Studies Collection

  • Collect and report on user feedback from ICTR “front door” and Web Portal

  • Continue to refine Resource Tracking System(s)


ICTR Evaluation: Year 2 Proposed Work Plan (2)

  • Assist with analysis of ICTR databases (Member, Consult, Grants, Pubs)

  • Continue to assist components with internal evaluation tasks

  • Participate in evaluation of ICTR Client Services Center (ICSC)

  • Participate in CTSA Consortium Working Groups & Steering Committee

  • Continue to support Annual Progress Reporting with Wiki-based data collection system


Institution Evaluation Challenges and/or Questions

  • Operationalizing & prioritizing measures & indicators

  • Evaluation Office staffing and funding for evaluative studies; prioritization needed

  • Size and complexity of ICTR

  • Lack of consensus on database development (purpose, process, organization, and use); database development forces structural development; multiple and varied needs of 25+ components

  • Defining and tracking how “research” has “benefited significantly” from CTSA “resources” for the APR

  • Adapting evaluation plans to fit emerging realities.

