
A typology of evaluation methods: results from the pre-workshop survey



  1. A typology of evaluation methods: results from the pre-workshop survey Rebecca Taylor1, Alison Short1,2, Paul Dugdale1, Peter Nugus1,2, David Greenfield2 1 Centre for Health Stewardship, ANU 2 Centre for Clinical Governance, AIHI, UNSW

  2. Outline of presentation • Background • Aim • Method: how we developed and distributed the survey • Findings • So what does this mean? • Future directions

  3. Background • At last year’s workshop, participants reported a lack of clarity about how to evaluate CDSM tools, including what factors should be considered in evaluations • To foster the use of world-class evaluations, we first need to know: • What evaluations are currently being completed? • Are there gaps in the types of evaluations currently completed? • How can evaluation projects be strengthened?

  4. Aim To investigate the types of evaluations conducted around Australia, and how these evaluations are reported to other clinicians and stakeholders

  5. Method • Development of a survey to investigate the types of evaluations conducted around Australia, and how these evaluations are reported to other clinicians and stakeholders • Pilot testing • Distribution of survey to attendees of the ‘Evaluating Chronic Disease Self-Management Tools’ workshop • Descriptive and thematic analysis of quantitative and qualitative survey data

  6. Findings: Demographics Respondent’s profession (n=20) Setting in which respondent works (n=20) – Participants were able to select more than one response *n=number of respondents

  7. Findings: Tools evaluated • Home Telemonitoring • Flinders Program • Health Coaching • CENTREd Model for Self-Management Support • cdmNET • My Health My Life Program • Stanford Program • Living Improvement for Everyone (LIFE) – adapted Stanford Program for ATSI people • Intel equipment for care innovations • COACH Program • AQOL • QLD ONI • The Continuous Care Pilot

  8. Findings: Reasons for evaluation *n=number of respondents

  9. Findings: Collaborators *n=number of respondents

  10. Findings: Data used *n=number of respondents

  11. Findings: Outcome of evaluation *n=number of respondents

  12. Findings: Dissemination of findings Outputs per person Average = 3.5 Range = 0–31 *n=number of respondents

  13. Findings: Perspectives of evaluation “Evaluation is an essential component of any tool introduced to a service. However, not only from the patient perspective, but how it influences (or not), health professional practice”. (Participant 5)

  14. Findings: Perspectives of evaluation • Seeing evaluation in context, as part of a process • Who participates in the evaluation? • Range of results • Interfacing with clinicians • Need for the right evaluation method for the purpose

  15. Findings: Perspectives of evaluation “It is time to develop evaluation tools that measure the changes in people and relationships that we are seeing every day when we work with these tools. It might not change someone's HbA1c overnight but it might mean they connect with family again or talk to their health professionals more or ask for help before crisis hits”. (Participant 16)

  16. So what does this mean? • When planning service delivery, also plan its evaluation (plan from the beginning) • Use a wide range of evaluation methods and ensure they are used in the appropriate context • Engage all of the stakeholders • Share methods and findings with others • Collaborate with and learn from others working in the field to prevent reinventing the wheel

  17. Thank you Rebecca Taylor, Postdoctoral Research Fellow Rebecca.Taylor@anu.edu.au
