
Participants’ contributions

Practical approaches for evaluating the impact of health and social care students/practitioners learning with, from and about each other. 22 February 2005, King’s College London (HE Academy).




Presentation Transcript


  1. Practical approaches for evaluating the impact of health and social care students/practitioners learning with, from and about each other. 22 February 2005, King’s College London (HE Academy)

  2. Practical approaches to evaluating the impact of learning with, from and about each other • Evaluating IPL: make IPL explicit; update on the evidence from evaluations of IPE; criteria against which one can measure impact • Evaluation process — choices: evaluation studies and the use of different methodologies; evaluation methods for academic-based and practice-based learning • Tensions: methodological issues around ethics and access to students; researching on and researching with people; methods that are transferable to, for example, groups of busy professionals/managers working in more integrated services; the organisational context within which learning happens; the balance of developing distinctive professional roles whilst also recognising the benefits of interprofessional working, and how to engage the students in this; managing (far too much) data from the students • Impact: research issues that underpin impact evaluation; interim findings; translating researching with people into evaluative practices based on transforming multiprofessional teams; the impact our current interprofessional education has on future professional relationships; impact on integrated care management and delivery (Participants’ contributions)

  3. Workshop plan • 10.30–10.45 Introduction to the day (MRH) • 10.45–11.45 Participants’ Challenges (Plenary) • 11.45–12.00 Coffee • 12.00–12.30 Evaluation Rigour (MRH) • 12.30–13.00 Participatory inquiries and evaluation (MH) • 13.00–14.00 Lunch • 14.00–15.00 The Challenges Surgery (Small groups) • 15.00–15.30 Meeting the Challenges (Plenary) • 15.30–15.45 Tea • 15.45–16.30 Evaluation Impact (Plenary) • 16.30–17.00 Messages from the Day (Plenary)

  4. The day ahead — learning with, about and from each other: interaction and action; your challenges, your responses, your audiences

  5. Your Challenges • Introduce yourself • Outline of your IPE involvement • Key issue(s) for your evaluation practice: questions, challenges, critical incidents, issues identified in or for evaluation

  6. Workshop plan (repeats slide 3)

  7. Interprofessional Education • Members (or students) of two or more professions associated with health or social care engaged in learning with, from and about each other. • An intervention to secure interprofessional learning and promote gains through interprofessional collaboration in professional practice.

  8. Evaluation approaches — deliberate choices! Paradigms: positivist; interpretive and illuminative; change. Rigorous & robust.

  9. Evaluation ‘logic loop’ — proving and improving: Quality, Outcomes, Effectiveness, Impact

  10. Outcomes

  11. proving — Hamdy et al. (forthcoming BEME systematic review): predictive values of assessment measurements obtained in medical schools and future performance in medical practice • Prospective or retrospective • Unbiased selection of subjects • Similarity of correlated construct • Psychometric characteristics of measuring instruments • Use of appropriate statistics • Attrition bias

  12. improving — Four guiding principles: research should be • contributory in advancing wider knowledge or understanding; • defensible in design by providing a research strategy which can address the evaluation questions posed; • rigorous in conduct through the systematic and transparent collection, analysis and interpretation of qualitative data; • credible in claim through offering well-founded and plausible arguments about the significance of the data generated. Ref: HM Government Strategy Unit

  13. Making sense • Makes a contribution • Has a defensible design • Was conducted rigorously • Makes credible claims

  14. Contribution • Assessment of current knowledge • Identified need for knowledge • Takes organisational context into account • Transferability assessed — Take the test

  15. Has a defensible design • Theoretical richness • Evaluation question(s) • Clarity of aims and purpose • Criteria for outcomes and impact • Resources • Chronology — Take the test

  16. Conducted rigorously • Ethics and governance • Clarity and logic of sampling, data collection, analysis, synthesis and judgements — Take the test

  17. Take the test • Makes credible claims

  18. Take the test — Reporting credible claims

  19. Workshop plan (repeats slide 3)

  20. Participatory inquiry and evaluation • Uncertainty and tensions • Dimensions of participation • Ethics and the politics of invitation • Approaches: rapid (organisational) appraisal; ‘stakeholder’ evaluations • Action for rigour — “If you are a fish, what can you know about water?”

  21. Why do participatory evaluation? • Short-term weakness: ‘objectivity’, weighting • But … long-term strength: capacity building, credibility • Richness • Homology with interprofessional/collaborative practice?

  22. The Evaluatorscope as a metaphor for evaluators No simple solutions to complex issues and no point in looking for them!

  23. Uncertainty, tensions, challenges • Widening boundaries is “swimming into an unknown current” (Moustakas, 1990) • Conflicting expectations • Governance • Incompleteness • Redundancy of data (and too much) • £/time

  24. Dimensions of participation (Dick 1997)

  25. Ethics and the politics of invitation • Trust, respect, purposeful working relationships (!) • Methods theorised and politicised • Drawing the boundary – who is in and who is out • Responsibility and respons-ability • Reportage and ownership

  26. Example 1: Stakeholder evaluations • Critical evaluation • Up-front principles • Mixed methods • Visual energy (3rd point of reference, ‘power to’) • Reporting for difference – levels and language

  27. Example 2: Rapid (institutional) appraisal • Purpose • Appreciative inquiry in the present • Working out the details (differences, protocols …) • Creative management … • A climate in which it is safe to experiment • Interdisciplinary working groups, with external resource persons • Regular documentation and analysis • Scaling up

  28. Action for rigour • Methods in their (methodological, epistemological) contexts • Visible boundaries • ‘Recoverability’ (Checkland and Holwell 1998)

  29. The Challenges Surgery • Small groups • Take the role of critical friend/external advisor for at least three of the Challenges • Recommend ways forward • Suggest an action plan

  30. The Challenges Surgery Four groups Discussion questions: • What does impact really mean in terms of interprofessional learning and teaching? • What difference does scale make? Effective evaluation strategies for small, medium and large scale IPL&T • What are the epistemological and philosophical tensions in evaluating IPL&T? • How do you find out what it is to be interprofessional? How could this be meaningful for student assessment and evaluating IPE?
