Rethinking evaluation



  1. Rethinking evaluation. Heather King, King's College London

  2. Why do evaluation?
  • To improve the activity
  • To improve your own practice
  • To identify new insights and 'unanticipated consequences'
  Evaluation is not a test of success or failure, but a tool to help.

  3. 1. 'To improve the activity'
  Action research: research that leads to change in practice through an immediate feedback cycle that alternates between action and the study of those actions.
  Action evaluation:
  • incorporates evaluation as an ongoing and natural component of the planning and implementation of an activity
  • places the practitioner in a central role within the evaluation

  4. The 'backwards' approach
  1. What is the aim? What is your 'vision'?
  2. How will you know that you are achieving your aim? What will you see / hear / be able to record? (Indicators)
  3. What can you do to achieve your aim, i.e. what programmes/activities do you need to develop?

  5. 1. Using evaluation to plan

  6. 2. Using evaluation to plan

  7. 3. Using evaluation to plan

  8. Indicators
  What will visitors gain as a result of the experience?
  • Learning of new content
  • An affective experience
  • Attitude change
  • New ideas
  • Inspiration
  • Physical / emotional / social skills
  To capture these indicators: pre- and post-tests; observations; interviews.

  9. 2. To improve your own practice
  Evaluation as reflection on:
  - Personal practice
  - Team practice
  You need to be open to the idea of improvement, and to acknowledge that there is a problem (and room for improvement).

  10. Techniques for monitoring practice
  • Video yourself in action (note the ethical considerations of filming visitors)
  • Record interactions (audio only)
  • Keep notes / journals of practice

  11. 3. To identify new insights and unanticipated outcomes
  Note: outcome-based evaluations
  • neglect the significance of unintended outcomes
  • narrow what the institution offers, by changing services in response to feedback tied to a limited set of objectives
  Pekarik argues for participant-based evaluation: ask about processes, settings, needs, values, barriers, etc. Find out not only what happened but why.
  Pekarik, A. J. (2010). From knowing to not knowing: moving beyond 'outcomes'. Curator, 53(1), 105-115.

  12. Reframing assessment
  Type 1 assessment: pre- and post-tests, designed to test specific information about a learner
  Type 2 assessment: observations and interviews, but still framed within a set of pre-determined judgements
  Type 3 assessment: in-situ activity, not framed by any particular judgement as to what counts
  A shift from looking at what is in the learner's head to looking at the learner's performance and meaningful participation.
  (Michalchik, V. & Gallagher, L. (2010). Naturalizing assessment. Curator, 53(2), 209-219.)

  13. Your turn...
  1. To improve the activity: what indicators would you look for?
  2. To improve your own practice: how would you monitor your practice?
  3. To identify new insights and 'unanticipated consequences': what do you think about Type 3 assessment? How would you do it?

  14. How to do evaluation: a quick overview
  • Front-end: what the audience already knows, what they don't know, what they are interested in
  • Formative: exposing prototypes to audiences to discover what works and what does not
  • Remedial: evaluation near the end of exhibition development, performed so that one last round of improvements can be made
  • Summative: evaluation of the final product
  • Quantitative: answers 'how much did we do?'
  • Qualitative: answers 'how well did we do?'
  Conducting both provides a well-rounded synopsis of the impact and success of the activity.

  15. Methods
  Quantitative (though these may include qualitative elements):
  • Experimental studies: pre- and post-tests of knowledge gain
  • Surveys / questionnaires: qualitative if they contain open-ended questions
  Qualitative:
  • Ethnography: study of the complex behaviour of a group over time
  • Observations: ethnographic in approach, but over the duration of a session
  • Case study: provides an in-depth view of one person; serves to elaborate quantitative data
  • Participant materials (e.g. reflective journals, post-it notes, writing, drawings, photos)
  • Interviews (note: training is needed for effective interview techniques): these could be 'snapshot', i.e. a short five minutes, or recall interviews conducted one or two weeks later
  • Focus groups (note: training is needed)

  16. Making sense of data
  • Relate quantitative to qualitative data: use one to help explain the other
  • Compare findings from different measures
  • Interpret the findings in relation to the particular context
  • Present findings so they are readable
  • Use the findings to inform practice
