
Evaluation as a Subversive Activity (Demonstrating the Value of Career Services)

This article discusses the value of incorporating evaluation into career services and highlights the subversive approach to challenging funders' attitudes and providers' practices. It provides a comprehensive framework for evaluating career interventions and emphasizes the need for evidence-based outcome-focused practice.




Presentation Transcript


  1. Evaluation as a Subversive Activity (Demonstrating the Value of Career Services) Kris Magnusson Dean of Education Simon Fraser University Bryan Hiebert Educational Psychology & Leadership Studies University of Victoria

  2. Overview • Background information • Evaluation approach • Implementation planning • Some sample tools

  3. Background and Rationale

  4. A Challenge by Canadian Policy Makers: “You haven’t made the case for the impact and value of career development services!” A research team formed in 2004 to follow up: • The Canadian Research Working Group for Evidence-Based Practice in Career Development • 10 researchers from 7 universities & 1 foundation

  5. Why “Subversive”? • The goals of funders of service do not always align with the goals of service providers • Funders may not recognize legitimate and useful results of service (e.g., progress towards a goal) • Practitioners often believe that they are too busy, and that “evaluation” is too complex • Evaluation is rarely planned at the DESIGN stage, where it can be most effective; it is usually “bolted on” to a program

  6. How to address the problem We need an approach that: • is comprehensive enough to include what is needed; • is simple enough for people to use; and • incorporates evaluation into standard practice

  7. What Gets “Subverted”? • Changed attitudes from funders • Changed practice from service providers • More evidence for the field to justify the power and efficacy of career interventions

  8. General Approach to Evaluation Showing worth, documenting impact, relating success stories ….

  9. Evidence-Based Outcome-Focused Practice Input → Process → Outcome Need to link process with outcome

  10. Evidence-Based Outcome-Focused Practice Input → Process → Outcome (focus: Outcome) • Indicators of client change • 1. Learning outcomes • Knowledge and skills linked to intervention • 2. Personal attribute outcomes • Changes in attitudes • Intrapersonal variables (self-esteem, motivation, independence) • 3. Impact outcomes • Impact of #1 & #2 on client’s life, e.g., employment status, enrolled in training • Societal, economic, relational impact

  11. Outcomes of Counselling • Client learning outcomes • Knowledge • Skills • Impact on client’s life • Client presenting problem • Economic factors • Third-party factors • + Precursors (personal attributes)

  12. Precursors intervene between learning outcomes & impact outcomes • Attitude • Motivation • Self-esteem • Stress • Internal locus of control • Belief that change is possible

  13. What outcomes are you achieving that are going unreported or unmeasured? • Client empowerment • Client skill development • Personal self-management skills • Increased client self-esteem • Client changes in attitudes • About their future • About the nature of the workforce • Client knowledge gains • Financial independence • Creation of support networks • More opportunities for clients These are legitimate areas for intervention

  14. Evidence-based Outcome-focused Practice Input → Process → Outcome (focus: Process) • 1. Activities that link to outputs or deliverables • Generic interventions • Working alliance, microskills, etc. • Specific interventions • Interventions used by service providers • Skills used by service providers • Home practice completed by students • 2. Programs offered by schools or agencies • 3. Involvement by 3rd parties • 4. Quality of service indicators • Stakeholder satisfaction, including students

  15. Evidence-based Outcome-focused Practice Input → Process → Outcome (focus: Input) • Resources available • 1. Staff • Number of staff, level of training, type of training • 2. Funding • Budget • 3. Service guidelines • Agency mandate • 4. Facilities • 5. Infrastructure • 6. Community resources

  16. Inputs • Number of staff • Staff level of training • % of staff time on various tasks • Staff-to-client ratio • Client case load • Program dollars • Client volumes • Types of client problems • Adherence to mandate • Cost-effectiveness Be realistic about what you can deliver, given the level of resources you can commit

  17. Outcome Focused Evidence-Based Practice Quality Improvement Input → Process → Outcome • Resources • Activities (counsellor & client) • Client change • Knowledge • Skill • Attribute • Impact

  18. Outcome Focused Evidence-Based Practice: Dynamic and Interactive [diagram: Resources → Counselling → Outcome, linking process to outcome]

  19. Intervention Planning & Intervention Evaluation [diagram: Intervention Planning Framework: Context: Client Needs → Client Goals → Counsellor Strategy & Client Strategy → Client Outcomes (Knowledge, Skills, Attributes, Impact)]

  20. Outcome Focused Evidence-Based Practice • Client: Context, Needs, Goals • INPUTS • Resources available • Context: structure of opportunity • Staff: number of staff, level of training, type of training • Funding: budget • Service guidelines • Facilities • Infrastructure • Community resources • PROCESSES • Activities that link to outcomes or deliverables • Generic interventions • Working alliance, microskills, etc. • Specific interventions • Strategies linked to specific client problems (stress, grief, depression, career, etc.) • Client home practice • Other • Programs & Workshops • Facilitation guides • Intervention manuals • External referral • OUTCOMES • Indicators of client (learner) change • Learning outcomes • Changes in knowledge and skills linked to the program or intervention used • Progress indicators / End-result indicators • Personal attribute outcomes • Changes in intrapersonal variables, e.g., attitudes, self-esteem, motivation, etc. • Progress indicators / End-result indicators • Impact outcomes • Changes in the client’s life resulting from application of learning

  21. Quality Service Delivery • Accessibility • Regular hours • Extended hours • Physical accessibility • Resources in alternate formats • Ease of access, who can access • Timeliness • % of calls answered by 3rd ring • Wait time for appointment • Wait time in waiting room • System requirements • Adherence to mandate • Completion of paperwork • Service standards • Staff credentials, competencies, resources • Service delivery • Client volumes • Client presenting problems • Number of sessions • Responsiveness • Respect from staff • Courteous service • Clear communication • Overall satisfaction • % rating service good or excellent • % of referrals from other clients Need to negotiate these with funders

  22. Negotiated outcomes • If you are lucky, funders might identify personal attributes [client motivation, improved job satisfaction, increased self-confidence] or knowledge, or skills, as accountability indicators • BUT more likely funders will identify impact outcomes [employment status, enrolment in training, reduced # of sick days, increased productivity, etc.] or inputs [client flow, accessibility, timeliness of paper work, etc.] • So service providers need to identify the knowledge, skills, personal attributes that will produce the impacts and negotiate these as accountability indicators Be careful what you promise to deliver BUT deliver what you promise

  23. Outcome Focused Evidence-Based Practice Input → Process → Outcome Need to link process with outcome • What will I do? • What are the expected client changes? • What do I expect clients to learn? • What sorts of personal attributes do I want my clients to acquire? • What will be the impact on their lives? • How will I tell? • Professional Practitioner

  24. Model Summary: Integrating Evaluation into Practice • Evaluation is part of service delivery • Not bolted onto the side of service • Service = Intervention + Outcome

  25. Implementation Planning

  26. Integrating Evaluation into Practice • Understand Service Foundations &Client Context • Describe Desired Outcomes • Describe Core Activities • Select Measures and Scales • Collect Evidence • Work with the Data • Report Results & Market your Services

  27. Understanding Service Foundations & Client Context What factors outside of the intervention might have an impact on the results? • Nature of services • Context of service

  28. Describing Desired Outcomes • What do we want to achieve? • Type of client change being sought? Be careful to avoid attempting to do too much in too short a time frame

  29. Describe Core Activities • What do we (service provider and client) need to do to achieve the outcomes? • Link services to outcomes

  30. Select Measures and Scales • What will be the indicators of success? • Process factors • How well did the service provider deliver the service as intended? • How well did the participants follow the program as intended? • Outcome factors • Indicators of progress + ultimate indicators of success • Knowledge • Skills • Personal attributes • Impacts

  31. Collect Evidence • How do we collect evidence most efficiently? • Process assessments should not take more than 5 minutes to complete • Outcome assessments should not take more than 10 - 15 minutes to complete

  32. Work with the Data • How do I make sense of the data I have collected? • Frequency counts and percentages • Mean scores • Measures of association In most cases, sophisticated statistics are not necessary
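The slide's point that sophisticated statistics are rarely needed can be sketched in a few lines of Python. The ratings below are hypothetical, standing in for a set of 0–4 workshop ratings; the technique is just the frequency counts, percentages, and mean the slide names:

```python
from collections import Counter
from statistics import mean

# Hypothetical post-workshop ratings on the 0-4 scale used in this deck
ratings = [3, 4, 2, 4, 3, 3, 4, 2, 3, 4]

counts = Counter(ratings)                                    # frequency counts
percentages = {r: 100 * c / len(ratings) for r, c in counts.items()}
avg = mean(ratings)                                          # mean score

print(counts)        # Counter({3: 4, 4: 4, 2: 2})
print(percentages)   # {3: 40.0, 4: 40.0, 2: 20.0}
print(avg)           # 3.2
```

Nothing beyond the standard library is needed, which matches the slide's advice to keep the analysis simple.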

  33. Report Results & Market your Services • How do we use the data to convince others? • Work from the macro to the micro • Demonstrate movement in your results • Ensure that decision-makers actually see the results of your work Tell the people who need to know in language that they can understand

  34. Points to Ponder • Consider these Assessment Alternatives • For group (workshop) interventions • For individual interventions • Not saying these are better (or worse) than what you are currently doing • Only encouraging you to consider other alternatives

  35. Examples of Subversive Tools and Approaches

  36. Assessment as Decision Making (vs. Judgement) Please use a two-step process: • First, decide whether your level of mastery of the attribute under consideration is unacceptable (0–1) or acceptable (2–4) • Then assign the appropriate rating • 0 = really quite poor • 1 = just about OK, but not quite • 2 = OK, but just barely • 3 = in between barely OK and really good • 4 = really very good

  37. Points To Ponder Trustworthiness … • Inter-rater reliability • Would 2 different people give the same ratings? • Intra-rater reliability • Would the same person give the same rating on 2 different occasions? More choice alternatives is not necessarily better

  38. Problem with skill self-assessment • Participants asked to rate their skill (or knowledge) before and after a program • Often, pre-workshop scores are high and post-workshop scores are lower • People find out as a result of the workshop that they knew less than they thought or had less skill than they thought • Based on the new awareness, post-scores are lower • People don’t know what they don’t know • How can we get around this problem?

  39. Assessing Learning & Attribute Outcomes: Post-Pre Assessment • We would like you to compare yourself now and before the workshop. Knowing what you know now, how would you rate yourself before the workshop, and how would you rate yourself now? • Please use a two-step process: • Decide whether the characteristic in question is unacceptable (0–1) or acceptable (2–4), then • assign the appropriate rating
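A post-pre change score is just the difference between each participant's retrospective "before" rating and "now" rating. A minimal sketch, using hypothetical (before, now) pairs on the deck's 0–4 scale:

```python
# Hypothetical post-pre data: each pair is one participant's
# (retrospective "before" rating, "now" rating) on the 0-4 scale
post_pre = [(1, 3), (0, 4), (2, 3), (1, 4)]

# Change score = now - before, computed per participant
changes = [now - before for before, now in post_pre]

print(changes)                       # [2, 4, 1, 3]
print(sum(changes) / len(changes))   # 2.5, the average self-assessed gain
```

Because both ratings are given after the workshop, they share the same frame of reference, which is what sidesteps the "people don't know what they don't know" problem described on slide 38.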

  40. Orientation Workshop: Summative Evaluation

  41. Orientation Workshop: Summative Evaluation Results • 156 ratings (6 questions × 26 people): • 84 ratings (54%) were unacceptable before the workshop • 0 ratings were unacceptable after the workshop • 6 ratings (4%) were excellent before the workshop • 108 ratings (69%) were excellent after the workshop • Mean scores before and after the workshop • Before, all were unacceptable (<2) • After, all were more than minimally acceptable (>3)
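The percentages on this slide follow directly from the 156-rating total; a short check, using only the slide's own numbers:

```python
# Checking the reported percentages against the 156-rating total
questions, people = 6, 26
total = questions * people            # 6 x 26 = 156 ratings

def pct(n):
    """Percentage of all ratings, rounded to the nearest whole percent."""
    return round(100 * n / total)

print(total)      # 156
print(pct(84))    # 54  (% unacceptable before)
print(pct(6))     # 4   (% excellent before)
print(pct(108))   # 69  (% excellent after)
```

All three reported percentages check out against the stated counts.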

  42. Orientation Workshop: Formative Evaluation [chart: item ratings on an unacceptable–acceptable scale]

  43. Process Results & Formative Feedback Big Picture, Core Motivators, and De-Motivators were most engaging & most useful. Possible Career Options and Reality Check were least engaging & least useful, perhaps because participants had done similar exercises before.

  44. Post-Pre Assessment: Bridging Health Care with Self-Care • Knowledge outcomes: • Knowledge of stress and stress control • Knowledge of how to manage personal change • Knowledge of nutrition and nutrition control • Personal status outcomes: • Level of stress • Level of nutrition (high = healthy) • Level of fitness • Confidence in ability to manage personal change Presenting the results

  45. Participant Self-Assessed Change

  46. Post-Pre Results: Bridging Health Care with Self-Care (Stress & Stress Control). Start of Program vs. End of Program; vertical axis indicates % of participants

  47. Informal Evaluation of Headache, Pain, and Related Affective States

  48. Self-Monitoring Headache 0 - No headache 1 - Low level, only enters awareness when you think about it 2 - Aware of headache most of the time, but it can be ignored at times 3 - Painful headache, but still able to continue job 4 - Severe headache, difficult to concentrate on demanding tasks 5 - Intense, incapacitating headache

  49. Headache Monitoring Grid [grid: headache level (0–5) recorded at each hour of the day (06:00 through 05:00), before vs. after treatment]
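Summarizing a monitoring grid like this can be as simple as averaging the hourly self-ratings for each period. A minimal sketch with hypothetical hourly ratings on the slide's 0–5 scale:

```python
from statistics import mean

# Hypothetical hourly self-ratings on the 0-5 headache scale,
# sampled over the same eight waking hours before and after treatment
before = [2, 3, 3, 4, 4, 3, 2, 2]
after = [1, 1, 2, 2, 1, 0, 1, 0]

# Mean headache level for each period
print(mean(before))   # 2.875
print(mean(after))    # 1.0
```

Plotting the two series on the grid's time axis, as the slide does, shows the same comparison visually.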
