
Chapter 14 Evaluation


Presentation Transcript


  1. Chapter 14 Evaluation. Slides developed by Ronald W. Toseland, State University of New York at Albany

  2. Practitioner's Dilemma • Not enough time to conduct an evaluation • No administrative support • No access to work computers • Limited experience in conducting an evaluation • Partnerships with schools of social work can help

  3. Reasons for Conducting Evaluations • Benefits of Evaluations • Can satisfy workers' curiosity and professional concerns about the effectiveness of their work • Can help evaluate new intervention methods • Can improve leadership skills • Can demonstrate the usefulness of a group or group work service to an agency, funding source, or community

  4. Reasons for Conducting Evaluations • Workers can assess the progress of group members and whether the group is achieving its goals • Evaluations allow members and others to express their opinions about the group • Workers can gather knowledge at conferences and through newsletters and publications

  5. Reasons for Conducting Evaluations • Evaluations make overt the covert hypothesis-generating and hypothesis-testing processes workers engage in • Evaluations can examine the cost-effectiveness of group work services

  6. Selecting Data Collection Instruments • Progress notes • Self-reports and observations • Questionnaires • Analysis of reports or other products of the group

  7. Selecting Data Collection Instruments • Review of audio and videotapes • Observational coding schemes • Role plays and performance tests • Reliable and valid scales
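
A hypothetical illustration (not part of the original slides): when a data collection plan relies on "reliable and valid scales," internal consistency is one property a worker might check. The sketch below computes Cronbach's alpha from invented item scores; the data and the 0.80 rule of thumb are assumptions for illustration only.

# Minimal sketch: Cronbach's alpha for a set of scale items (invented data).
# Rows = group members, columns = items on a hypothetical satisfaction scale.
items = [
    [4, 5, 4, 3],
    [3, 4, 4, 4],
    [5, 5, 4, 5],
    [2, 3, 3, 2],
    [4, 4, 5, 4],
]

def variance(values):
    mean = sum(values) / len(values)
    return sum((v - mean) ** 2 for v in values) / (len(values) - 1)

k = len(items[0])                                  # number of items
item_vars = [variance([row[i] for row in items]) for i in range(k)]
total_var = variance([sum(row) for row in items])  # variance of members' total scores
alpha = (k / (k - 1)) * (1 - sum(item_vars) / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")           # values around 0.80 or higher are usually considered acceptable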

  8. Evaluation Methods • For planning a group • For monitoring a group • For developing a group • For testing the effectiveness and efficiency of a group

  9. Evaluations for Planning a Group • Ways to obtain program information • Examine records from previous groups • Attend workshops and conferences • Review relevant journals and books using computerized search procedures • Read the minutes of previous meetings • Read the bylaws of the sponsoring organization

  10. Evaluations for Planning a Group • Read any operating procedures that may exist from previous meetings of the task group • Be clear about the responsibilities of the group • Obtain information from other agencies about how similar objectives were accomplished • Attend meetings of groups with similar concerns • Conduct needs assessments

  11. Evaluations for Monitoring a Group • Decide what information to collect • For example, the five axes of the DSM • Decide how to collect the data

  12. Monitoring • Monitoring by the member • Self-observations • Logs • Diaries

  13. The Self-monitoring Process • Objectives and goals of the agency • Objectives and goals of the worker • Objectives and goals of the members • Problem is defined

  14. The Self-monitoring Process • Worker and group members agree on the tasks to be performed • Data are collected on the tasks • Information is used at the next meeting
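
The self-monitoring steps above note that data are collected on agreed tasks and brought back to the next meeting. As a minimal, hypothetical sketch (field names and example entries are invented), a structured task log a member might keep between sessions could look like this:

from dataclasses import dataclass
from datetime import date
from typing import List

@dataclass
class TaskLogEntry:
    """One self-monitoring record kept by a group member between sessions."""
    entry_date: date
    task: str         # task agreed on by the worker and the member
    completed: bool
    notes: str = ""   # member's own observations, diary-style

# Example log brought back to the next meeting (invented entries)
log: List[TaskLogEntry] = [
    TaskLogEntry(date(2024, 3, 4), "Practice relaxation exercise", True, "Felt calmer afterward"),
    TaskLogEntry(date(2024, 3, 6), "Practice relaxation exercise", False, "Ran out of time"),
]
completion_rate = sum(e.completed for e in log) / len(log)
print(f"Task completion this week: {completion_rate:.0%}")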

  15. Monitoring • Monitoring by the worker • Progress notes • Group recording forms • Session evaluation forms

  16. Evaluations for Developing a Group • Identify the need or problem • Gather and analyze relevant data • Develop a new group program or method • Evaluate the new program or method • Modify the program based on the data obtained • Repeat steps 4 and 5 until the program is successful

  17. Evaluations for Developing a Group • Single-system designs • Before-and-after (AB) • Withdrawal (ABA) • Reversal (ABAB) • Multiple baseline • Changing criterion designs • Case study methods
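
As an illustrative sketch only, an AB single-system design compares observations gathered during a baseline phase (A) with observations gathered during the intervention phase (B). The weekly counts and the simple visual-analysis rule of thumb below are invented for demonstration.

# Minimal AB single-system design sketch: compare baseline (A) and intervention (B) phases.
# Scores are weekly counts of a target behavior for one member (invented data).
baseline = [7, 8, 6, 7, 9]         # Phase A: before the group intervention
intervention = [5, 4, 4, 3, 3, 2]  # Phase B: during the group intervention

def mean(xs):
    return sum(xs) / len(xs)

a_mean, b_mean = mean(baseline), mean(intervention)
print(f"Baseline mean: {a_mean:.1f}, Intervention mean: {b_mean:.1f}")

# A simple visual-analysis style check: how many intervention points
# fall below the lowest baseline point?
below_baseline = sum(1 for x in intervention if x < min(baseline))
print(f"{below_baseline} of {len(intervention)} intervention points fall below the baseline range")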

  18. Evaluations for Determining Effectiveness and Efficiency • Goal attainment scaling • Pressing Problem Index • Experimental designs • Quasi-experimental designs, such as before-and-after designs with valid and reliable scales • Efficiency evaluations
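
Goal attainment scaling, listed above, is often summarized with the Kiresuk and Sherman T-score, which combines each goal's weight and its attainment rating (from -2, much worse than expected, to +2, much better than expected) into a standardized score centered at 50. The sketch below is illustrative only: the goals, weights, and ratings are invented, and the inter-goal correlation is set to the conventional 0.3.

import math

# Goal attainment scaling sketch: each goal gets a weight and an attainment
# rating from -2 to +2. Goals, weights, and ratings are invented.
goals = [
    {"goal": "Attend weekly sessions", "weight": 2, "rating": +1},
    {"goal": "Use coping skills at home", "weight": 3, "rating": 0},
    {"goal": "Reduce conflict with family", "weight": 1, "rating": -1},
]

rho = 0.3  # conventional assumed inter-goal correlation
weighted_sum = sum(g["weight"] * g["rating"] for g in goals)
sum_w = sum(g["weight"] for g in goals)
sum_w_sq = sum(g["weight"] ** 2 for g in goals)

# Kiresuk and Sherman's T-score: 50 means goals were met exactly as expected.
t_score = 50 + (10 * weighted_sum) / math.sqrt((1 - rho) * sum_w_sq + rho * sum_w ** 2)
print(f"Goal attainment T-score: {t_score:.1f}")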

  19. Choosing Measures • Choose the variables to be measured • Examine what data collection method is possible and convenient • Use reliable and valid measures when possible • Don’t construct your own measure if one is available • Use multiple measures whenever possible

  20. Types of Measures • Self-report measures • Rapid assessment measures • Observational measures • Products of group interaction
