
Evaluating Collaboration

Evaluating Collaboration. National Extension Family Life Specialists Conference, April 28, 2005. Ellen Taylor-Powell, Ph.D., Evaluation Specialist, University of Wisconsin-Extension.


Presentation Transcript


  1. Evaluating Collaboration. National Extension Family Life Specialists Conference, April 28, 2005. Ellen Taylor-Powell, Ph.D., Evaluation Specialist, University of Wisconsin-Extension

  2. Types of evaluation questions • The outcomes are broad and complex. How do we get started? • Is evaluating process good enough or do we have to evaluate outcomes? • Who should be involved in evaluating a collaborative program? • I’m not in charge. How do I evaluate it? • How do I take credit for something that we’ve done together?

  3. Issues and challenges • Power and control • Process of the evaluation • Data • Standards and quality of the evaluation • Cross-cultural issues • Measurement issues • Attribution • Taking credit

  4. Collaborative evaluation (not evaluation of collaboration) • Since the mid-1970s, a new paradigm of participatory evaluation: “applied social research that involves trained evaluation personnel…and practice-based decision makers working in partnership” (Cousins and Earl, 1992) • Multiple approaches, from broadening decision making (practical) to emancipation and social change (transformative) • Emphasis on using data collection and feedback to strengthen and monitor collaboration and thus increase overall effectiveness and efficiency • Value in the process of evaluation, “process use” (Patton, 1997), as much as in the product

  5. Who controls? Who participates? How much? [Diagram adapted from Cousins and Whitmore, 1998: three continua of participation. Control of the evaluation: researcher control to practitioner/participant control. Who participates: primary users to all legitimate groups. How much: consultation to deep participation.]

  6. First… • Who wants to know what? • For what purpose? • How will information be used?

  7. Building a logic model of collaboration [Diagram: SITUATION leads to INPUTS, OUTPUTS, and OUTCOMES, spanning both the collaborative relationship and the collaborative product; EVALUATION assesses collaborative effectiveness throughout. Assumptions and external factors are noted.]

  8. Collaboration: Theory of change [Diagram: Partners build the collaborative relationship (individual members and the group), implement activities and an action plan, monitor and evaluate, and communicate with clientele, users, policy makers, and publics. Early outcomes: an effective functioning partnership and member satisfaction; change in knowledge, attitudes, skills, motivation, intent, and self-efficacy. Intermediate outcomes: change in behaviors and decision making; policy changes; funding. Longer-term outcomes: changes in conditions, including system and community changes. Supporting elements: value-added, research-based work, advocacy/policy, capacity building and technical assistance. Prompt: WHAT DO YOU WANT TO KNOW?]

  9. Evaluating the Collaborative Relationship 1. Process evaluation • How is it functioning? How effective is the group work? Are we likely to achieve our desired results? How satisfied are members? • Questions about capacities, operations, climate, context • Factors influencing success • Projected tasks/activities relative to stages of development • Milestones and Critical Events (journey)

  10. MILESTONES • Significant points along the way • Examples • Key stakeholders on board • Vision statement established • Grant secured • Action plan formulated – plan of work • Project implemented/service provided • Project evaluated CRITICAL EVENTS • Unexpected events, positive and negative • Progress markers • Evidence of accomplishments • Disruptions or obstacles • Examples • Change in membership • Policy change • New donor added

  11. 2. Outcomes (Process outcomes): • What difference has being a part of this group made for the individual? • Knowledge, skills, motivations, behaviors, etc. • Human capital development • What difference is there for the group? • Group functioning, identity, resource pooling, etc. Note: Outcomes can be positive, negative, or neutral

  12. Methods • Informal feedback • Member (partner) survey • Member (partner) interviews • Group discussions • Key informant interviews • Observation • Identification and use of indicators • Network analysis; sociogram • Use of existing materials (integrate into ongoing operations) • Minutes of meetings • Logs: telephone, event, registration forms • Management charts WHEN? Periodic review; points of particular concern
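The network analysis listed above is often built from member-survey nominations ("Which partners do you work with most often?"). A minimal, stdlib-only sketch of tallying those nominations into in-degree counts for a sociogram; the member names and data here are hypothetical, not from the presentation:

```python
from collections import Counter

# Hypothetical survey data: each member lists the partners they
# work with most often.
nominations = {
    "Alice": ["Ben", "Carla"],
    "Ben": ["Alice"],
    "Carla": ["Alice", "Ben"],
    "Dana": ["Alice"],
}

def in_degree(noms):
    """Count how often each member is named by others (in-degree)."""
    return Counter(name for named in noms.values() for name in named)

degree = in_degree(nominations)
# Members named most often are candidates for central positions
# when the sociogram is drawn.
central = degree.most_common()
```

Here Alice is named by three partners, suggesting a hub role; a dedicated graph library could extend this to full sociogram layouts, but the tally alone already surfaces central and peripheral members.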

  13. Tools - Techniques • Community Group Member Survey • Collaborative Relationship scales • Internal collaborative functioning scales • Plan Quality Index • Meeting effectiveness inventories • Stage of readiness • On-line Wilder Collaboration Factors Inventory (Amherst H. Wilder Foundation) • On-line Partnership self-assessment tool (Center for Advancement of Collaborative Strategies in Health)

  14. Evaluating programs/products created or implemented by the collaboration • Process or implementation evaluation (Focus: program delivery vs. coordination or support role) • How is the program being implemented? Fidelity to plan? Extent of delivery? Participation? What has happened that wasn’t planned?

  15. Outcome evaluation • What is different? For whom? How? To what extent? • For: Individuals, Groups/Families, Agencies, Systems, Communities • Changes in …

  16. Change in :

  17. Tools - Techniques • Monitor implementation • Logs, management charts • Interviews • Observations • Achievement of outcomes • Clientele surveys • Clientele interviews • Observations • Mixed methods

  18. Evaluating self - Taking credit • Mutual (reciprocal) accountability • How do I take credit for my part? How does Extension gain visibility, recognition? • What is your contribution? What role did you play? What value did you bring? • Document role you play, your activities and contributions, inputs you bring, resources you make available, niche, value…

  19. Your contribution • Log of activities, roles played • Record inputs, resources contributed • Management chart; analysis of minutes • Independent assessment • Survey • Interviews

  20. Your (partner) performance: • Most important indicator: other partners’ satisfaction with your performance (Brinkerhoff, 2002) • Mutual assessment among partners of each partner’s performance. The resulting discussion of discrepancies yields powerful information sharing and trust building. (We aren’t very good at this type of thing)

  21. Web address • http://www.uwex.edu/ces/pdande
