
Initiative/Portfolio Evaluation: A Four-Step Model


Presentation Transcript


  1. Initiative/Portfolio Evaluation: A Four-Step Model Huilan Yang, Ph.D. New Frontiers in Evaluation Vienna, Austria April 25, 2006

  2. Overview • Overview of Evaluation at the W.K. Kellogg Foundation • What is an initiative? • What are the characteristics of an initiative evaluation? • Description of the Four-Step Model • What are the caveats of using this model? • Does the model work? • Some practical implications • Some afterthoughts and reflections

  3. Some Humor • DEMOCRATIC You have two cows. Your neighbor has none. You feel guilty for being successful. Barbra Streisand sings for you. • REPUBLICAN You have two cows. Your neighbor has none. So? • COMMUNIST You have two cows. The government seizes both and provides you with milk. You wait in line for hours to get it. It is expensive and sour. • CAPITALIST, AMERICAN STYLE You have two cows. You sell one, buy a bull, and build a herd of cows.

  4. Part I Overview of Evaluation at the Kellogg Foundation

  5. The W.K. Kellogg Foundation • $7.3B in assets, established in 1930 • Independent entity from the Kellogg Co. • Annual grantmaking of $200M+ in the US, Latin America & the Caribbean, and Southern Africa • Five domestic programming areas • Food System & Rural Development, Philanthropy & Volunteerism, Health, Youth & Education, and the local Battle Creek hometown • Six Impact Services • Evaluation, Communication, Policy, Technology, Meeting Services, Learning

  6. Geographic Location: Battle Creek

  7. Historical Context • Cluster evaluation started in the late 1980s • Overall purpose: learn from and determine the impact of national social change work • Common issue, but local variation encouraged • Over time – many definitions, applied in many different types of programs

  8. Cluster Evaluation and Multi-Site Evaluation

  9. Along Came Initiatives… • Late 1990s – WKKF and other foundations increasingly focused on systems change • WKKF distinguished between • CLUSTERS – exploratory, designed to learn about a new field of work (3–5 years of funding) • INITIATIVES – systems change, driven by theory, developed in stages (funding for up to 10 years)

  10. How Does Funding Strategy Influence Evaluation?

  11. Influence…

  12. Influence…

  13. Part II What is an Initiative?

  14. Define “Initiatives” • At the Kellogg Foundation, an initiative is defined as a strategically designed systemic change effort: • with clear outcomes and long-term, sustainable impact • with an intensive focus on sustainability • requiring a clear theory of change and/or logic model • with a clearly defined target system and population • involving multiple projects

  15. Part III Initiative Evaluation Characteristics

  16. Initiative Evaluation* • Uses multiple sites (ranging from 5 to 30) that vary in implementation approaches and/or outputs; • Assumes that situations are dynamic, changing, and continuously evolving; • Is conducted in a way that is philosophically congruent with the change theory of the initiative; • Includes a focus on learning from the different strategies and contexts of the funded projects; • Obtains multiple perspectives throughout data collection, analysis, synthesis, interpretation, communication, and use; • Uses networking conferences or other gatherings of the initiative’s project teams for group analysis of initiative evaluation data and reflection on its meaning. * The document from which these are cited has not yet been published by WKKF, so please do not reference or quote it.

  17. Part IV The Four-Step Model

  18. Some Humor • AMERICAN CORPORATION You have two cows. You sell one, lease it back to yourself and do an IPO on the 2nd one. You force the two cows to produce the milk of four cows. You are surprised when one cow drops dead. You spin an announcement to the analysts stating you have down-sized and are reducing expenses. Your stock goes up. • FRENCH CORPORATION You have two cows. You go on strike because you want three cows. You go to lunch and drink wine. Life is good. • ITALIAN CORPORATION You have two cows but you don't know where they are. While ambling around, you see a beautiful woman. You break for lunch. Life is good. • GERMAN CORPORATION You have two cows. You engineer them so they are all blonde, drink lots of beer, give excellent quality milk, and run a hundred miles an hour. Unfortunately they also demand 13 weeks of vacation per year.

  19. Content: What is Being Evaluated? • Philanthropy is defined as the giving of time, money, and know-how to advance the common good. • Philanthropy and volunteerism grantmaking at the W.K. Kellogg Foundation is aimed at increasing the ranks of new givers and nurturing emerging forms of philanthropy. Through the program activities, we seek to unleash resources by supporting the emergence of new leaders and donors, creating and sharing knowledge, and building tools that advance the effectiveness of the philanthropic sector.

  20. Unleashing Resources Initiative

  21. Programming Talk: Initiative → Clusters → Projects

  22. Evaluation Talk: Initiative Evaluation → Cluster Evaluation → Project Evaluation

  23. Initiative Evaluation Means… • Establish common high-level goals or outcomes • Coordinate data collection methods, data sources, processes, and/or instruments • Identify major stakeholder needs • Build productive relationships among evaluators at different levels

  24. Unleashing Resources Initiative Evaluation • Evaluation is conducted at all three levels • Initiative level • Cluster level • Project level • Nested, multi-layered, multi-dimensional nature • Project: its contribution to cluster & initiative success, plus its unique aspects • Cluster: its contribution to initiative success, its interdependency with other clusters, plus its unique aspects • Initiative: overall initiative success, unique contributions from projects and clusters, plus learning about strategies and informing future grantmaking

  25. Without Alignment…. • Evaluation efforts at different levels will fragment in many different directions, making the evaluation, at best, simply a collection of independent projects. • Duplicated data collection will waste human and financial resources. • Some data sources might be tapped more than once for the same information, adding tremendous burden and causing a loss of goodwill.
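To make that coordination problem concrete, here is a minimal Python sketch, purely illustrative and not from the presentation: a registry (all names hypothetical) that flags when two evaluation levels plan to tap the same data source with the same instrument, so the request can be shared rather than repeated.

```python
# Illustrative sketch only: class, method, and field names are hypothetical.
from collections import defaultdict


class DataCollectionRegistry:
    """Tracks which evaluation level plans to tap which data source."""

    def __init__(self):
        # (source, instrument) -> evaluation levels requesting it
        self._requests = defaultdict(list)

    def register(self, level, source, instrument):
        self._requests[(source, instrument)].append(level)

    def duplicates(self):
        """Yield planned requests that would hit one source more than once."""
        for (source, instrument), levels in self._requests.items():
            if len(levels) > 1:
                yield source, instrument, levels


registry = DataCollectionRegistry()
registry.register("project", "grantee staff", "annual survey")
registry.register("cluster", "grantee staff", "annual survey")
registry.register("initiative", "WKKF program staff", "interview")

for source, instrument, levels in registry.duplicates():
    print(f"Coordinate: {levels} each plan a '{instrument}' with {source}")
```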

  26. Birth of the Four-Step Model • Alignment of all levels of evaluation • Exhaustive data sources • Efficient, yet effective data collection • Minimize duplicated efforts • Meet all stakeholders’ needs: projects, project constituents, and WKKF • Best use of existing data: annual reports, evaluation reports, networking meetings, etc.

  27. The Four-Step Model

  28. Initiative Evaluation

  29. The Logic Model as the Guide & Foundation for Evaluation • A logic model is a systematic and visual way to present and share your understanding of the relationships among the resources you have to operate your program, the activities you plan to do, and the changes or results you hope to achieve (W.K. Kellogg Foundation, 2001). • www.wkkf.org -> knowledgebase -> toolkits -> evaluation -> Logic Model Development Guide
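As a rough illustration of the definition on slide 29, the sketch below encodes a logic model's component categories (resources, activities, outputs, outcomes, impact, the components used in the WKKF Logic Model Development Guide) as a simple data structure. It is a hypothetical rendering, not a WKKF artifact; the example entries paraphrase this presentation.

```python
# Hypothetical sketch: field names follow the guide's component categories,
# but the example entries are paraphrased from this presentation.
from dataclasses import dataclass, field


@dataclass
class LogicModel:
    resources: list[str] = field(default_factory=list)   # what you have to operate the program
    activities: list[str] = field(default_factory=list)  # what you plan to do
    outputs: list[str] = field(default_factory=list)     # direct products of activities
    outcomes: list[str] = field(default_factory=list)    # changes you hope to achieve
    impact: list[str] = field(default_factory=list)      # long-term systemic change


uri_logic_model = LogicModel(
    resources=["WKKF grant funding", "grantee know-how"],
    activities=["support leaders", "create and share knowledge",
                "develop tools for effective giving"],
    outcomes=["primary groups more visible and engaged in philanthropy"],
    impact=["a mutually responsible and just society"],
)
```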

  30. LOGIC MODEL for the Unleashing Resources Initiative • Vision: Building a Mutually Responsible & Just Society in which All Have the Ability & Will to Contribute to the Common Good

  31. From Logic Model to Evaluation Questions • [Diagram: the Unleashing Resources Initiative at the initiative level; the Leaders, Knowledge, and Tools clusters at the cluster level; grantees in each cluster at the project level, all feeding the evaluation questions below] • Leaders cluster: 1. To what extent are the primary groups (including women, youth, and communities of color) becoming more visible and engaged in the field of philanthropy? 2. What level of access do the primary groups have to real opportunities for social change and for influencing the facilitating groups? 3. In what ways do the primary groups engage with and influence mainstream philanthropy? 4. Does the networking of the six population groups succeed in promoting sharing, cooperation, and collaboration? 5. Does the networking of the six population groups succeed in promoting sharing, cooperation, and collaboration? • Knowledge cluster: 6. How accessible, useful, and applicable is the knowledge generated to practice in philanthropy? • Tools cluster: 7. How accessible, useful, and applicable are the tools developed to practice in philanthropy? 8. Are innovative ways of giving determined and advanced?

  32. From Evaluation Questions to Indicators • Evaluation questions alone are too abstract to guide data collection. • To create common ground for all grantees, cluster evaluators, and initiative evaluators, indicators are developed. • Indicators serve as working definitions for each evaluation question • Evaluation question = outcome • Two types of indicators • Quantitative • Qualitative

  33. From Indicators to Evaluation Sub-questions • For each indicator, an evaluation sub-question is generated to guide data collection. • Sub-questions need to be: • Directly based on the indicators • Different sets of questions for grantees in different clusters to triangulate data • Blueprint for instrument development • Cooperation and collaboration across all evaluation efforts • Balance between qualitative and quantitative information

  34. Aligning Data Collection and Analysis Efforts with Aligned Evaluation Sub-Questions • Make evaluation sub-questions available to evaluators at different levels to: • Help the evaluators understand the interrelations among the sub-questions • Provide useful guidance to evaluators as to how to collect and analyze data to address each sub-question and, when needed, where to find additional help

  35. To Sum Up • Operationalize the TOC/LM – make sense of the theoretical framework to assess the initiative’s evaluability • Turn outcomes into overarching evaluation questions • For each question, identify a list of feasible and desired indicators • Generate evaluation sub-questions from the indicators identified • Clusters and projects “pick” indicators and use the evaluation sub-questions as a guide to collect data • Initiative evaluators • Aggregate data from cluster/project evaluations • “Fill in the blanks” with their own data collection (see the sketch below)
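Here is a minimal Python sketch of that chain, with hypothetical names and simplified types (none of this is WKKF code): outcomes become overarching questions, each question carries indicators tagged quantitative or qualitative, each indicator carries the sub-question that guides data collection, and the initiative level pools whatever the clusters and projects report, treating unanswered sub-questions as the "blanks" it must fill itself.

```python
# Hypothetical sketch of the four-step chain; all names are illustrative.
from dataclasses import dataclass, field


@dataclass
class Indicator:
    text: str
    kind: str           # "quantitative" or "qualitative"
    sub_question: str   # guides data collection for this indicator


@dataclass
class EvaluationQuestion:
    outcome: str        # each overarching question operationalizes one outcome
    text: str
    indicators: list[Indicator] = field(default_factory=list)


def aggregate(question, reports):
    """Pool cluster/project data by indicator; indicators nobody reported on
    are the 'blanks' the initiative evaluators must fill themselves."""
    pooled = {ind.text: [] for ind in question.indicators}
    for report in reports:  # one dict per cluster or project evaluation
        for indicator_text, data in report.items():
            pooled.setdefault(indicator_text, []).append(data)
    blanks = [text for text, data in pooled.items() if not data]
    return pooled, blanks


q1 = EvaluationQuestion(
    outcome="Primary groups visible and engaged in philanthropy",
    text="To what extent are the primary groups becoming more visible "
         "and engaged in the field of philanthropy?",
    indicators=[
        Indicator("Representation on boards and staff", "quantitative",
                  "How many primary-group members serve on boards and staff?"),
        Indicator("Perception of the population", "qualitative",
                  "Do people in communities of color feel engaged?"),
    ],
)

pooled, blanks = aggregate(q1, [{"Representation on boards and staff": 12}])
print(blanks)  # -> ['Perception of the population']
```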

  36. A Successful Case* • Tools Cluster Evaluation • Logic Model was developed using the URI LM as a guide, keeping in mind the needs of the cluster • Guidebook developed by URI evaluators is referenced when identifying data points • “It (the Guidebook) saves me a lot of time trying to figure out what I need to do to ‘make Kellogg happy’ in terms of data needs…..In addition, I know the data I collect will have dual purposes.” * Handouts of Tools Cluster Logic Model and Evaluation Plan chart

  37. Once upon a time, there was a prince…. • …the crystal slipper fit her perfectly. So Cinderella got onto the horse-drawn carriage and was driven to the Palace, where the prince proposed to her. She accepted the proposal. They got married, and a grand ball was given to celebrate the wedding. • And they lived happily ever after. The End

  38. Interrelationship Among Expected URI Outcomes

  39. Four Sets of Evaluation Questions • To what extent have the outcomes 1, 2, 4, and 5 been achieved as a direct result of URI approaches? We consider this as evaluation of immediate URI outcomes. • How has the achievement of outcomes 1, 2, 4, and 5 helped promote the recognition of the traditions of giving among the three primary groups in society? And how has the achievement of outcomes 1, 2, 4, and 5 promoted innovative ways of giving (outcome 3)? We consider these as evaluation of intermediate URI outcomes. • Have such recognition and promotion improved the practice of mainstream giving? If yes, in what ways has the practice of mainstream giving been influenced by the primary groups (outcome 6)? And has giving been increased among the six population groups as a result of recognition and promotion (outcome 7)? We consider these as evaluation of long-term URI outcomes. • What social changes of WKKF interests are advanced by URI? We consider this as evaluation of URI impact.

  40. For Example….

  41. Indicator List • Representation on boards and staff • Attendance at critical sector meetings (substantive contribution on panels) • Contracts/partnerships profile • Types of research, content focus, perspectives provided/sought after • Number of fellowships awarded to primary groups (e.g., ARNOVA) • Ways research on primary groups’ engagement was made visible (e.g., ARNOVA) • Degree of influence on the visibility of research on primary groups’ engagement (things are done differently) [ripple effect] • Topics and themes of conferences and seminars • Profile of new givers not there before • More culturally relevant tools to engage donors of color • Perception of the population • Do people in communities of color feel engaged? • Do they think of themselves as actors/givers? • Amount given ($, time, know-how) by primary groups • More structures supporting giving of new funds, etc. – women, youth, and communities of color (are there more?) • New/additional structures for giving that are unique & appropriate to cultures • New practices emerged in giving • Innovations in giving (what are they?) • Changes in ways of working in grantee organizations

  42. Philanthropy & Volunteerism Theory of Change • Vision: A mutually responsible and just society in which all have the ability and will to give time, money, and know-how to the common good • Goal: Increased engagement in social change by youth, women, and people of color for effective and innovative impact • Outcomes: Strengthened leaders and organizations; Increased access to decision-making; Improved access to and control of time, money, and know-how; New solutions and innovations in giving; Supported collective and representative leadership; Linked infrastructure organizations and support systems • Program Strategy: Support leaders and innovators; Create and share knowledge; Develop tools for effective giving; Value, honor, celebrate, promote, and connect traditional & new approaches to giving; Push networks and support systems to be more effective

  43. Part V Reflections

  44. Words of Caution • Initiative evolution → evaluation modification • Stay flexible and open-minded, looking for unintended results • Always keep your eyes on the ball; transcend the trivial stuff • Intent: not to establish a cause-and-effect relationship • Evaluation: improving – a plausible relationship between what is done and what is achieved • Research: proving, with an emphasis on making judgments

  45. Revelations • Traditional thinking and approaches to evaluation are not appropriate for developmental, evolving, or organic initiatives (where parameters are not clearly defined). • When initiatives come into existence based largely on program staff’s intuition, evidence-based evaluation no longer serves the purpose. • Formative evaluation may not be an alternative, either. • The timing is off when the TOC changes; data collected under the “original” evaluation design becomes outdated. • In resource-rich and time-poor environments, there are competing/conflicting priorities for evaluation: • Accountability (outcomes), Improvement (process), Learning (for what?)

  46. Afterthoughts • Portfolio/initiative evaluation & foundation (institution) evaluation • $30B/year in grantmaking by US foundations • Assessing foundation effectiveness would require $5M/year, yet it is not happening • What can we do to “transfer” experiences in evaluating portfolios/initiatives to evaluating institutions?

  47. Questions? Comments? Or Applause? hy1@wkkf.org 269-968-1611 (phone) Thank you!
