
BEDI-II



Presentation Transcript


  1. BEDI-II Measuring what we value rather than valuing what we measure January 21, 2010 Dr. Terri Helmlinger Ratcliff, IES Jeffrey Debellis, SBTDC Raj Narayan, Kenan Institute for ETS

  2. BEDI • Benchmarking Economic Development Impacts • Task force began in January 2007 • January 2008 report

  3. Logic Model • INPUTS → ACTIVITIES → OUTPUTS → SHORT-TERM OUTCOMES → LONG-TERM OUTCOMES → IMPACT • Impact domains: Economic, Infrastructure, Resources, Empowerment, QOL • Adapted & focus-grouped • Looked beyond expenditure effects to "knowledge effects"

  4. Knowledge Effects & Measurements • Logic models apply to each category • "On what impacts would you like to be measured?" • Categories: RESEARCH, TEACHING, SERVICE • Activities include: University & Industry Research, Tech Transfer & Commercialize, Clinical & Testing Services, Knowledge Creation/Xfer, Technical & Expert Assist, Classes & Programs, Co-Curricular Service, Public Events & Understanding • Each is mapped along INPUTS → ACTIVITIES → OUTPUTS → OUTCOMES → IMPACTS

  5. BEDI-II • BEDI-II expanded the scope campus-wide • Task force began in January 2009 • January 2010 report

  6. One Important Conclusion • Not always easy to agree on definitions, but common language → shared understanding • INPUTS: resources directed toward the work • ACTIVITIES: the intentional part of a program • OUTPUTS: • Direct products of program activities • Or, the "turnstile" that counts the number of widgets or people coming out of the activity

  7. Developing Shared Understanding • OUTCOMES: changes in participants' behavior • Usually take time to materialize • Key Idea: likely best measured from the point of view of the beneficiary (whether an individual, organization, or community), not the provider • IMPACTS: • Intended or unintended changes to the system • Broadly sweeping changes in monetized economic conditions, natural resources, infrastructure, empowerment, or quality of life
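The five-term vocabulary defined on slides 6 and 7 could be captured in a small data structure. A minimal sketch in Python, assuming nothing about the actual BEDI tool (the class and field names are illustrative, not part of the framework):

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """One program's logic model, using the BEDI vocabulary."""
    inputs: list[str] = field(default_factory=list)      # resources directed toward the work
    activities: list[str] = field(default_factory=list)  # the intentional part of the program
    outputs: list[str] = field(default_factory=list)     # direct products ("turnstile" counts)
    outcomes: list[str] = field(default_factory=list)    # changes in participants' behavior
    impacts: list[str] = field(default_factory=list)     # intended/unintended changes to the system

# Hypothetical example: a clinic's logic model, described from the
# beneficiary's point of view where possible.
clinic = LogicModel(
    activities=["emergency consultations"],
    outputs=["number of animals treated"],
    outcomes=["improved survival rates reported by owners"],
)
```

Keeping the five categories as separate fields enforces the shared language the task force recommends: a count of attendees can only ever be recorded as an output, never mislabeled as an outcome.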

  8. BEDI-II Process, Findings, & Recommendations

  9. BEDI-II Process Analysis • How does your unit's program(s) make NC a better place? • How do you know? • What is the process for gathering information about your program's impacts? • How do you inform others about the impact of what you do? • What would it take or what would you need to help improve your ability to collect and report on the impact of your program(s)? • c. 20 interviews with leaders of CILs • Animal and Poultry Waste Management Center • Center for Transportation & the Environment • Small Business & Technology Development Center • Biomanufacturing Training & Education Center (BTEC) • Center for Urban Affairs & Community Services • Institute for Emerging Issues • CALS Cooperative Extension • Friday Institute for Educational Innovation • Water Resources Research Institute • Institute for Transportation Research & Education • NC Solar Center • Veterinary Medical Services, CVM • NC Space Grant • NC Sea Grant • Center for Marine Sciences & Technology

  10. Process Analysis Findings • No systematic process for measuring impact across the university • Measurements typically tracked outputs such as number of attendees, number of classes, etc. • Impact information tracked for national contracts or grants tended to coincide with funding agencies' impact tracking • Short-term success stories varied in their dissemination • Many long-term quantifiable impacts are not captured • Program leaders reported a lack of time, expertise, and resources to track impacts • Some units requested assistance in developing evaluation instruments

  11. Nine Pilot Projects • Pilot projects conducted with nine units • Developed comprehensive logic models tailored to their needs • Outcome ranges: short-term, medium-term, long-term • Units: Tree Improvement Program, ENCORE, Non-Woven Institute, TEC, Plant and Animal Disease Clinic, Economic Development Partnership, The Science House, Small Animal Emergency Clinic, State Climate Office

  12. BEDI-II Recommendations 1. Institute a process by which we can account for engagement accomplishments, program outcomes, and societal impacts, from the point of view of recipients/beneficiaries 2. Administrative units should use the BEDI framework of terminology and metrics for consistency across campus • Need for common language & shared understanding

  13. BEDI-II Recommendations 3a. Adopt the BEDI framework as part of NCSU annual reporting, aligning research and outreach activities and impacts with the university's goals and focus areas 3b. As UNC-Tomorrow progresses, make the BEDI framework and logic model available to other universities in the system

  14. BEDI-II Recommendations 4. Encourage departments, faculty, and staff to use the logic model and provide feedback according to it: encourage reporting outcomes of engagement activities (and, if known, impacts) within a reasonable time after project completion 5. Develop and fund a centralized evaluation support office to: form a critical mass of expertise to apply the logic model, help units develop customized models and measurements, and conduct surveys of recipients/beneficiaries

  15. BEDI-II Evaluation Tool

  16. What Can It Do For You? • Collect objective data to support anecdotes about department, unit, college, and university contributions • Provide comprehensive sets of data for annual reporting • Include stratified data related to individual projects . . . especially useful for grants and contracts

  17. Tool Development • Based on the highly successful economic impact tool used by the Industrial Extension Service to comply with National Institute of Standards and Technology requirements • Recognized throughout the Manufacturing Extension Partnership • Strictly based on monetized impacts • Expanded the tool to include non-monetized impacts in several categories • Eventual version will be web-accessible

  18. Measurements, by Impact Domain • Economic ("Monetized Impacts") • Jobs • Bottom-line growth • Improved Infrastructure and Built Resources • Improvement in ASCE report card • Enhanced Natural Resources • Reduction in carbon footprint • Quality of Life • Quality of Life Index

  19. Measurements, by Impact Domain • Human and Social Empowerment • More educated citizens • More involved citizens • More engaged citizens • Impact on public policy • Improved leadership capacity

  20. Measurement Techniques • Maps link outputs, outcomes, and impacts to measurements • Activities (Conferences, Technical Assistance, Training, Research) feed Outputs 1–2, Outcomes 1–4, and the Impact • Each maps to a measurement: Output 1 → Measure 1, Output 2 → Measure 2, Outcome 1 → Measure 3, Outcome 2 → Measure 4, Outcome 3 → Measure 5, Outcome 4 → Measure 6, Impact → Measure 7
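The activity-to-measure map on this slide amounts to a nested lookup table. A hypothetical sketch (the activity names and the numbered outputs/outcomes/measures are the slide's placeholders, not real data):

```python
# Map each activity to its outputs/outcomes/impact, and each of those
# to the measurement that quantifies it (placeholder names throughout).
measure_map = {
    "Conferences":          {"Output 1": "Measure 1", "Output 2": "Measure 2"},
    "Technical Assistance": {"Outcome 1": "Measure 3", "Outcome 2": "Measure 4"},
    "Training":             {"Outcome 3": "Measure 5"},
    "Research":             {"Outcome 4": "Measure 6", "Impact": "Measure 7"},
}

def measures_for(activity: str) -> list[str]:
    """Return every measurement linked to an activity (empty if unmapped)."""
    return list(measure_map.get(activity, {}).values())
```

A table like this makes the link explicit: given any activity a unit runs, the tool can enumerate exactly which measurements it owes data for.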

  21. Reconciling Data Needs • Emphasis on data that are really needed: "have to have" vs. "nice to have" • Important since data collection timescales can vary widely • Faculty data collection cycles start over by semester or academic year • Research data collection cycles can vary based on funding source & reporting requirements

  22. Tool Use • Pilot projects validated approach • Using logic model framework, develop potential measurements for inputs, activities, outputs, outcomes, and impact • Identified measurements become questions in evaluation tool • Throughout program, refine quantified answers in tool • Generate reports as needed
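The cycle above (identified measurements become questions in the tool, quantified answers are refined throughout the program, reports are generated as needed) can be illustrated in miniature. This is purely hypothetical and is not the BEDI evaluation tool itself:

```python
# Identified measurements become questions in the evaluation tool
# (measurement names are illustrative examples).
measurements = ["number of attendees", "jobs created", "carbon footprint reduction"]
questions = {m: f"What is the current value of '{m}'?" for m in measurements}

# Throughout the program, quantified answers are refined in place.
answers: dict[str, float] = {}
answers["number of attendees"] = 120   # initial count
answers["number of attendees"] = 135   # refined later in the program

# Generate a simple report as needed; unanswered questions stay visible.
report = "\n".join(
    f"{m}: {answers.get(m, 'not yet collected')}" for m in measurements
)
```

Because the report enumerates every identified measurement, gaps in data collection surface automatically instead of silently disappearing.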

  23. Example: Lifelong Learning

  24. Example: Small Animal Emergency

  25. Key Element: Flexibility • Not a "one size fits all" approach • Not a hammer that makes every problem look like a nail • Each department, unit, and college can tailor the tool (i.e., the questions) to match the impact measures they identify

  26. BEDI-II Implementation Strategies

  27. Okay, Great -- Now What? • What is the best way to adopt the recommendations? • "Evolutionary" change -- slowly across the university? • Thorough and orderly, but takes a lot of time • Or "revolutionary" change -- faculty, staff, and departments apply the changes on their own, near simultaneously? • May be faster, but harder to start and direct • Proceeding in a timely, orderly fashion requires direction • A dual approach is best, if possible

  28. Implementing A Cultural Change • Short-, Mid-, & Long-Term Approaches • Spread knowledge of logic models and their use • Deploy consistent, standardized output measures • Search and apply for grants for implementation • Require outcome reporting from beneficiary POV • Resources for units to develop customized models • Identify options for measurement across campus • Simple, consistently used collection system • Central office for university-wide measurements

  29. BEDI-II Small Groups Session
