
Meeting an Evaluation Challenge: Identifying and Overcoming Methodological Problems



Meeting an Evaluation Challenge: Identifying and Overcoming Methodological Problems

An Image/Link below is provided (as is) to download presentation Download Policy: Content on the Website is provided to you AS IS for your information and personal use and may not be sold / licensed / shared on other websites without getting consent from its author. Content is provided to you AS IS for your information and personal use only. Download presentation by click this link. While downloading, if for some reason you are not able to download a presentation, the publisher may have deleted the file from their server. During download, if you can't get a presentation, the file might be deleted by the publisher.

E N D

Presentation Transcript


1. Meeting an Evaluation Challenge: Identifying and Overcoming Methodological Problems
Joint CES/AEA Evaluation 2005, “Crossing Borders, Crossing Boundaries”
RTD TIG, Think Tank Session
Toronto, Canada, October 26, 2005
Connie K.N. Chang, Supervisory Economist, Advanced Technology Program, NIST, U.S. Department of Commerce, connie.chang@nist.gov
Rosalie T. Ruegg, Managing Director, TIA Consulting, Inc., ruegg@ec.rr.com

2. The 3rd in a series of Think Tanks on barriers to evaluation
• 2003: Identification of six categories of barriers to evaluation
• 2004: Focus on institutional and cultural barriers (feedback loops)
• 2005: Focus on methodological barriers

3. Review of ’03 Think Tank: Main Finding
Striking commonality of evaluation barriers among programs and across countries

4. Review of ’03 Think Tank, cont’d
Six categories of barriers identified:
• Institutional/cultural
• Methodological
• Resources
• Communications
• Measurement/data
• Conflicting stakeholder agendas

5. Review of ’03 Think Tank, cont’d
Barriers were said to impede:
• Demand for evaluation
• Planning and conducting evaluation
• Understanding of evaluation studies
• Acceptance and interpretation of findings
• Use of results to inform:
  • program management
  • budgetary decisions
  • public policy

6. 2005 Think Tank
Six categories of barriers identified:
• Institutional/cultural
• Methodological (today’s focus)
• Resources
• Communications
• Measurement/data
• Conflicting stakeholder agendas

7. Methodological issues identified in 2003
• Problems in measuring the value of knowledge creation
• Lack of standardization, leading to comparability problems
• Inability to replicate studies
• Difficulties in apportioning benefits to complex contributing factors

8. Methodological issues identified in 2003, cont’d
• Difficulty in defining success against multiple program goals
• Selecting the appropriate method for a given purpose
• Reliability and acceptance of new methods
• Adherence of studies to best practices

9. Additional methodological issues?
Potential new issues to consider:
• Problems in measuring commercialization and its economic value
• What are best practices in methodology?
• What constitutes market failure or market imperfection?
• What is the additionality effect of public funding?
• What is the state of the art in evaluating R&D portfolios and collections of R&D portfolios?

10. Methodological issues identified: today’s focus
• Problems in measuring the value of knowledge creation
• Lack of standardization leading to comparability problems
• Inability to replicate studies
• Apportioning benefits to complex contributing factors
• Defining success against multiple program goals
• Selecting the appropriate methods for a given use
• Reliability and acceptance of new methods
• Adherence of studies to best practices

11. Measuring the value of knowledge creation
How do we define value?
• By the quality or significance of the knowledge created?
• By the economic value of the knowledge created?

12. Measuring the value of knowledge creation, cont’d
The choice of methodology depends on the definition of value:
• For value in terms of quality: use bibliometrics (publication counts adjusted for quality variation among journals, citation analysis, content analysis), research value mapping, network analysis, and third-party recognition through awards for scientific merit
• For value in terms of economics: follow multiple paths of knowledge flow to downstream users, using indicator metrics and historical tracing supported by citation analysis, then apply benefit-cost analysis with a “snowball” technique (a minimal benefit-cost sketch follows below)
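To make the benefit-cost step concrete, here is a minimal Python sketch (not from the slides) that discounts benefit and cost streams to present value and reports net present value and a benefit-cost ratio. The cash flows and the 7% real discount rate are hypothetical, chosen for illustration only; an actual study would take these streams from the tracing and survey work described above.

```python
def present_value(cash_flows, rate):
    """Discount a list of (year, amount) pairs back to year 0."""
    return sum(amount / (1 + rate) ** year for year, amount in cash_flows)

# Hypothetical program costs and traced downstream benefits, in $M by year
costs = [(0, 10.0), (1, 8.0), (2, 6.0)]
benefits = [(3, 5.0), (4, 12.0), (5, 20.0), (6, 25.0)]

rate = 0.07  # assumed real discount rate (illustrative only)
pv_costs = present_value(costs, rate)
pv_benefits = present_value(benefits, rate)

print(f"PV of costs        : {pv_costs:6.2f} $M")
print(f"PV of benefits     : {pv_benefits:6.2f} $M")
print(f"Net present value  : {pv_benefits - pv_costs:6.2f} $M")
print(f"Benefit-cost ratio : {pv_benefits / pv_costs:6.2f}")
```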

13. Lack of standardization, leading to comparability problems
Standardization to whose standards?
• The program practitioner’s? The research field’s? The country’s?
Standardization of what methodological terms?
• Performance indicators? Rate of return on investment? Net present values? Social vs. public rate of return? Underlying assumptions? Others?
How do we define comparability?
• Across studies? Within a program? Across programs? Across national borders?
(The sketch below illustrates how unstated discount-rate and time-horizon assumptions can make reported net present values non-comparable.)
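As a hypothetical illustration of the comparability problem, this short sketch evaluates the same constant benefit stream under two studies’ differing assumptions about discount rate and time horizon; the figures and the “Study A”/“Study B” labels are invented for illustration, not drawn from any actual evaluation.

```python
def npv(annual_benefit, rate, horizon):
    """NPV of a constant annual benefit received for `horizon` years."""
    return sum(annual_benefit / (1 + rate) ** t for t in range(1, horizon + 1))

annual_benefit = 10.0  # $M per year, hypothetical

# Two hypothetical studies of the same program, with different unstated assumptions
for label, rate, horizon in [("Study A", 0.03, 20), ("Study B", 0.10, 10)]:
    value = npv(annual_benefit, rate, horizon)
    print(f"{label}: rate={rate:.0%}, horizon={horizon} yrs -> NPV = {value:6.1f} $M")
```

The two reported NPVs differ substantially even though the underlying benefit stream is identical, which is why standardizing (or at least disclosing) such assumptions matters for comparability.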

14. Lack of standardization, leading to comparability problems, cont’d
Some ways to promote standardization for improved comparability:
• Providing a standard format and guidelines for similar studies (e.g., the NAS/NRC matrix and guidance for evaluating DOE EE & FE R&D projects)
• Commissioning clusters of studies to be performed within a common framework (e.g., ATP’s cluster studies of component-based software projects, tissue engineering projects, and composite manufacturing technologies)
• Other?

15. Ability to replicate studies
What do we mean by the ability to replicate?
• Simple meaning: as in science, allow others to run the experiment using the same data to see whether the same results are obtained
• Another meaning: re-do studies, testing for robustness by varying key data and assumptions in a sensitivity analysis, and possibly refining the approach (see the sensitivity sketch below)
• Yet another meaning: confirm prospective benefit estimates with later retrospective cost-benefit studies
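As an illustration of the second meaning of replication, the following sketch re-runs a simple NPV calculation while varying two assumptions: the discount rate and a scaling factor on the benefit stream. All values are hypothetical placeholders, not results from any study.

```python
import itertools

def npv(benefits_by_year, costs_by_year, rate):
    """Net present value of benefit and cost streams keyed by year."""
    pv = lambda flows: sum(v / (1 + rate) ** t for t, v in flows.items())
    return pv(benefits_by_year) - pv(costs_by_year)

base_benefits = {3: 5.0, 4: 12.0, 5: 20.0}  # $M, hypothetical
base_costs = {0: 10.0, 1: 8.0}              # $M, hypothetical

# Re-run the calculation while varying the discount rate and scaling benefits +/- 25%
for rate, scale in itertools.product([0.03, 0.07, 0.10], [0.75, 1.00, 1.25]):
    scaled = {t: v * scale for t, v in base_benefits.items()}
    print(f"rate={rate:.0%}, benefit scale={scale:.2f} -> "
          f"NPV = {npv(scaled, base_costs, rate):6.2f} $M")
```

Publishing the data and assumptions needed to run this kind of re-analysis is one way to keep a study out of the “black box” category discussed on the next slide.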

16. Ability to replicate studies, cont’d
• The key is to avoid a “black box”
• Transparency and clarity of approach are essential

17. Summary
Six categories of barriers identified (2003):
• Institutional/cultural (2004 focus)
• Methodological (2005 focus)
• Resources
• Communications
• Measurement/data
• Conflicting stakeholder agendas

18. Contact information
Connie K.N. Chang, Supervisory Economist, Advanced Technology Program, connie.chang@nist.gov, www.atp.nist.gov
Rosalie T. Ruegg, Managing Director, TIA Consulting, Inc., ruegg@ec.rr.com
