EVAL 6000: Foundations of Evaluation

Dr. Chris L. S. Coryn

Kristin A. Hobson

Fall 2012

Agenda

  • Stage Three theories

    • Peter Rossi

  • Use-oriented theories and theorists

    • Utilization-focused evaluation

    • Michael Patton

    • Participatory Evaluation

    • Brad Cousins

  • Questions and discussion

  • Encyclopedia of Evaluation entries

“Evaluation research is more than the application of methods…it is also a political and managerial activity, an input into…policy decisions and allocations”

— Peter H. Rossi

Biographical Sketch

  • Born in 1921 in New York City

  • Ph.D. in Sociology, Columbia University

  • B.S. in Sociology, City College

  • Professor Emeritus of Sociology at the University of Massachusetts; held positions at Harvard, the University of Chicago, and Johns Hopkins University

  • Published numerous books, research monographs and articles

  • Led many high-stakes national-level evaluations

Rossi’s View of Evaluation

  • Influenced by Campbell, Cronbach, and Scriven

  • Major function of social research in public policy formulation and change is to evaluate the effectiveness of public programs

  • Emphasis on empirically testing social theories as part of program evaluation

Rossi’s Influence

  • Extensive and diverse

  • Sociological (e.g., books on life histories of American families)

  • Methodological (e.g., survey research)

  • Primarily evaluation theory and methodology

Rossi’s Major Contributions

  • Tailored evaluation

  • Comprehensive evaluation

  • Theory-driven evaluation

  • Demystification

  • The “good enough” rule

  • The metallic and plastic laws of evaluation

Rossi’s Theory of Social Programming

  • Social interventions are conservative and incremental

  • Central task is to design programs that serve the disadvantaged well

  • Recognizes the political and economic constraints placed on social programs

Rossi’s Theory of Knowledge Construction

  • Both realist and empiricist in orientation

  • Simultaneously emphasizes fallibilism and multiplism

  • Questions the philosophical warrants for a singular epistemology, and questions the legitimacy and value of epistemology more generally

Rossi’s Theory of Valuing

  • Similar to Scriven in many respects

  • Social need is a crucial criterion for value claims

  • Integrates both prescriptive and descriptive theories (though never clear in explication of how to integrate)

Rossi’s Theory of Knowledge Use

  • Distinguishes between instrumental, conceptual, and persuasive uses

  • Not clear about contingencies to guide choices to facilitate types of use

  • Demystification (e.g., the nature of social problems and their amelioration) has been criticized for being too “scientistic”

Rossi’s Theory of Evaluation Practice

  • Clearly describes trade-offs and priorities depending on various circumstances (e.g., innovations, modifications, established programs)

  • Recognizes constraints associated with trade-offs and priorities (e.g., comprehensive versus tailored evaluations)

  • See Table 9.1, p. 383

Evaluation Theory Tree

Use-Oriented Theorists




“This class of theories [use] is concerned with designing evaluations that are intended to inform decision making…to ensure that evaluation results have a direct impact on decision making and organizational change”

— Marvin C. Alkin

Use-Oriented Theories

  • Originated from decision-oriented theories

  • Decision-oriented theorists emphasize evaluation as assisting key decision makers in making informed decisions

  • Evaluations should be designed to ensure direct impact on decision making and organizational change

“Evaluations should be judged by their utility and actual use…[and]… evaluators should facilitate the evaluation process and design any evaluation with careful consideration of how everything that is done, from beginning to end, will affect use”

— Michael Q. Patton

Utilization-Focused Evaluation

  • Explicitly geared to ensure that evaluations make an impact and are used

  • Evaluation is guided in collaboration with a targeted group of priority users

Utilization-Focused Evaluation

  • All aspects are chosen and applied to help targeted users obtain and apply evaluation findings to their intended uses, and to maximize the likelihood that they will do so

  • In the interest of getting findings used, draws on any legitimate evaluation approach

Situational Analysis

  • What decisions, if any, are the evaluation findings expected to influence?

  • When will decisions be made? By whom? When, then, must the evaluation findings be presented to be timely and influential?

  • What is at stake in the decisions? For whom? What controversies or issues surround the decision?

  • What is the history and context of the decision-making process?

  • What other factors (values, politics, personalities, promises already made) will affect the decision making?

Situational Analysis

  • How much influence do you expect the evaluation to have—realistically?

  • To what extent has the outcome of the decision already been determined?

  • What data and findings are needed to support decision making?

  • What needs to be done to achieve that level of influence?

  • How will we know afterward if the evaluation was used as intended?

“[Practical participatory evaluation]…seeks to understand programs with the expressed intention of informing and improving their implementation”

— J. Bradley Cousins

Participatory Evaluation

  • Evaluator works collaboratively in partnership with a select group of intended users

  • The evaluator’s role is to provide technical support and training, and to assure and maintain quality control

  • Involves a broad group of stakeholder participants

Participatory Evaluation

  • Modified from more limited stakeholder-based approaches

  • Stakeholders are engaged in the entire evaluation process (e.g., design, data collection, analysis, reporting, application of findings)

  • Assumes that involvement will increase buy-in, credibility, and use

“[The CIPP model encourages evaluators to engage a]…representative stakeholder review panel to help define the evaluation questions, shape evaluation plans, review draft reports and disseminate findings”

— Daniel L. Stufflebeam

Improvement- and Accountability-Oriented Approaches

  • Expansive, seeking comprehensiveness in considering the full range of questions and criteria needed to assess a program

  • Often employ the assessed needs of a program’s stakeholders as the foundational criteria for assessing a program

Improvement- and Accountability-Oriented Approaches

  • Usually reference all pertinent technical and economic criteria for judging the merit or quality of programs

  • Examine all relevant outcomes, not just those keyed to program objectives

  • Use multiple qualitative and quantitative assessment methods to provide cross-checks on findings

Decision- and Accountability-Oriented Studies

  • Emphasizes that program evaluation should be used proactively to help improve a program as well as retrospectively to judge value

  • Philosophical underpinnings include an objectivist orientation to finding best answers to context-limited questions and subscription to the principles of a well-functioning democratic society, especially human rights, an enlightened citizenry, equity, excellence, conservation, probity, and accountability

Decision- and Accountability-Oriented Studies

  • Serves stakeholders by:

    • Engaging them in focusing the evaluation and assessing draft evaluation reports

    • Addressing their most important questions plus those required to assess the program’s value

    • Providing timely, relevant information to assist decision making

    • Producing an accountability record

    • Issuing needed summative evaluation reports

  • This approach is best represented by Stufflebeam’s context, input, process, and product (CIPP) model for evaluation

CIPP Model


Encyclopedia Entries

  • CIPP Model (Context, Input, Process, Product)

  • Cost-Benefit Analysis

  • Cost-Effectiveness

  • Goal

  • Indicators

  • Meta-Analysis

  • Monitoring

  • Needs Assessment

  • Objectives

  • Objectives-Based Evaluation

  • Outcomes

  • Outputs

  • Success Case Method

  • Tyler, Ralph W.
