Review: Alternative Assessments I

Presentation Transcript



Review: Alternative Assessments I

  • Describe the two epistemologies in ch. 3 (o/s)

  • Compare the two principles for assigning value (util/int-pl)

  • Identify pros/cons of the two evaluation approaches we discussed last week



Alternative Approaches to Evaluation II

Dr. Suzan Ayers

Western Michigan University

(courtesy of Dr. Mary Schutten)



Consumer-Oriented Approach

  • Typically a summative evaluation approach

  • This approach advocates consumer education and independent reviews of products

  • Scriven’s contributions based on groundswell of federally funded educational programs in 1960s

    • Differentiation between formative/summative eval.



Consumer-Oriented Checklist (Scriven, 1974, p. 102)

  • Need

  • Market

  • Performance

    • True field trials [tests in a “real” setting]

    • True consumer tests [tests with real users]

    • Critical comparisons [comparative data]

    • Long term [effects over the long term]

    • Side effects [unintended outcomes]

    • Process [product use fits its descriptions]

    • Causation [experimental study]

    • Statistical significance [supports product effectiveness]

    • Educational significance



  • Cost effectiveness

  • Extended support [in service training]

    Producer’s efforts to meet these standards improve product effectiveness

  • Key Evaluation Checklist developed to evaluate program evaluations

  • Educational Products Information Exchange (EPIE): independent product-review service

  • Curriculum Materials Analysis System (CMAS) checklist: describe the product, analyze its rationale, consider antecedent conditions, content, and instructional theory & teaching strategies, and form overall judgments



Uses of Consumer-Oriented Evaluation Approach

  • Typically used by government agencies and consumer advocates (e.g., EPIE)

  • What does one need to know about a product before deciding whether to adopt or install it?

    • Process information

    • Content information

    • Transportability information

    • Effectiveness information



Consumer-Oriented Pros/Cons

  • Strengths: provides valuable information to those without time to study products themselves; advances consumers’ knowledge of appropriate criteria for selecting programs/products

  • Weaknesses: can increase product cost; stringent testing may “crimp” creativity; local initiative lessened because of dependency on outside consumer services



Consumer-Oriented Qs

  • What educational products do you use?

  • How are purchasing decisions made?

  • What criteria seem to be most important in the selection process?

  • What other criteria for selection does this approach suggest to you?



Expertise-Oriented Approach

  • Depends primarily upon professional expertise to judge an institution, program, product, or activity

  • This is the first view that relies heavily on subjective expertise as the key evaluation tool

  • Examples: doctoral exams, board reviews, accreditation, reappointment/tenure reviews, etc.



Expertise-Oriented Types

  • Formal Review Systems (accreditation)

    • Existing structure, standards exist, set review schedule, experts, status usually affected by results

  • Informal Review Systems (graduate student committee)

    • Existing structure, no standards, infrequent schedule, experts, status usually affected

  • Ad hoc panel review (journal reviews)

    • Multiple opinions, status sometimes affected

  • Ad hoc individual review (consultant)

    • Status sometimes affected



Expertise-Oriented Pros/Cons

  • Strengths: those well-versed make decisions, standards are set, encourage improvement through self-study

  • Weaknesses: whose standards? (personal bias), expertise credentials, can this approach be used with issues of classroom life, texts, and other evaluation objects or only with the bigger institutional questions?



Expertise-Oriented Qs

  • What outsiders review your program or organization?

  • How expert are they in your program’s context, process, and outcomes?

  • What are characteristics of the most/least helpful reviewers? (list brainstorms on board)



Participant-Oriented Approach

  • Heretofore, the human element was missing from program evaluation

  • This approach involves all relevant interests in the evaluation

  • This approach encourages support for representation of marginalized, oppressed and/or powerless parties



Participant-Oriented Characteristics

  • Depend on inductive reasoning [observe, discover, understand]

  • Use multiple data sources [subjective, objective, quant, qual]

  • Do not follow a standard plan [process evolves as participants gain experience in the activity]

  • Record multiple rather than single realities [e.g., focus groups]



Participant-Oriented Examples

  • Stake’s Countenance Framework

    • Description and judgment

  • Responsive Evaluation

    • Addressing stakeholders’ concerns/issues

    • Case studies describe participants’ behaviors

  • Naturalistic Evaluation

    • Extensive observations, interviews, documents and unobtrusive measures serve as both data and reporting techniques

    • Credibility vs. internal validity (x-checking, triangulation)

    • Applicability vs. external validity (thick descriptions)

    • Auditability vs. reliability (consistency of results)

    • Confirmability vs. objectivity (neutrality of evaluation)



  • Participatory Evaluation

    • Collaboration between evaluators & key organizational personnel for practical problem solving

  • Utilization-Focused Evaluation

    • Base all decisions on how everything will affect use

  • Empowerment Evaluation

    • Advocates for society’s disenfranchised, voiceless minorities

    • Advantages: training, facilitation, advocacy, illumination, liberation

    • Unclear how this approach is a unique participant-oriented approach

    • Some in the evaluation field argue that it is not even ‘evaluation’



Participant-Oriented Pros/Cons

  • Strengths: emphasizes human element, gain new insights and theories, flexibility, attention to contextual variables, encourages multiple data collection methods, provides rich, persuasive information, establishes dialogue with and empowers quiet, powerless stakeholders

  • Weaknesses: too complex for practitioners (more for theorists), political element, subjective, “loose” evaluations, labor intensive which limits number of cases studied, cost, potential for evaluators to lose objectivity



Participant-Oriented Qs

  • What current program are you involved in that could benefit from this type of evaluation?

  • Who are the stakeholders?



Alternative Approaches Summary

Five cautions about the collective evaluation conceptions presented so far

1) Writings in evaluation are not models/theories

  • Evaluation is a transdiscipline (not yet a distinct discipline)

  • “Theoretical” underpinnings in evaluation lack important characteristics of most theories

  • Information shared is: sets of categories, lists of things to think about, descriptions, etc.



2) “Discipleship” to a single ‘model’ is dangerous

  • Using different approaches as heuristic tools, each appropriate to the situation, is recommended

3) Calls to consolidate evaluation approaches into a single model are unwise

  • These efforts based in attempts to simplify evaluation

  • Approaches are based on widely divergent philosophical assumptions

  • Development of a single omnibus model would prematurely close a divergent phase in the field

  • Just because we can does not mean we should; would evaluation be enriched by synthesizing the multitude of approaches into a few guidelines?



4) The choice of an evaluation approach is not empirically based

  • Single most important impediment to development of more adequate theory and models in evaluation

5) Negative metaphors underlying some approaches can cause negative side effects

  • Metaphors shared in ch. 3 are predicated on negative assumptions in two categories:

    • Tacitly assume something is wrong in system being evaluated (short-sighted indictment)

    • Based on assumptions that people will lie, evade Qs or withhold information as a matter of course



Alternative Approaches’ Contributions

Approaches shared in ch. 4-8 influence evaluation practices in important ways

  • Help evaluators think diversely

  • Present & provoke new ideas/techniques

  • Serve as mental checklists of things to consider, remember, or worry about

  • Alternative approaches’ heuristic value is very high, but their prescriptive value is less so

  • Avoid mixing evaluation’s philosophically incompatible ‘oil/water’ approaches; eclectic use of alternative approaches can be advantageous to high-quality evaluation practices

    Table 9.1



Exercise

  • Clearly identify your evaluand

    • Is it a program, policy, product, service, other?

    • Who does it (or should it) serve?

    • Who is in charge of it?

  • Find a partner and explain what you have written

    • Does it make sense?

    • Does it match what you wrote?

    • Does it avoid specifying criteria?

    • Is it simple enough?

    • Did you avoid commenting on the merits of the evaluand?

