
Summative Evaluation






Presentation Transcript


  1. Summative Evaluation The Evaluation after implementation Dick and Carey: Chapter 12

  2. Involves _______ data • Collecting • Analyzing • Summarizing

  3. For the purpose of • Giving decision makers information on the effectiveness and efficiency of instruction

  4. Effectiveness of Content (should be criterion referenced) • Does the instruction solve the problem? • Was the criterion established prior to evaluation? • Was the criterion established in conjunction with the needs assessment?
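
  For illustration, a minimal Python sketch of what a criterion-referenced effectiveness check looks like in practice; the scores and the mastery cutoff here are hypothetical, not taken from the chapter:

      # Hypothetical criterion-referenced check: what fraction of learners
      # met a mastery criterion that was fixed before the evaluation?
      posttest_scores = [92, 78, 85, 64, 88, 71, 95, 80]  # made-up data
      criterion = 80  # cutoff set in advance, e.g. during needs assessment

      n_mastered = sum(score >= criterion for score in posttest_scores)
      mastery_rate = n_mastered / len(posttest_scores)
      print(f"{n_mastered}/{len(posttest_scores)} learners met the criterion "
            f"({mastery_rate:.0%})")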

  5. Specifically • Did the learners achieve the objectives? • How did the learners feel about the instruction? • What were the costs? • How much time did it take? • Was the instruction implemented as designed? • What unexpected outcomes were there?

  6. Alternative Approaches to Summative Evaluation • Objectivism • Subjectivism

  7. Objectivism • Based on empiricism • Answers questions on the basis of observed data • Goal-based and replicable; uses the scientific method

  8. Subjectivism • Employs expert judgment • Includes qualitative methods • observation and interviews (evaluate content) • “Goal Free” • evaluators are deliberately kept unaware of the goals, so they judge what the instruction actually does rather than what it intends

  9. Objectivism (limitations) • Examines only a limited number of factors • May miss critical effects

  10. Subjectivism (limitations) • Results are not replicable • May be biased by the idiosyncratic experiences and perspectives of the people who conduct the evaluation • May miss critical effects

  11. Designer Role in Summative Evaluation? Somewhat controversial • A designer evaluating his or her own instruction may lack the objectivity of an external evaluator

  12. Timing of Summative Evaluation? Not in the first cycle • The instruction should first be revised through formative evaluation (see the diagram on the next slide)

  13. Summary Diagram • Formative: Design Reviews → Expert Reviews → One-to-one Eval. → Small Group Eval. → Field Trials → Ongoing Eval. • Summative: Determine Goals of the Evaluation → Select Orientation → Select Design → Design or Select Evaluation Measures → Collect Data → Analyze Data → Report Results

  14. Determine Goals of the Evaluation • What decisions must be made as a result of the evaluation? • What questions, if answered, will best inform those decisions? • How practical is it to gather data to answer a question? • Who wants the answer to a question? • How much uncertainty is associated with the answer?

  15. Select Orientation of Evaluation • Goal-based or goal-free? • A middle ground? • Is a quantitative or qualitative approach appropriate? • An experimental or naturalistic approach?

  16. Select Design of Evaluation • Describes what data to collect • When the data will be collected • And under what conditions • Issues to consider: • How much confidence must we have that the instruction caused the learning? (internal validity) • How important is generalizability? (external validity) • How much control do we have over the instructional situation?

  17. Design or Select Evaluation Measures • Payoff outcomes • Is the problem solved? • Costs avoided • Increased outputs • Improved quality • Improved efficiency

  18. Design or Select Evaluation Measures (2) • Learning Outcomes • Reuse the instruments you’ve already developed in the summative evaluation • But measure the entire program

  19. Design or Select Evaluation Measures (3) • Attitudes • Rarely the primary payoff goals • Ask about learner attitudes toward • learning • instructional materials • subject matter • Indices of appeal • attention, likeableness, interest, relevance, familiarity, credibility, acceptability, and excitement
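
  One plausible way to summarize such attitude data, as a hedged Python sketch; the survey items and the 1–5 rating scale are assumptions, not from the chapter:

      # Hypothetical attitude summary: mean 1-5 Likert rating per appeal index.
      responses = {
          "relevance":   [4, 5, 3, 4],
          "interest":    [3, 4, 4, 5],
          "credibility": [5, 4, 4, 4],
      }
      for index, ratings in responses.items():
          print(f"{index}: mean {sum(ratings) / len(ratings):.1f} / 5")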

  20. Design or Select Evaluation Measures (4) • Level of Implementation • degree to which the instruction was implemented • Costs • Cost-feasibility • Cost-effectiveness
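
  Cost-effectiveness is commonly reported as cost per unit of outcome. A hedged Python sketch of that arithmetic; the programs, costs, and score gains are invented for illustration:

      # Hypothetical cost-effectiveness comparison: cost per point of
      # average score gain for two competing instructional programs.
      programs = {
          "current course": {"cost": 40_000, "avg_gain": 12.0},
          "revised course": {"cost": 55_000, "avg_gain": 20.0},
      }
      for name, p in programs.items():
          ratio = p["cost"] / p["avg_gain"]  # dollars per point gained
          print(f"{name}: ${ratio:,.0f} per point of gain")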

  21. Alternative Designs • Instruction then posttest • Pretest then instruction then posttest
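
  To make the two designs concrete, a small Python sketch on invented data: the posttest-only design can report only final performance, while the pretest-posttest design can also report gain:

      # Hypothetical learner data; the pretest exists only in the second design.
      pretest  = [55, 60, 48, 70]
      posttest = [82, 85, 75, 90]

      # Design 1: instruction -> posttest (final performance only)
      print("mean posttest:", sum(posttest) / len(posttest))

      # Design 2: pretest -> instruction -> posttest (can also report gain)
      gains = [post - pre for pre, post in zip(pretest, posttest)]
      print("mean gain:", sum(gains) / len(gains))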

  22. The Report • Summary • Background • Needs assessment, audience, context, program description • Description of evaluation study • Purpose of evaluation, evaluation design, outcomes measured, implementation measures, cost-effectiveness info., analysis of unintentional outcomes

  23. The Report (continued) • Results • Outcomes, implementation, cost-effectiveness info., unintentional outcomes • Discussion • causal relationship between program & results • Limitations of the study • Conclusions & Recommendations

  24. Summary • Summative evaluation occurs after implementation • Limitations of objectivist and subjectivist approaches • What to include in the report
