
Summative Evaluation


Presentation Transcript


  1. Summative Evaluation • The evaluation after implementation • Instructional Design: Evaluation Phase

  2. Outcomes: by the end of this session you should be able to: • Describe the purpose of summative evaluation • Design summative evaluation and decisions resulting from each phase • Design summative evaluation for comparing alternative sets of candidate instructional materials • Contrast formative and summative evaluation by purpose and design

  3. Dick & Carey Design Model

  4. Involves _______ data • Collecting • Analyzing • Summarizing

  5. For the purpose of • Giving decision makers information on the effectiveness and efficiency of instruction

  6. Effectiveness of Content (should be criterion referenced) • Does the instruction solve the problem? • Was the criterion established prior to evaluation? • Was the criterion established in conjunction with the needs assessment?
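To make the criterion-referenced point concrete, here is a minimal sketch of checking learners against a mastery threshold fixed before the evaluation. The scores and the 80% criterion are hypothetical, not from the presentation.

```python
# Minimal criterion-referenced check (hypothetical data and criterion).
posttest_scores = {"learner_01": 0.92, "learner_02": 0.78, "learner_03": 0.85}
CRITERION = 0.80  # agreed during the needs assessment, before the evaluation

mastered = {name: score >= CRITERION for name, score in posttest_scores.items()}
mastery_rate = sum(mastered.values()) / len(mastered)

print(f"{mastery_rate:.0%} of learners met the criterion")
```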

  7. Specifically • Did the learners achieve the objectives? • How did the learners feel about the instruction? • What were the costs? • How much time did it take? • Was the instruction implemented as designed? • What unexpected outcomes were there?

  8. Alternative Approaches to Summative Evaluation • Objectivism • Subjectivism

  9. Objectivism • Based on empiricism • Answers questions on the basis of observed data • Goal-based and replicable; uses the scientific method

  10. Subjectivism • Employs expert judgment • Includes qualitative methods • observation and interviews (evaluate content) • “Goal Free” • evaluators are deliberately not told the instructional goals

  11. Objectivism (limitations) • Examines only a limited number of factors • May miss critical effects

  12. Subjectivism (limitations) • Not replicable • Biased by the idiosyncratic experiences and perspectives of the people who do the evaluation • May miss critical effects

  13. Designer Role in Summative Evaluation? Somewhat controversial

  14. Timing of Summative Evaluation? Not in the first cycle

  15. Summary Diagram • Formative: Design Reviews → Expert Reviews → One-to-one Eval. → Small Group Eval. → Field Trials → Ongoing Eval. • Summative: Determine Goals of the Evaluation → Select Orientation → Select Design → Design or Select Evaluation Measures → Collect Data → Analyze Data → Report Results

  16. Determine Goals of the Evaluation • What decisions must be made as a result of the evaluation? • What questions will best inform these decisions? • How practical is it to gather data to answer a question? • Who wants the answer to a question? • How much uncertainty is associated with the answer?

  17. Select Orientation of Evaluation • Goal-based or goal-free? • A middle ground? • Is a quantitative or qualitative approach appropriate? • An experimental or naturalistic approach?

  18. Select Design of Evaluation • Describes what data to collect • When the data will be collected • And under what conditions • Issues to consider: • How much confidence must we have that the instruction caused the learning? (internal validity) • How important is generalizability? (external validity) • How much control do we have over the instructional situation?

  19. Design or Select Evaluation Measures • Payoff outcomes • Is the problem solved? • Costs avoided • Increased outputs • Improved quality • Improved efficiency

  20. Design or Select Evaluation Measures (2) • Learning Outcomes • Use the instruments you’ve already developed for the summative evaluation • But measure the entire program

  21. Design or Select Evaluation Measures (3) • Attitudes • Rarely the primary payoff goals • Ask about learner attitudes toward • learning • instructional materials • subject matter • Indices of appeal • attention, likeableness, interest, relevance, familiarity, credibility, acceptability, and excitement
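As an illustration of how attitude data might be summarized, the sketch below tallies hypothetical 5-point ratings for one appeal index (relevance); the item wording and responses are assumptions, not content from the slides.

```python
from collections import Counter

# Hypothetical 5-point responses to "The materials were relevant to my work".
responses = [5, 4, 4, 3, 5, 2, 4, 5, 4, 3]

print(Counter(responses))                              # frequency of each rating
print(f"mean rating: {sum(responses) / len(responses):.1f}")
```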

  22. Design or Select Evaluation Measures (4) • Level of Implementation • degree to which the instruction was implemented • Costs • Cost-feasibility • Cost-effectiveness
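One common way to read cost-effectiveness is cost per unit of learning gain, so that alternatives can be compared on the same footing. The figures below are invented purely to show the arithmetic.

```python
# Cost per point of learning gain (lower is better); all numbers are hypothetical.
programs = {
    "instructor-led course": {"cost_per_learner": 400.0, "mean_gain": 12.0},
    "self-paced e-learning": {"cost_per_learner": 250.0, "mean_gain": 10.0},
}

for name, p in programs.items():
    ratio = p["cost_per_learner"] / p["mean_gain"]
    print(f"{name}: ${ratio:.2f} per point of gain")
```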

  23. Alternative Designs • Instruction then posttest • Pretest then instruction then posttest
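A brief sketch of what each design can report, using invented scores: the posttest-only design yields the final level of performance, while adding a pretest lets you estimate the gain associated with the instruction.

```python
from statistics import mean

pretest  = [45, 52, 38, 60, 49]   # hypothetical scores before instruction
posttest = [78, 85, 70, 88, 80]   # hypothetical scores after instruction

# Instruction-then-posttest design: report the final level of performance.
print(f"posttest mean: {mean(posttest):.1f}")

# Pretest-instruction-posttest design: report the gain.
gains = [post - pre for pre, post in zip(pretest, posttest)]
print(f"mean gain: {mean(gains):.1f}")
```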

  24. The Report • Summary • Background • Needs assessment, audience, context, program description • Description of evaluation study • Purpose of evaluation, evaluation design, outcomes measured, implementation measures, cost-effectiveness info., analysis of unintentional outcomes

  25. The Report (continued) • Results • Outcomes, implementation, cost-effectiveness info., unintentional outcomes • Discussion • Causal relationship between program & results • Limitations of the study • Conclusions & Recommendations

  26. Summary • Summative evaluation occurs after implementation • Limitations of subjective and objective evaluation • What to include in the report
