
  1. EDIT 6170 Instructional Design, June 27, 2003. Dr. Lloyd Rieber, Michael Orey, Michele Estes, University of Georgia, Department of Instructional Technology. If you can hear me, click. If you cannot hear me, click and refer to the Online Survival Guide for help.

  2. 4 Topics • Mid-semester review • Brief review of key topics • Team responsibilities • More examples of instructional development (if there is time at the end) • Formative evaluation • Review • Revising instructional materials • Summative evaluation

  3. HorizonLive is Available • Feel free to use your breakout room for team meetings. • You also have a WebCT chat room and bulletin board, and you are free to use phones and other meeting places. You do need to meet, and we provide several options for holding team meetings.

  4. Updates & Reminders • Design Teams • WebCT topic areas created for each team • Weekend time on HL? • WWILD Team (submit 5, review 5) due today • Check out teams’ progress reports #2

  5. Progress Report #3: Due June 30 (Monday) • You need to have completed the first draft of your needs assessment, course design, & unit design • Provide the first draft of your progress report as a link to the Word document off of your team page • I assume someone on the team has the skill to upload the file and knows how to determine the URL.

  6. Mid-Semester Course Review • In class: polls and discussion • Online: survey about team formation

  7. Polls • Daily class organization • IDAs • Giving buddy feedback • Getting buddy feedback • The way teams were formed • Course support for team work • Prepared to do team project • Self-directed media-based learning experience • WWILD Team • HL Classroom • WebCT • Communication between instructor/students • Dick, Carey, & Carey textbook

  8. Open Discussion • Daily class organization • IDAs • Giving buddy feedback • Getting buddy feedback • The way teams were formed • Course support for team work • Prepared to do team project • Self-directed media-based learning experience • WWILD Team • HL Classroom • WebCT • Communication between instructor/students • Dick, Carey, & Carey textbook • Other?

  9. Things to keep | Ways to Improve • Daily class organization • IDAs • Giving buddy feedback • Getting buddy feedback • The way teams were formed • Course support for team work • Prepared to do team project • Self-directed media-based learning experience • WWILD Team • HL Classroom • WebCT • Communication between instructor/students • Dick, Carey, & Carey textbook • Other

  10. Debriefing of… • WWILD Team • Remember, you don’t have to “reinvent the wheel”; it’s OK to use material from the Internet • Think of these as “learning objects” (a very hot topic right now in eLearning circles) • Of course, be sure to adhere to copyright laws.

  11. Brief Review… Designing and Conducting Formative Evaluations

  12. Dick & Carey’s Model (diagram). The steps: Assess Need to Identify Goal(s), Conduct Instructional Analysis, Analyze Learners and Contexts, Write Performance Objectives, Develop Assessment Instruments, Develop Instructional Strategy, Develop and Select Instructional Materials, Design and Conduct Formative Evaluation, Revise Instruction, Design and Conduct Summative Evaluation.

  13. The Concepts of Formative Evaluation. Definition: the collection of data and information during the development of instruction that can be used to improve the effectiveness of the instruction. Purpose: to obtain data that can be used to revise the instruction to make it more efficient and effective.

  14. Formative Evaluation Helps to Answer the Following Questions • How Effective Is This Instruction at This Stage of Development? • What Has Been Learned? • How Usable Is The Instruction? • How Easy Is It For Students To Use The Media I’ve Developed? • How Motivational Is The Instruction? • In What Ways Can It Be Improved? • Improvement Is The Goal Of Formative Evaluation. After All, Your Instruction Is At A Very “Formative” Period, Is It Not?

  15. What Data Should I Collect? • Be very open to collecting any data that will help you answer the questions on the previous slide. • Don’t be defensive as a designer – expect improvements to be needed. • The sooner you begin the evaluation process, the less costly the revisions will be. • Imagine trying to persuade the most skeptical person of your lesson’s effectiveness • Be your own worst critic

  16. Evaluation and Research Use Similar Methods • A Variety Of Data: Quantitative And Qualitative • Triangulation: Do All Data Point To The Same Interpretations? • Quantitative: Based On Numbers • Carefully Designed Instruments That Can Be Scored • Qualitative: Based On Words • YOU Are The Instrument! • Careful Observation • More Focus On “Why” Questions

  17. The Role Of Subject-Matter, Learning, And Learner Specialists. It’s important to have the instruction reviewed by specialists. The SME may be able to comment on the accuracy and currency of the instruction. The learning specialist may be able to critique your instruction in light of what is known about enhancing that particular type of learning. The learner specialist may be able to provide insights into the appropriateness of the material for the eventual performance context.

  18. The Three Phases Of Formative Evaluation • One-to-One Evaluation • Small-Group Evaluation • Field Trial

  19. One-to-One Evaluation. Purpose: to identify and remove the most obvious errors in the instruction, and to obtain initial performance indications and reactions to the content by learners. Criteria: • Clarity • Impact • Feasibility

  20. Small-Group Evaluation Purposes • To determine the effectiveness of changes made following the one-to-one evaluation. • To identify any remaining learning problems that learners may have. • To determine whether learners can use the instruction without interacting with the instructor.

  21. Field Trial. Purpose: to determine whether the changes/revisions made in the instruction after the small-group stage were effective, and to see whether the instruction can be used in the context for which it was intended.

  22. Constructing Good Assessments • Remember, a well-written objective ALMOST IS the assessment.

  23. Assessing Physics Understanding Pretend there is no friction or gravity. If a ball is moving to the right and its acceleration is also to the right, which of the following is true? • The ball’s speed is not changing. • The ball’s speed is increasing. • The ball’s speed is decreasing. • The ball’s speed increases at first, and then decreases. • None of the above are true.
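A brief worked note, added for clarity and not part of the original slide: when the acceleration vector points the same way as the velocity vector, the rate of change of speed is $\frac{d\lvert\vec{v}\rvert}{dt} = \vec{a}\cdot\hat{v} > 0$, so the ball’s speed is increasing.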

  24. Assessing Physics Understanding If the speedometer needle of a car moved at a steady rate from the 30 mph mark to the 40 mph mark over a stretch of flat, straight road, which of the following is true? • Acceleration was nonzero in the opposite direction the car was moving. • Acceleration was 0. • Acceleration was nonzero in the direction the car was moving. • Acceleration was nonzero, but decreasing. • Acceleration was nonzero and increasing.
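A brief worked note, added for clarity and not part of the original slide: a steady movement of the needle means the speed increases at a constant rate, so the acceleration $a = \frac{\Delta v}{\Delta t}$ is a nonzero constant pointing in the direction the car is moving. For example, if the 10 mph gain took 5 seconds (an assumed figure, since the slide gives no time), then $a = \frac{10 \times 0.447\ \text{m/s}}{5\ \text{s}} \approx 0.9\ \text{m/s}^2$.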

  25. Assessing Physics Understanding. (Diagram: the ball’s trajectory, with points labeled A through E.) Imagine that you threw a ball up into the air and it just left your hand at point A. Describe the motion of the ball and all the forces acting on it at each point.
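A brief worked note, added for clarity and not part of the original slide: once the ball has left the hand (and ignoring air resistance), the only force acting on it at every labeled point, including the top of the arc, is gravity, $F = mg$ directed downward; the acceleration is therefore a constant $g \approx 9.8\ \text{m/s}^2$ downward throughout the flight.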

  26. Assessing Physics Understanding

  27. Formative vs. Summative Evaluation • The purpose of formative evaluation is to improve instruction by getting data for revisions. • The purpose of summative evaluation is to prove the worth of the instruction, given that it will not be revised.

  28. Summative Evaluation Is Similar To…

  29. Summative Evaluation Is Similar To…

  30. Questions? • Go ahead and enter your question in the message field, or… • Click and wait for my prompt to speak.

  31. Responsibility of Each Team • Course Design needs to be complete • Unit Design: Just choose one unit to design fully • Lesson design scope modification: How many lessons to design, develop, and field test? • Original: Design as many lessons as there are team members • Revised to… • 5-6 team members: 3 lessons • 4 or fewer team members: 2 lessons

  32. Responsibility of Each Team • Identify lesson objective(s). • Prepare assessment instruments. • Consider both quantitative and qualitative methods/instruments • Check evaluation instruments for validity (i.e., are they congruent with objectives?) and reliability (a small reliability sketch follows this slide). • Consider both performance and motivation in your evaluation. • Be open to collecting any other data that will serve to improve your instruction (including observation and learner introspection). • Prepare lesson using the Instructional Strategy Planning Guide as a job aid. • Each lesson must be evaluated with at least 3 students in the target audience. • Interpret your formative evaluation based on all assessment instruments and observations. • Report the results in your final report.
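As one minimal, illustrative sketch of the reliability check mentioned above (not part of the course materials), a team could compute a KR-20 estimate for dichotomously scored items; the function name, data layout, and sample scores below are hypothetical.

# Minimal sketch (assumed data layout): each row is one learner,
# each column is one item, scored 1 = correct, 0 = incorrect.
def kr20(scores):
    """Kuder-Richardson Formula 20 reliability estimate for 0/1 item scores."""
    n_learners = len(scores)
    n_items = len(scores[0])
    # Proportion answering each item correctly (p); q = 1 - p.
    p = [sum(row[i] for row in scores) / n_learners for i in range(n_items)]
    pq_sum = sum(pi * (1 - pi) for pi in p)
    # Variance of the learners' total scores.
    totals = [sum(row) for row in scores]
    mean_total = sum(totals) / n_learners
    var_total = sum((t - mean_total) ** 2 for t in totals) / n_learners
    return (n_items / (n_items - 1)) * (1 - pq_sum / var_total)

# Example with the minimum of three target-audience learners and four items.
scores = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
]
print(round(kr20(scores), 2))  # prints 0.44 for this made-up data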

  33. Questions? • Go ahead and enter your question in the message field, or… • Click and wait for my prompt to speak.

  34. Revising Instructional Materials

  35. Dick & Carey’s Model (diagram). The steps: Assess Need to Identify Goal(s), Conduct Instructional Analysis, Analyze Learners and Contexts, Write Performance Objectives, Develop Assessment Instruments, Develop Instructional Strategy, Develop and Select Instructional Materials, Design and Conduct Formative Evaluation, Revise Instruction, Design and Conduct Summative Evaluation.

  36. This Time… • Summarizing and analyzing data obtained from formative evaluation • Revising materials

  37. Two Basic Types of Revision • The changes that are made to the content of the materials • The changes that are related to the procedures employed in using the materials

  38. Do We Need To Make Revisions? • The changes that are made to the content of the materials - NO • The changes that are related to the procedures employed in using the materials – YES, within practical limits

  39. QuickPoll: SE Questions for Summative Evaluation

  40. Kinds Of Data To Analyze • Learner characteristics • Entry behavior • Direct responses to the instruction • Learning time • Posttest performance • Responses to an attitude questionnaire • Comments made directly in the materials

  41. Analyzing Data from One-to-One Trials The designer must look at the similarities and differences among the responses of the learners, and determine the best changes to make in the instruction.

  42. Analyzing Data from One-to-One Trials Three Sources Of Suggestions For Changes • Learner suggestions • Learner performance • Your own reactions to the instruction

  43. Analyzing Data from Small Group and Field Trials. The fundamental unit of analysis for all the assessments is the individual assessment item. Performance on each item must be scored as correct or incorrect.

  44. Analyzing Data from Small Group and Field Trials. Methods For Summarizing Data • Item-by-objective performance (see the sketch after this slide) • Graphing learners’ performance • Descriptive fashion
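A minimal, illustrative sketch of an item-by-objective summary, assuming item scores and an item-to-objective mapping are kept in plain Python structures; the item and objective names below are hypothetical, not part of the course materials.

# Minimal sketch: percent of learners answering each item correctly, grouped
# by the objective that item assesses (1 = correct, 0 = incorrect).
item_to_objective = {"item1": "objective A", "item2": "objective A",
                     "item3": "objective B", "item4": "objective B"}

scores = [  # one dict per learner
    {"item1": 1, "item2": 1, "item3": 0, "item4": 1},
    {"item1": 1, "item2": 0, "item3": 0, "item4": 1},
    {"item1": 1, "item2": 1, "item3": 1, "item4": 1},
]

by_objective = {}
for item, objective in item_to_objective.items():
    pct = 100 * sum(learner[item] for learner in scores) / len(scores)
    by_objective.setdefault(objective, []).append((item, round(pct)))

for objective, items in sorted(by_objective.items()):
    print(objective, items)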

  45. Analyzing Data from Small Group and Field Trials. Another Method For Summarizing Data • Comments can be captured in one-on-one charts where you list out the comments made by each learner

  46. Analyzing Data from Small Group and Field Trials • Another Method For Summarizing Data • Assessment scores can be shown in charts or hierarchies that represent your individual objectives

  47. Analyzing Data from Small Group and Field Trials • Another Method for Summarizing Data • Results from attitude surveys can be placed in an attitude table.

  48. QuickPoll: SE The Process

  49. Revising Materials Use the data, your experience, and sound learning principles as the bases for your revision.

  50. Revising Selected Materials • Omit portions of the instruction. • Include other available materials. • Simply develop supplementary instruction.
