
Online Assessment and Evaluation Techniques in Corporate Settings



  1. Online Assessment and Evaluation Techniques in Corporate Settings Dr. Curtis J. Bonk President, CourseShare.com Associate Professor, Indiana University http://php.indiana.edu/~cjbonk, cjbonk@indiana.edu

  2. Workshop Overview • Part I: The State of Online Learning • Part II. Evaluation Purposes, Approaches, and Frameworks • Part III. Applying Kirkpatrick’s 4 Levels • Part IV. ROI and Online Learning • Part V. Collecting Evaluation Data & Online Evaluation Tools

  3. Sevilla & Wells (July 2001), e-learning “We could be very productive by ignoring assessment altogether and assuming competence if the learner simply gets through the course.”

  4. Why Evaluate? • Cost-savings • Becoming less important reason to evaluate as more people recognize that the initial expense is balanced by long-term financial benefits • Performance improvement • A clear place to see impact of online learning • Competency advancement

  5. 16 Evaluation Methods 1. Formative Evaluation 2. Summative Evaluation 3. CIPP Model Evaluation 4. Objectives-Oriented Evaluation 5. Marshall & Shriver's 5 Levels of Evaluation 6. Bonk’s 8-Part Evaluation Plan (& the Ridiculous Model) 7. Kirkpatrick’s 4 Levels 8. Return on Investment (ROI) 9. K-Level 6: budget and stability of the e-learning team 10. K-Level 7: whether e-learning champion(s) are promoted 11. Cost/Benefit Analysis (CBA) 12. Time to Competency 13. Time to Market 14. Return on Expectation 15. AEIOU: Accountability, Effectiveness, Impact, Organizational Context, Unintended Consequences 16. Consumer-Oriented Evaluation

  6. Part I. The State of Online Learning

  7. Survey of 201 Trainers, Instructors, Managers, Instructional Designers, CEOs, CLOs, etc.

  8. Survey Limitations • Sample pool—e-PostDirect • The Web is changing rapidly • Lengthy survey, low response rate • No password or keycode • Many backgrounds—hard to generalize • Does not address all issues (e.g., ROI calculations, how trained & supported, specific assessments)

  9. Why Interested in E-Learning? • Mainly cost savings • Reduced travel time • Greater flexibility in delivery • Timeliness of training • Better allocation of resources, speed of delivery, convenience, course customization, lifelong learning options, personal growth, greater distribution of materials

  10. A Few Assessment Comments

  11. Level 1 Comments: Reactions “We assess our courses based on participation levels and online surveys after course completion. All of our courses are asynchronous.” “I conduct a post-course survey of course material, delivery methods and mode, and instructor effectiveness. I look for suggestions and modify each course based on the results of the survey.” “We use the Halo Survey process of asking them when the course is concluding.”

  12. Level 2 Comments: Learning “We use online testing and simulation frequently for testing student knowledge.” “Do multiple choice exams after each section of the course.” “We use online exams and use level 2 evaluation forms.”

  13. Level 3 Comment: Job Performance “I feel strongly there is a need to measure the success of any training in terms of the implementation of the new behaviors on the job. Having said that, I find there is very limited [interest] by our clients in spending the dollars required…”

  14. More Assessment Comments: Multiple-Level Evaluation “Using Level One Evaluations for each session followed by a summary evaluation. Thirty days post-training, conversations occur with learners’ managers to assess Level 2” (actually Level 3). “We do Level 1 measurements to gauge student reactions to online training using an online evaluation form. We do Level 2 measurements to determine whether or not learning has occurred…” “Currently, we are using online teaching and following up with manager assessments that the instructional material is being put to use on the job.”

  15. Who is Evaluating Online Learning? • 59% of respondents said they did not have a formal evaluation program • At Reaction level: 79% • At Learning level: 61% • At Behavior/Job Performance level: 47% • At Results or Return on Investment: 30%

  16. Assessment Lacking or Too Early “We are just beginning to use Web-based technology for education of both associates and customers, and do not have the metric to measure our success. However, we are putting together a focus group to determine what to measure (and) how.” “We have no online evaluation for students at this time.” “We lack useful tools in this area.”

  17. Limitations with Current System “I feel strongly there is a need to measure the success of any training in terms of the implementation of the new behaviors on the job. Having said that, I find there is very limited [interest] by our clients in spending the dollars required…” “We are looking for better ways to track learner progress, learner satisfaction, and retention of material.” “Have had fairly poor ratings on reliability, customer support, and interactivity…”

  18. Pause…How and What Do You Evaluate…?

  19. Readiness Checklist 1.___ Is your organization undergoing significant change, in part related to e-learning? 2.___ Is there pressure from senior management to measure the results of e-learning? 3.___ Has your company experienced one or more training/learning disasters in the past? 4.___ Is the image of the training/learning function lower than you want?

  20. Part II. Evaluation Purposes, Approaches, and Frameworks

  21. What is Evaluation??? “Simply put, an evaluation is concerned with judging the worth of a program and is essentially conducted to aid in the making of decisions by stakeholders.” (e.g., does it work as effectively as the standard instructional approach). (Champagne & Wisher, in press)

  22. What is assessment? • “Assessment refers to…efforts to obtain info about how and what students are learning in order to improve…teaching efforts and/or to demonstrate to others the degree to which students have accomplished the learning goals for a course.” (Millar, 2001, p. 11). • It is a way of using info obtained through various types of measurement to determine a learner’s performance or skill on some task or situation (Rosenkrans, 2000).

  23. Who are you evaluating for? The level of evaluation will depend on articulation of the stakeholders. Stakeholders of evaluation in corporate settings may range from…???

  24. Evaluation Purposes • Determine learner progress • What did they learn? • Document learning impact • How well do learners use what they learned? • How much do learners use what they learn?

  25. Evaluation Purposes • Efficiency • Was online learning more effective than another medium? • Was online learning more cost-effective than another medium/what was the return on investment (ROI)? • Improvement • How do we do this better?
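The ROI question raised in the slide above reduces to simple arithmetic: net benefits divided by costs, expressed as a percentage. The sketch below is illustrative only; the dollar figures and the benefit estimate are hypothetical assumptions, not data from the survey.

```python
# Illustrative ROI calculation for an e-learning program, using the
# common training-evaluation formula: ROI % = (benefits - costs) / costs * 100.

def training_roi(benefits: float, costs: float) -> float:
    """Return ROI as a percentage of program costs."""
    return (benefits - costs) / costs * 100

program_costs = 50_000     # development + delivery (hypothetical)
program_benefits = 80_000  # estimated productivity gains (hypothetical)

roi = training_roi(program_benefits, program_costs)
print(f"ROI: {roi:.0f}%")  # ROI: 60%
```

A program that merely breaks even yields 0% ROI under this formula, which is why ROI discussions (Part IV) hinge on how credibly the benefit figure is estimated.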

  26. Evaluation Purposes “An evaluation plan can evaluate the delivery of e-learning, identify ways to improve the online delivery of it, and justify the investment in the online training package, program, or initiative.” (Champagne & Wisher, in press)

  27. Evaluation Plans Does your company have a training evaluation plan?

  28. Steps to Developing an OL Evaluation Program • Select a purpose and framework • Develop benchmarks • Develop online survey instruments • For learner reactions • For learner post-training performance • For manager post-training reactions • Develop data analysis and management plan

  29. 1. Formative Evaluation • Formative evaluations focus on improving the online learning experience. • A formative focus will try to find out what worked or did not work. • Formative evaluation is particularly useful for examining instructional design and instructor performance.

  30. Formative Questions • How can we improve our OL program? • How can we make our OL program more efficient? • More effective? • More accessible?

  31. 2. Summative Evaluation • Summative evaluations focus on the overall success of the OL experience (should it be continued?). • A summative focus will look at whether or not objectives are met, the training is cost-effective, etc.

  32. Course Completion • Jeanne Meister, Corporate University Xchange, found a 70 percent dropout rate, compared to classroom rates of 15%. • Perhaps need new metrics. Need to see if they can test out. • “Almost any measure would be better than course completion, which is not a predictor of anything.” Tom Kelly, Cisco, March 2002, e-Learning.

  33. What Can OL Evaluation Measure? • Categories of Evaluation Info (Woodley and Kirkwood, 1986) • Measures of activity • Measures of efficiency • Measures of outcomes • Measures of program aims • Measures of policy • Measures of organizations

  34. Typical Evaluation Frameworks for OL • Commonly used frameworks include: • CIPP Model • Objectives-oriented • Marshall & Shriver’s 5 levels • Kirkpatrick’s 4 levels • Plus a 5th level • AEIOU • Consumer-oriented

  35. 3. CIPP Model Evaluation • CIPP is a management-oriented model • C = context • I = input • P = process • P = product • Examines the OL within its larger system/context

  36. CIPP & OL: Context • Context: Addresses the environment in which OL takes place. • How does the real environment compare to the ideal? • Uncovers systemic problems that may dampen OL success. • Technology breakdowns • Inadequate computer systems

  37. CIPP & OL: Input • Input: Examines what resources are put into OL. • Is the content right? • Have we used the right combination of media? • Uncovers instructional design issues.

  38. CIPP & OL: Process • Process: Examines how well the implementation works. • Did the course run smoothly? • Were there technology problems? • Was the facilitation and participation as planned? • Uncovers implementation issues.

  39. CIPP & OL: Product • Product: Addresses outcomes of the learning. • Did the learners learn? How do you know? • Does the online training have an effect on workflow or productivity? • Uncovers systemic problems.

  40. 4. Objectives-Oriented Evaluation • Examines OL training objectives as compared to training results • Helps determine if objectives are being met • Helps determine if objectives, as formally stated, are appropriate • Objectives can be used as a comparative benchmark between online and other training methods

  41. Evaluating Objectives & OL • An objectives-oriented approach can examine two levels of objectives: • Instructional objectives for learners (did the learners learn?) • Systemic objectives for training (did the training solve the problem?)

  42. Objectives & OL • Requires: • A clear sense of what the objectives are (always a good idea anyway) • The ability to measure whether or not objectives are met • Some objectives may be implicit and hard to state • Some objectives are not easy to measure

  43. 5. Marshall & Shriver's Five Levels of Evaluation • Performance-based evaluation framework • Each level examines a different area of performance • Requires demonstration of learning

  44. Marshall & Shriver's 5 Levels • Level I: Self (instructor) • Level II: Course Materials • Level III: Course Curriculum • Level IV: Course Modules • Level V: Learning Transfer

  45. 6. Bonk’s Evaluation Plan…
