
Online Assessment for Individualized Distributed Learning Applications

Greg Chung. UCLA Graduate School of Education & Information Studies, National Center for Research on Evaluation, Standards, and Student Testing (CRESST). Annual CRESST Conference, Los Angeles, CA, September 9, 2004.


Presentation Transcript


  1. Online Assessment for Individualized Distributed Learning Applications Greg Chung UCLA Graduate School of Education & Information Studies, National Center for Research on Evaluation, Standards, and Student Testing (CRESST) Annual CRESST Conference, Los Angeles, CA, September 9, 2004

  2. Overview of Talk • Distributed learning (DL) context • Elements of a DL system • Research examples • Current work

  3. Distributed Learning Definition The distribution via technology of training, education, and information that resides at one location to any number of learners who may be separated by time and space and who may interact with other parties (peers, instructor, system) synchronously or asynchronously.

  4. Characteristics • Learner-centric • Autonomous learner • Asynchronous communication modes • Varying degree of instructor support

  5. Typical Vision Statement “Provide quality instruction to the right people, at the right time, and at the right place.”

  6. Implications of DL • Operational • Anytime, anywhere learning implies anytime, anywhere assessment • Online, rapid scoring, immediate feedback to learner, actionable information • Individualized • Research • Examine ways of extracting useful information about learners in an online context

  7. Instruction-Assessment Loop (diagram): Instruction → Assessment → Decision

  8. Elements of a DL System • Framework to guide what information to extract from the online environment • Method to synthesize disparate information types • Automated reasoning support for interpreting knowledge and performance observations

  9. CRESST Assessment Model • Content Knowledge • Communication • Problem Solving • Learning • Self-Regulation • Collaboration

  10. Data Fusion Strategy • Event level (clickstream): clicked on button 32; selected test item 2; spent 20 sec on help page 3 • Descriptive level (indicator): adjusted bicycle pump design; performed (virtual) blood test correctly • Inferential level (construct): used the "generate-and-test" problem-solving strategy; used productive learning strategies; understood the fundamentals of rifle marksmanship
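To make the layering concrete, here is a minimal Python sketch of rolling clickstream events up into descriptive indicators and then into an inferential construct. The event names, indicator definitions, and aggregation rule are hypothetical illustrations, not the actual CRESST implementation.

    from dataclasses import dataclass

    @dataclass
    class Event:
        """One low-level clickstream record."""
        kind: str              # e.g., "view_help", "change_dimension", "run_simulation"
        target: str            # e.g., "help_page_3", "chamber_width", "pump"
        duration_s: float = 0.0

    def descriptive_indicators(events):
        """Roll raw events up into low-inference descriptive indicators."""
        return {
            "adjusted_pump_design": any(e.kind == "change_dimension" for e in events),
            "ran_simulation": any(e.kind == "run_simulation" for e in events),
            "seconds_on_help": sum(e.duration_s for e in events if e.kind == "view_help"),
        }

    def inferential_constructs(indicators):
        """Interpret descriptive indicators as a high-inference construct."""
        used_generate_and_test = (indicators["adjusted_pump_design"]
                                  and indicators["ran_simulation"])
        return {"used_generate_and_test": used_generate_and_test}

    log = [Event("view_help", "help_page_3", 20.0),
           Event("change_dimension", "chamber_width"),
           Event("run_simulation", "pump")]
    print(inferential_constructs(descriptive_indicators(log)))
    # -> {'used_generate_and_test': True}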

  11. Research Examples • Elements of DL system tested in several studies • Pump simulation design task • Tested whether the "generate-and-test" problem-solving strategy could be measured using simple aggregation of clickstream data • Problem-solving task (IMMEX) • Tested whether moment-to-moment learning processes could be measured from clickstream data (data fused with Bayesian networks)

  12. Research Examples • Elements of DL system tested in several studies (continued) • Knowledge of rifle marksmanship • Tested individualized instruction based on measures of knowledge • Data fused with Bayesian networks

  13. Research Example 1: Pump Design Task • Can the "generate-and-test" problem-solving strategy be measured using clickstream data? • Novel GUI to support measurement

  14. Generate-and-Test Processes

  15. Information events – click and hold mouse down to view information • Design events – change dimensions of pump • Design events – run pump simulation • Solve-problem event – commit to a design solution
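A hypothetical sketch of how these logged events could be aggregated: count one generate-and-test cycle each time one or more design changes are followed by a simulation run. The event labels below are illustrative and not the study's actual coding scheme.

    def count_generate_and_test_cycles(event_kinds):
        """Count cycles of design change(s) followed by a simulation run."""
        cycles = 0
        pending_change = False
        for kind in event_kinds:
            if kind == "change_dimension":      # generate: modify the pump design
                pending_change = True
            elif kind == "run_simulation":      # test: try out the current design
                if pending_change:
                    cycles += 1
                    pending_change = False
        return cycles

    log = ["view_information", "change_dimension", "run_simulation",
           "change_dimension", "change_dimension", "run_simulation",
           "commit_design"]
    print(count_generate_and_test_cycles(log))  # -> 2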

  16. Example 1 Results

  17. Theory ↔ Online Behavior (mapping diagram)

  18. Example 1 Conclusion • Findings consistent with generate-and-test problem solving strategy • Sequence of events was an important characteristic of the data • Simple test of data fusion strategy • Insertion of software sensors driven by cognitive demands of task • Low-value clicks transformed into meaningful information

  19. Research Example 2: Problem-Solving Task • Research Question • To what extent can learning processes be modeled solely from clickstream (i.e., behavioral) data? • More complex test of data fusion strategy in a different domain • Use Bayesian networks to depict dependencies between cognitive processes and online behavior

  20. Test procedures (screenshot of the problem-solving task showing the parents to be tested)

  21. Behavioral Indicator Example • Construct: “Understands a test procedure” • Indicator: Not testing for a parent that could have been eliminated with a prior test • Indicator: Successive reduction in the number of parents tested across tests • Construct: “Successful learning” • Indicator: test -> library access of test -> test • Indicator: library access of test -> test -> library access of test • Indicator: 5s or more spent on library access of test -> test
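As an illustration, the following sketch scores two of the indicators above from an ordered event log: a library lookup of a test held for at least 5 seconds and followed by running that test, and a successive reduction in the number of parents tested. The log format and field names are assumptions made for the example.

    def library_then_test_with_dwell(events, min_dwell_s=5.0):
        """True if a library lookup of a test (>= min_dwell_s) is followed by that test."""
        for prev, curr in zip(events, events[1:]):
            if (prev["kind"] == "library_access" and curr["kind"] == "run_test"
                    and prev["test"] == curr["test"]
                    and prev["duration_s"] >= min_dwell_s):
                return True
        return False

    def parents_tested_decreasing(parents_per_test):
        """True if the number of parents tested shrinks across successive tests."""
        return all(b < a for a, b in zip(parents_per_test, parents_per_test[1:]))

    log = [{"kind": "library_access", "test": "blood_type", "duration_s": 8.0},
           {"kind": "run_test", "test": "blood_type", "duration_s": 2.0}]
    print(library_then_test_with_dwell(log))    # -> True
    print(parents_tested_decreasing([4, 3, 1])) # -> True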

  22. Bayesian Network (diagram): inferred processes linked to behavioral indicators
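The network itself is not reproduced here, but the core idea can be shown with a hand-rolled two-node Bayes update: one latent process node ("understands a test procedure") and one observed behavioral indicator. All probabilities below are invented for illustration; the study's networks were larger.

    def posterior_understands(prior, p_ind_given_yes, p_ind_given_no, indicator_seen):
        """P(understands | indicator) by Bayes' rule for a single binary indicator."""
        if indicator_seen:
            num = p_ind_given_yes * prior
            den = num + p_ind_given_no * (1.0 - prior)
        else:
            num = (1.0 - p_ind_given_yes) * prior
            den = num + (1.0 - p_ind_given_no) * (1.0 - prior)
        return num / den

    # Illustrative numbers only: indicator = "avoided a redundant parent test".
    print(posterior_understands(prior=0.5,
                                p_ind_given_yes=0.8,
                                p_ind_given_no=0.3,
                                indicator_seen=True))  # -> ~0.73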

  23. Example 2 Results • Overall, similar pattern of results between BN and think-aloud measures with respect to: • Task performance measures • High vs. low performers • Scientific reasoning

  24. Example 2 Conclusion • More complex test of data fusion strategy • Descriptive measures derived from clickstream data • Low complexity, low inference -- easy to program in software • Inferences drawn from Bayesian network at level that is meaningful for instruction or assessment purposes • Low-value clicks transformed into meaningful information

  25. Research Example 3: Knowledge of Rifle Marksmanship • How can information from assessments be used to deliver individualized instructional recommendations in a distributed learning (DL) context?

  26. Linking Assessment and Instruction (diagram) • Item-level scores feed a Bayesian network model of knowledge dependencies • An ontology of the marksmanship domain organizes instructional content • The network's probability of knowing a topic drives a recommender • The recommender serves individualized feedback and content
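A minimal sketch of the recommender step, assuming the Bayesian network outputs a probability of knowing each topic and each topic maps to a piece of instructional content: topics falling below a cutoff get their content served. The topic names, content mapping, and 0.7 threshold are hypothetical.

    def recommend(topic_probabilities, topic_to_content, threshold=0.7):
        """Return content for topics the learner probably does not yet know."""
        weak = [t for t, p in topic_probabilities.items() if p < threshold]
        return [topic_to_content[t] for t in weak if t in topic_to_content]

    bn_output = {"sight_alignment": 0.45, "trigger_control": 0.85, "breath_control": 0.60}
    content_map = {"sight_alignment": "lesson_sight_alignment.html",
                   "breath_control": "lesson_breath_control.html"}
    print(recommend(bn_output, content_map))
    # -> ['lesson_sight_alignment.html', 'lesson_breath_control.html']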

  27. Example of Feedback and Content Delivery

  28. Example 3 Results • BN probabilities increased for concepts that had instructional content served • BN probabilities did not change for concepts that did not have instructional content • BN probabilities corresponded with Marines’ self-ratings of their level of knowledge (80% agreement)

  29. Current Work • Circuit analysis • Validating technique for use in Electrical Engineering gateway course • Rifle marksmanship • Integrated test of general approach • Compare DL system, coach, control conditions on shooting performance

  30. Summary and Conclusion • Distributed learning systems are likely to become more common in education and training contexts (K-16, military, business) • The cognitive demands underlying performance tasks provide strong guidance for developing online measures • Extracting useful information from online behavior appears promising, but more research is needed

  31. Backup

  32. Some 2000–01 Numbers • 56% of all postsecondary institutions offered distance education courses • 90% of public 2-year • 89% of public 4-year • 48% degree-granting (undergraduate + graduate) • 40% of private 4-year • 33% degree-granting (undergraduate + graduate) • Source: 2004 NCES, Indicator 32

  33. Review Process • Reviewed 62 commercial and academic Web-based products • Data sources: online searches, existing reviews, and online learning trade publications • Criteria for inclusion in analyses: • System claimed to have Web-based testing capability • Broad criteria intended to maximize coverage of products

  34. Product/Vendor List Anlon BKM-elearning Blackboard Centra Class Act (Darascott's) Click2learn Computer Adaptive Technology Convene (IZIOPro) CyberWISE Docent E-college Edusystem eno.com e-path BuildKit Aud Managekit Eval First Class Generation21 iAuthor IMS Assessment Designer Infosource (content authoring tool) Interwise Millennium (enterprise communication platform) Intralearn Jones e-education Kenexa Knowledge Planet Learning Manager Learning Space Learnlinc/Testlinc Librix performance management (Maritz) Macromedia (Authorware 6) Mentorware Microsoft LRN Toolkit MKLesson NCS Pearson Open Learning Agency of Australia Pedagogue Testing (Formal Systems) People Sciences People Soft Performance Assessment Network Pinnacle Plateau4 Learning Management System Platte Canyon Prometheus Quelsys QuestionMark Perception RapidExam 2.0 Risc Saba Sage Smartforce Technomedia TEDS Learning on Demand THINQ Training Server TopClass Trainersoft 7 Professional TRIADS Tutorial Gateway Ucompass Educator Vcampus Virtual-U WBTmanager WebCT

  35. 2002 Review • Current Web-based systems provide tools for end users to assemble, administer, and score tests containing mostly conventional item formats • Little guidance on how to develop quality tests or how to use test information • Little support for performance assessments • Little support for diagnostic information • Weak support for linking instruction to test results

  36. Results (N = 53)
