
Research Related to the Effectiveness of E-Learning and Collaborative Tools

This article explores recent research on the effectiveness of e-learning and collaborative tools, including the benefits and challenges they present. It discusses various studies on distance learning, pedagogy, online tools, motivation, research problems, and evaluation methods for web-based instruction. The article aims to provide valuable insights into the current state of e-learning and its potential for improving education and training.



Presentation Transcript


  1. Research Related to the Effectiveness of E-Learning and Collaborative Tools. Dr. Curtis J. Bonk, Associate Professor, Indiana University; President, CourseShare.com. http://php.indiana.edu/~cjbonk, cjbonk@indiana.edu

  2. Are you ready???

  3. A Vision of E-learning for America’s Workforce, Report of the Commission on Technology and Adult Learning (June 2001) • “A remarkable 84 percent of two- and four-year colleges in the United States expect to offer distance learning courses in 2002” (only 58% did in 1998) (US Dept of Education report, 2000) • “Web-based training is expected to increase 900 percent between 1999 and 2003” (ASTD, State of the Industry Report, 2001).

  4. Brains Before and After E-learning [Images: a brain “before,” a brain “after,” and a brain “when using synchronous and asynchronous tools”]

  5. Tons of Recent Research. Not much of it... is any good...

  6. Problems and Solutions (Bonk, Wisher, & Lee, in review)
  • Tasks overwhelm → Train and be clear
  • Confused on Web → Structure time/dates due
  • Too nice due to limited shared history → Develop roles and controversies
  • Lack justification → Train to back up claims
  • Hard not to preach → Students take lead role
  • Too much data → Use e-mail pals
  • Communities not easy to form → Embed informal/social

  7. Benefits and Implications (Bonk, Wisher, & Lee, in review)
  • Shy students open up online → Use async conferencing
  • Minimal off-task behavior → Create social tasks
  • Delayed collaboration richer than real time → Use async for debates; sync for help and office hours
  • Students can generate lots of info → Structure generation and force reflection/comment
  • Minimal disruptions → Foster debates/critique
  • Extensive e-advice → Find experts or practitioners
  • Excited to publish → Ask permission

  8. Basic Distance Learning Finding? • Research since 1928 shows that DL students perform as well as their counterparts in a traditional classroom setting. Source: Russell (1999), The No Significant Difference Phenomenon (5th edition), NCSU, based on 355 research reports. http://cuda.teleeducation.nb.ca/nosignificantdifference/

  9. Question: Why is there no learning in e-learning??? A. Poor pedagogy? B. Inferior online tools? C. Unmotivated students and instructors? D. Poor research and measurement? E. Too new? F. Vendor and administrator visions do not match reality?

  10. Online Learning Research Problems (National Center for Education Statistics, 1999; Phipps & Merisotis, 1999; Wisher et al., 1999) • Anecdotal evidence; minimal theory. • Questionable validity of tests. • Lack of control groups. • Hard to compare given different assessment tools and domains.

  11. Online Learning Research Problems (National Center for Education Statistics, 1999; Phipps & Merisotis, 1999; Wisher et al., 1999) • Fails to explain why the drop-out rates of distance learners are higher. • Does not relate learning styles to different technologies or focus on the interaction of multiple technologies.

  12. Online Learning Research Problems (Bonk & Wisher, 2000) • Studies serve different purposes or domains: in our study, 13% concern training, 87% education. • Flaws in research designs: only 36% have objective learning measures; only 45% have comparison groups. • When e-learning is effective, it is difficult to know why: course design? instructional methods? technology?

  13. Ten Primary Experiments: Adaptations from Education to Training (Bonk & Wisher, 2000) 1) Variations in Instructor Moderation 2) Online Debating 3) Student Perceptions of E-Learning Environments 4) Development of Online Learning Communities 5) Time Logging 6) Critical Thinking and Problem Solving Applications in Synchronous/Asynchronous Environments 7) Peer Tutoring and Online Mentoring 8) Student Retention: E-Learning and Attrition 9) Conceptual Referencing 10) Online Collaboration

  14. Evaluating Web-Based Instruction: Methods and Findings (41 studies) (Olson & Wisher, in review) (Projected)

  15. Wisher’s Wish List: an effect size of .5 or higher in comparison to traditional classroom instruction. But the reality:

                           Web-Based Instruction   CBI, Kulik [8]   CBI, Liao [18]
     Average effect size          .31                   .32              .41
     Number of studies            11                    97               46
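
For context, these effect sizes are standardized mean differences. A minimal sketch of the usual computation (Cohen's d with a pooled standard deviation; the subscripts are illustrative, not taken from the studies above):

    d = \frac{\bar{X}_{\text{web}} - \bar{X}_{\text{classroom}}}{SD_{\text{pooled}}},
    \qquad
    SD_{\text{pooled}} = \sqrt{\frac{(n_1 - 1)s_1^2 + (n_2 - 1)s_2^2}{n_1 + n_2 - 2}}

Read this way, an effect size of .31 means the average web-based student scored about a third of a standard deviation above the average classroom student.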

  16. Evaluating Web-Based Instruction: Methods and Findings (Olson & Wisher, in review) “…there is little consensus as to what variables should be examined and what measures of learning are most appropriate, making comparisons between studies difficult and inconclusive.”

  17. Evaluating Web-Based Instruction: Methods and Findings (Olson & Wisher, in review) What to Measure? • demographics (age, gender, etc.), • previous experience, • course design, • instructor effectiveness or feedback, • technical issues, • levels of participation and collaboration, • student and instructor interactions, • student recommendation of course, • student desire to take additional online courses.

  18. Evaluating Web-Based Instruction: Methods and Findings (Olson & Wisher, in review) Variables Studied: • Type of course: graduate (18%) vs. undergraduate (81%) • Level of Web use: all-online (64%) vs. blended/mixed (34%) • Content area: math/engineering (27%), science/medicine (24%), distance ed (15%), social science/education (12%), business (10%), etc. Other data: attrition data collected (34%); comparison group (59%)

  19. Different Goals… • Making connections • Appreciating different perspectives • Students as teachers • Greater depth of discussion • Fostering critical thinking online • Interactivity online

  20. Learning Improved (Maki & Maki, 2002, Journal of Experimental Psychology: Applied, 8(2), 85-98) • Intro to Psych: lecture vs. online • The Web-based course showed more advantages as comprehension skill increased • Still, students preferred face-to-face over online • Why? More guidance, feedback, & enthusiasm, and fewer deadlines.

  21. Learning Improved… (Maki, Maki, Patterson, & Whittaker, 2000) • Intro to Psych: lecture vs. online • Online students had consistently higher exam scores • Online students learned more, as indicated by higher scores on psychology Graduate Record Exam items during the semester

  22. Learning Improved… (Maki et al., 2000) • Intro to Psych: lecture vs. online • Online students performed better on midterms • Web-based students scored higher because they had weekly activities due • Lecture students could put off reading until the night before the exam

  23. Learning Worse (Wang & Newlin, 2000) • Statistical Methods: lecture vs. online • No differences at midterm • Lecture averaged 87 on the final; Web averaged 72 • Course was relatively unstructured • Web students were encouraged to collaborate; lecture students could not • All exams but the final were open book

  24. Learning Worse (Waschull, 2001) • Psych: lecture vs. online • No differences at midterm • Self-selected sections: lecture averaged 86 on the final; Web averaged 77 • Randomly assigned sections: no differences • Self-selected students were more likely to fail the online course • The Web course had higher student satisfaction

  25. Learning Improved or Not… (Hiltz, 1993) • The Web may be suited to some students and lecture to others… • Students who find the Web convenient for them score better. • Ratings of course involvement and ease of access to the instructor are also important.

  26. Learning Improved or Not… (Sankaran et al., 2000) • Students with a positive attitude toward the Web format learned more in the Web course than in the lecture course. • Students with a positive attitude toward the lecture format learned more in the lecture format.

  27. Electronic Conferencing: Quantitative Analyses • Usage patterns, # of messages, cases, responses • Length of case, thread, response • Average number of responses • Timing of cases, commenting, responses, etc. • Types of interactions (1:1; 1: many) • Data mining (logins, peak usage, location, session length, paths taken, messages/day/week), Time-Series Analyses (trends)
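
As an illustration of how several of these counts fall out of a raw message log, here is a minimal sketch in Python; the log fields and sample messages are hypothetical, not data from the studies cited:

    from collections import Counter
    from datetime import datetime

    # Hypothetical conference log: (author, thread_id, timestamp)
    messages = [
        ("kim", "case-1", "2002-03-04 09:15"),
        ("lee", "case-1", "2002-03-04 11:02"),
        ("pat", "case-1", "2002-03-05 08:40"),
        ("kim", "case-2", "2002-03-05 10:30"),
    ]

    # Usage patterns: number of messages per participant
    per_author = Counter(author for author, _, _ in messages)

    # Thread length: number of messages per case/thread
    per_thread = Counter(thread for _, thread, _ in messages)

    # Average number of responses (the first message in a thread is the case itself)
    avg_responses = sum(n - 1 for n in per_thread.values()) / len(per_thread)

    # Timing: messages per day, the raw series for a time-series/trend analysis
    per_day = Counter(datetime.strptime(ts, "%Y-%m-%d %H:%M").date()
                      for _, _, ts in messages)

    print(per_author, per_thread, avg_responses, per_day, sep="\n")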

  28. Electronic Conferencing: Qualitative Analyses • General: Observation Logs, Reflective Interviews, Retrospective Analyses, Focus Groups • Specific: Semantic Trace Analyses, Talk/Dialogue Categories (content talk, questioning, peer feedback, social acknowledgments, off task) • Emergent: Forms of Learning Assistance, Levels of Questioning, Degree of Perspective Taking, Case Quality, Participant Categories

  29. AC3-DL Course Tools (Orvis, Wisher, Bonk, & Olson) • Asynchronous: Learning Management System, e-mail • Synchronous: Virtual Tactical Operations Center (VTOC) (7 rooms; 15 people/extension), with an avatar, audio conferencing by extension/room (voice over IP), text chat windows (global and private), and special tools for collaboration

  30. Overall frequency of interactions across chat categories (6,601 chats).

  31. Overall frequency of interactions across chat categories (6,601 chats).

  32. Research on Instructors Online • If teacher-centered, students explore, engage, and interact less (Peck & Laycock, 1992) • Informal, exploratory conversation fosters risk-taking & knowledge sharing (Weedman, 1999) • Four key acts of instructors: pedagogical, managerial, technical, social (Ashton, Roberts, & Teles, 1999) • Instructors tend to rely on simple tools (Peffers & Bloom, 1999) • The job varies: planning, interaction, administration, teaching (McIsaac, Blocher, Mahes, & Vrasidas, 1999)

  33. Study of Four Classes (Bonk, Kirkley, Hara, & Dennen, 2001) • Technical: train, early tasks, be flexible, orientation task • Managerial: initial meeting, FAQs, detailed syllabus, calendar, post administrivia, assign e-mail pals, gradebooks, e-mail updates • Pedagogical: peer feedback, debates, PBL, cases, structured controversy, field reflections, portfolios, teams, inquiry • Social: café, humor, interactivity, profiles, foreign guests, digital pics, conversations

  34. Network Conferencing Interactivity (Rafaeli & Sudweeks, 1997) 1. More than 50 percent of messages were reactive. 2. Only around 10 percent were truly interactive. 3. Most messages were factual statements or opinions. 4. Many also contained questions or requests. 5. Frequent participators were more reactive than infrequent ones. 6. Interactive messages carried more opinions & humor. 7. More self-disclosure, involvement, & belonging. 8. Participants were attracted to fun, open, frank, helpful, supportive environments.
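
Rafaeli & Sudweeks distinguish declarative (one-way) messages, reactive messages (responding to a single prior message), and fully interactive messages (responding to how earlier messages related to each other). A minimal sketch of one rough way to operationalize this from reply chains; the heuristic and the sample data are illustrative simplifications, not their coding method:

    # parent_of maps each message to the message it replies to (None = none).
    parent_of = {"m1": None, "m2": "m1", "m3": "m2", "m4": None}

    def classify(msg_id: str) -> str:
        """Crude reply-depth heuristic for Rafaeli-style interactivity."""
        parent = parent_of.get(msg_id)
        if parent is None:
            return "declarative"   # starts a thread, responds to nothing
        if parent_of.get(parent) is None:
            return "reactive"      # responds to one earlier message
        return "interactive"       # responds to an ongoing exchange

    for m in sorted(parent_of):
        print(m, classify(m))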

  35. Starter-Centered Interaction [diagram]

  36. Scattered Interaction (no starter): Week 4 [diagram]

  37. Collaborative Behaviors (Curtis & Lawson, 1997) • Most common were: (1) planning, (2) contributing, and (3) seeking input. • Other common events were: (4) initiating activities, (5) providing feedback, (6) sharing knowledge. • Few students challenged others or attempted to explain or elaborate. • Recommendation: use debates and model appropriate ways to challenge others.

  38. Online Collaboration Behaviors by Categories (US and Finland)

  39. Dimensions of Learning Process (Henri, 1992) 1. Participation (rate, timing, duration of messages) 2. Interactivity (explicit interaction, implicit interaction, & independent comment) 3. Social events (statements unrelated to content) 4. Cognitive events (e.g., clarifications, inferencing, judgment, and strategies) 5. Metacognitive events (both metacognitive knowledge: person, task, and strategy; and metacognitive skill: evaluation, planning, regulation, and self-awareness)

  40. Some Findings (see Hara, Bonk, & Angeli, 2000)
  • Social (in 26.7% of units coded): social cues decreased as the semester progressed; messages gradually became less formal and more embedded within statements
  • Cognitive (in 81.7% of units): more inferences & judgments than elementary and in-depth clarifications
  • Metacognitive (in 56% of units): more reflections on experience & self-awareness; some planning, evaluation, regulation, & self-questioning
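
A minimal sketch of how such percentages fall out of a coded transcript, where each unit may carry several of Henri's dimensions; the unit codes below are invented for illustration:

    # Each transcript unit is tagged with the Henri (1992) dimensions it shows.
    coded_units = [
        {"social", "cognitive"},
        {"cognitive", "metacognitive"},
        {"cognitive"},
        {"metacognitive", "cognitive"},
        {"social"},
    ]

    for dim in ("social", "cognitive", "metacognitive"):
        share = 100 * sum(dim in unit for unit in coded_units) / len(coded_units)
        print(f"{dim}: coded in {share:.1f}% of units")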

  41. Surface vs. Deep Posts (Henri, 1992)
  • Surface processing: making judgments without justification, stating that one shares ideas or opinions already stated, repeating what has been said, asking irrelevant questions; i.e., fragmented, narrow, and somewhat trite.
  • In-depth processing: linked facts and ideas, offered new elements of information, discussed advantages and disadvantages of a situation, made judgments that were supported by examples and/or justification; i.e., more integrated, weighty, and refreshing.

  42. Critical Thinking (Newman, Johnson, Webb & Cochrane, 1997) Used Garrison’s five-stage critical thinking model • Critical thinking occurred in both CMC and FTF environments • Depth of critical thinking was higher in the CMC environment: students were more likely to bring in outside information, link ideas and offer interpretations, and generate important ideas and solutions • FTF settings were better for generating new ideas and creatively exploring problems

  43. Unjustified Statements (US)
  24. Author: Katherine, Date: Apr. 27 3:12 AM 1998: “I agree with you that technology is definitely taking a large part in the classroom and will more so in the future…”
  25. Author: Jason, Date: Apr. 28 1:47 PM 1998: “I feel technology will never over take the role of the teacher... I feel however, this is just help us teachers...”
  26. Author: Daniel, Date: Apr. 30 0:11 AM 1998: “I believe that the role of the teacher is being changed by computers, but the computer will never totally replace the teacher... I believe that the computers will eventually make teaching easier for us and that most of the children's work will be done on computers. But I believe that there…”

  44. Indicators for the Quality of Students’ Dialogue (Angeli, Valanides, & Bonk, in review)
  1. Social acknowledgement / sharing / feedback: “Hello, good to hear from you”; “I agree, good point, great idea”
  2. Unsupported statements (advice): “I think you should try this…”; “This is what I would do…”
  3. Questioning for clarification and to extend dialogue: “Could you give us more info?”; “…explain what you mean by…?”
  4. Critical thinking, reasoned thinking/judgment: “I disagree with X, because in class we discussed…”; “I see the following disadvantages to this approach…”

  45. Social Construction of Knowledge (Gunawardena, Lowe, & Anderson, 1997) • Five-phase model: 1. Share ideas, 2. Discovery of idea inconsistencies, 3. Negotiate meaning/areas of agreement, 4. Test and modify, 5. State agreements • In a global debate, participants were very task driven • Dialogue remained at Phase I: sharing info

  46. Social Constructivism and Learning Communities Online (SCALCO) Scale (Bonk & Wisher, 2000)
  ___ 1. The topics discussed online had real-world relevance.
  ___ 2. The online environment encouraged me to question ideas and perspectives.
  ___ 3. I received useful feedback and mentoring from others.
  ___ 4. There was a sense of membership in the learning community here.
  ___ 5. Instructors provided useful advice and feedback online.
  ___ 6. I had some personal control over course activities and discussion.
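
A minimal sketch of scoring such an instrument, assuming (hypothetically; the slide does not give the response format) a 1-5 agreement scale, with one respondent's invented answers:

    # SCALCO items, abbreviated from the slide; responses on an assumed
    # 1-5 scale (1 = strongly disagree, 5 = strongly agree) and invented.
    responses = {
        "real-world relevance": 4,
        "encouraged questioning": 5,
        "useful peer feedback/mentoring": 3,
        "sense of membership": 4,
        "instructor advice/feedback": 5,
        "personal control": 3,
    }

    # Scale score = mean of item ratings; averaging across respondents
    # would give the class-level measure.
    scale_score = sum(responses.values()) / len(responses)
    print(f"SCALCO score: {scale_score:.2f} out of 5")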

  47. Evaluation…

  48. Kirkpatrick’s 4 Levels • Reaction • Learning • Behavior • Results
