
Developing a Pedagogical Model for Evaluating Learning Objects


Presentation Transcript


  1. Developing a Pedagogical Model for Evaluating Learning Objects Dr. Robin Kay and Dr. Liesel Knaack University of Ontario Institute of Technology Oshawa, Ontario

  2. University of Ontario Institute of Technology • Opened Sept 2003 • 1 hour east of Toronto • Focus on math & science • Faculty of Education • 70 students (Year 1) • 90 students (Year 2 & 3) • 225 students (Year 4) • 3000 students in total • Ubiquitous computing

  3. Overview • Background • Defining Learning Objects • Previous Research • Our Approach • Our Scale • Sample & Procedure • Results • Conclusions • Future Research

  4. Background • A review of 58 articles • 11 studies focussed on evaluation, but only 2 evaluated learning • The “learning object” revolution will never take place unless instructional use and pedagogy are explored and evaluated (Muzio, Heins & Mundell, 2002; Richards, 2002; Wiley, 2000)

  5. Defining Learning Objects • Majority of researchers have emphasized technological issues: • accessibility, adaptability, the effective use of metadata, reusability, and standardization • A second definitional pathway is emerging • based on learning • A question of values: • learning object developers and designers • programmers • educators

  6. Our Definition (Values) • For our study, learning objects were defined as: “interactive web-based tools that support the learning of specific concepts by enhancing, amplifying, and guiding the cognitive processes of learners”

  7. Previous Research – 6 features • Description of Learning Object • Clear, often provide links • Varied (assessment, tutorials, interactivity) • Multiple Sources of Data • Surveys, Interviews, E-Mail, Tracking Use, Think-Aloud Protocols • Focus on One Learning Object

  8. Previous Research – 6 features • Sample Size • Small, limited description, exclusively higher-education • Reliability & Validity • None • Formal Statistics • Largely absent • Emphasis placed on qualitative data

  9. Our Approach • A large, diverse sample of secondary school students • Reliability and validity estimates calculated • Formal statistics used where applicable • Specific learning object features based on instructional design research were examined • A range of learning objects was tested • Evaluation criteria focussed on the learner, not the technology

  10. Our Scale – Reliability = .87 • 7-point Likert scale • The learning object has some benefit in terms of providing me with another learning strategy/another tool. • I feel the learning object did benefit my understanding of the subject matter’s concept/principle. • I did not benefit from using the learning object. • I am interested in using the learning object again.
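
The slides report the scale's reliability (.87) but not how it was computed; for multi-item Likert scales this figure is commonly Cronbach's alpha. A minimal sketch of that calculation in Python follows. The student responses are invented for illustration, and the choice of Cronbach's alpha is an assumption, not stated on the slide; note that the negatively worded third item ("I did not benefit...") would typically be reverse-scored first.

    import numpy as np

    def cronbach_alpha(scores):
        """Cronbach's alpha for a (respondents x items) score matrix."""
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]
        item_vars = scores.var(axis=0, ddof=1)      # per-item sample variance
        total_var = scores.sum(axis=1).var(ddof=1)  # variance of summed scores
        return k / (k - 1) * (1 - item_vars.sum() / total_var)

    # Hypothetical responses: 5 students x 4 items on the 7-point scale.
    raw = np.array([
        [6, 5, 2, 6],
        [7, 6, 1, 7],
        [4, 4, 4, 4],
        [5, 6, 3, 5],
        [6, 7, 2, 6],
    ])
    raw[:, 2] = 8 - raw[:, 2]  # reverse-code the negatively worded third item
    print(round(cronbach_alpha(raw), 2))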

  11. Our Scale – Part 2 • Quality – Content Analysis (ID) • You used a digital learning object on the computer. Tell me about this experience when you used the object. a) What did you like? (found helpful, liked working with, what worked well for you) b) What didn’t you like? (found confusing, or didn’t like, or didn’t understand) • Perceived Benefit – Content Analysis • Do you think you benefited from using this particular learning object? Do you think you learned the concept better? Do you think it helped you review a concept you just learned? Why? Why not?

  12. Sample and Procedure • 211 students, grades 9-12, 12 different high schools • 30 teachers: 21 pre-service, 9 experienced • Each teacher used one of 5 learning objects: • Biology • Computer Science • Chemistry • Math • Physics • Used the learning object in one 70-minute period

  13. Results – Scales • Scale was reliable (r = .87) • Correlation between quantitative scale and qualitative results (r = .64, p < .001) – criterion validity • Rating of qualitative data – 100% agreement on quality and benefits content analysis
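
The criterion-validity figure pairs each student's quantitative scale score with a numeric coding of their qualitative comments and correlates the two. A sketch of that step follows; the per-student values and the 1-5 coding scheme are invented, since the slides report only the resulting coefficient.

    from scipy.stats import pearsonr

    # Hypothetical per-student values: mean score on the 7-point scale,
    # and a rater's 1-5 coding of the same student's written comments.
    scale_means = [5.2, 6.0, 3.1, 4.5, 5.8, 2.9, 6.3, 4.0]
    coded_comments = [4, 5, 2, 3, 5, 2, 5, 3]

    r, p = pearsonr(scale_means, coded_comments)
    print(f"r = {r:.2f}, p = {p:.3f}")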

  14. Results – LO Quality

  15. Results – LO Benefits

  16. Results – Comparing LOs

  17. Results – Focus Group • Biology, Chemistry, Computer Science • Majority of suggestions were cosmetic • Math & Physics • Suggestions for change based on learning

  18. Conclusions • Formative analysis • just the beginning • Data collection instruments • reliable and valid • LO qualities • Research on instructional design categories provided a good base

  19. Conclusions • LO – Benefits • Graphics and interactivity • Learning is important to students • Comparing LOs • Tools sensitive to differences among LOs

  20. Future Research • Developed a new scale based on qualitative content analysis and further review of the instructional design literature • Recently tested on grade 7 & 8 math students • Being tested on 48 classes, 1200 students, grades 5-12 • Further tests on 24 teachers – focussing on credit recovery and special needs students

  21. Contact Information Robin Kay robin.kay@uoit.ca Liesel Knaack liesel.knaack@uoit.ca
