
Building Electronic Support Environments for First-Year University Students



Presentation Transcript


  1. Building Electronic Support Environments for First-Year University Students. Speakers (in order of appearance): Philip Barker – Theoretical Background (7 min); Oladeji Famakinwa – The EPSILON System (Design and Development) (7 min); Paul van Schaik – System Evaluation (7 min)

  2. Theoretical Background Philip Barker School of Computing

  3. Designing Learning Spaces Three broad categories of learning environment to consider: • On-campus Systems • Online Learning • Ad hoc Spaces Various combinations of conceptual and pragmatic resource categories can be used to construct these. What combinations (or ‘blends’) should be used to construct optimal learning environments?

  4. Theoretical Background

  5. Designing Online Learning [Diagram: overlapping domains of E-Learning, Knowledge Management and Performance Support, with labelled regions a, b, c and X at their intersections]

  6. Theoretical Perspective on EPSS • All people have both physical and cognitive limitations • The Power Law of Practice can be used to predict performance plateaus • These can be overcome through the design of appropriate performance improvement or augmentation aids • Most commonly these are simple tools, machines and intelligent devices
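
For context, a minimal sketch of the Power Law of Practice as it is commonly stated (the functional form and parameter values below are illustrative assumptions, not taken from the talk): task completion time falls as a power function of the number of practice trials, so each additional trial yields a smaller gain and performance levels off into the plateau shown on the next slide.

    # Illustrative sketch of the Power Law of Practice in its common form
    # T_N = T_1 * N**(-alpha); parameter values here are hypothetical.
    def predicted_time(trial: int, first_trial_time: float = 10.0, alpha: float = 0.4) -> float:
        """Predicted task completion time (seconds) on the given practice trial."""
        return first_trial_time * trial ** (-alpha)

    for n in (1, 5, 25, 125):
        print(f"trial {n:3d}: {predicted_time(n):.2f} s")  # improvement per trial shrinks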

  7. Performance Plateaus and Bands [Figure: skill level plotted against time, showing performance plateaus between a novice band and an expert band]

  8. Objectives of Performance Support A performance support system can provide one or more of the following functions: • Improvement in on-the-job task performance • Provision of data, information or knowledge at a point of need • Skill and knowledge enhancement facilities

  9. General Systems Theory We are currently interested in performance issues relating to users of academic libraries (our ‘systems’)

  10. EPSS and Scaffolding [Diagram: an Electronic Performance Support System (EPSS) acting as scaffolding between the USER (with their mental models), the System Interface and the LIBRARY, with labelled regions A, B and C] • The system we have been building utilises five different types of digital object: • data objects • information objects • knowledge objects • learning objects • performance objects
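
As a side illustration only (the class name, member names and the short glosses in the comments are conventional readings, not the actual Epsilon implementation), the five digital object types can be treated as a simple taxonomy:

    # Illustrative taxonomy of the five digital object types named on the slide
    # (names and glosses are assumptions, not part of the Epsilon code).
    from enum import Enum, auto

    class DigitalObject(Enum):
        DATA = auto()          # raw facts and figures
        INFORMATION = auto()   # data organised for a purpose
        KNOWLEDGE = auto()     # interpreted, applicable information
        LEARNING = auto()      # objects designed to teach a topic
        PERFORMANCE = auto()   # objects that directly support task execution

    print([obj.name for obj in DigitalObject])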

  11. The Epsilon System (Design and Development) Oladeji Famakinwa School of Computing

  12. The Epsilon System (Design and Development) • Epsilon: Electronic Performance Support Systems for Libraries • First prototype • Assists students in improving their performance on typical library tasks such as locating books and journals • Second prototype • Tutorial Module: imparting knowledge of library classification systems • Games Module: developing search-strategy skills for finding books

  13. The Tutorial Module

  14. The Tutorial Module

  15. The Games Module – Game 1

  16. Game 1

  17. Game 1

  18. Game 1

  19. The Games Module – Game 2

  20. Evaluation and Data Collection • Non-first-year evaluation (N = 20) • Conducted using a paper-based workbook alongside the online Epsilon system • Participants constantly switched between filling in the workbook and using the computer • At the end of the evaluation, information from the workbook had to be converted into electronic format for analysis • First-year evaluation (N = 99) • Online workbook integrating the evaluation instructions and guide, the electronic questionnaire and the Epsilon system • Participants focused on following instructions, carrying out tasks and entering information on the computer • Information was already in electronic format, ready for analysis

  21. System Evaluation Paul van Schaik School of Social Sciences and Law

  22. Evaluation Studies • Previous evaluation • Non-first-year students • Staff • Current evaluation • First-year students

  23. Evaluation with first-year students • Three-group, pre-test/post-test design to establish the effect of using the two EPSS components • Group 1 studied only the tutorial component • Group 2 played only the game component • Group 3 used both the tutorial and the game • Outcome measures • Knowledge of the library classification system • Confidence in knowledge • System acceptance

  24. Participants • First-year psychology students • Ninety-nine students: 32 in Group 1, 39 in Group 2 and 28 in Group 3 • Mean age = 21 (SD = 6.36) • Twenty-one male • Mean years of computer experience = 10.78 (SD = 3.67) • Mean computer use per week = 16.01 hrs (SD = 13.38) • Mean Web use per week = 8.28 hrs (SD = 8.15)

  25. Materials and equipment • Computer-based experiment/evaluation • Study tutorial and/or play games components (depending on group) • All data collection through experimental software - no separately printed workbook • Instructions for using the tutorial component and playing the games • Pre-test and post-test • Pages for recording task performance on the games • Questionnaire: • Demographic information • System acceptance

  26. Procedure • Participants worked through the computer-based experiment/evaluation • This included studying the tutorial and/or playing the games, and completing sections of the experimental software when appropriate

  27. Results - Knowledge

  28. Knowledge (2) • Mean knowledge scores (pre-test and post-test) were around 60% • Analysis of covariance (ANCOVA) was used to establish the effect of EPSS component (tutorial, game, tutorial and game) on post-test knowledge after statistically controlling for pre-test knowledge • The covariate, pre-test knowledge, was significantly related to post-test knowledge, F (1, 95) = 43.11, p < .001, r = .57 • There was no effect of EPSS component on post-test knowledge after controlling for pre-test knowledge, F (2, 95) = 1.72, p > .05
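
As an illustration only, an ANCOVA of this kind can be run with statsmodels by entering the pre-test score as a covariate alongside the group factor; the column names and the data frame below are synthetic placeholders, not the study's data.

    # Hypothetical sketch of an ANCOVA like the one reported on the slide.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.formula.api import ols

    rng = np.random.default_rng(0)
    n = 99
    df = pd.DataFrame({
        "group": rng.choice(["tutorial", "game", "both"], size=n),
        "pre_knowledge": rng.uniform(40, 80, size=n),
    })
    df["post_knowledge"] = df["pre_knowledge"] + rng.normal(0, 5, size=n)

    # Post-test knowledge ~ pre-test knowledge (covariate) + EPSS component (factor)
    model = ols("post_knowledge ~ pre_knowledge + C(group)", data=df).fit()
    print(sm.stats.anova_lm(model, typ=2))   # Type II F tests for covariate and factor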

  29. Confidence

  30. Confidence (2) • Mean confidence scores (pre-test and post-test) varied widely, from 71% to 88% • ANCOVA demonstrated that the covariate, pre-test confidence, was significantly related to post-test confidence, F (1, 95) = 69.36, p < .001, r = .63 • The effect of EPSS component on post-test confidence after controlling for pre-test confidence was significant, F (2, 95) = 7.28, p < .005, η² = .07 (medium effect size) • This result reflects the relatively high increase in confidence in Groups 1 and 3 compared to Group 2, t1 (95) = 3.74, r = .41, p < .001, t3 (65) = 2.39, r = .28, p < .05 (Bonferroni correction applied)
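
For reference, the eta-squared effect size reported with an analysis of variance is conventionally computed as η² = SS_effect / SS_total (the partial form divides by SS_effect + SS_error instead), and a Bonferroni correction divides the significance criterion by the number of comparisons (for example, .05 / 2 = .025 if two comparisons are made); the slide does not state which variant or how many comparisons were used.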

  31. Correlations

  32. Acceptance • Reliability analysis • Perceived usefulness • Tutorial; Cronbach’s alpha = .95 • Game; alpha = .95 • Intention to use • Tutorial; alpha = .67 • Game; alpha = .78 • Overall scores were calculated by averaging over items • 95% confidence intervals of mean (range from -3 through +3) • Perceived usefulness • Tutorial; CI.95 = [0.34; 1.57], mean = .96 • Game; CI.95 = [0.24; 1.48], mean = .86 • Intention to use • Tutorial; CI.95 = [0.44; 1.38], mean = .91 • Game; CI.95 = [0.58; 1.50], mean = 1.04
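
As a side note, a minimal sketch of how a Cronbach's alpha of this kind can be computed from raw item responses; the response array below is a synthetic placeholder, not the study's questionnaire data.

    # Hypothetical sketch: Cronbach's alpha for a set of questionnaire items.
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """items: 2-D array, one row per respondent, one column per scale item."""
        k = items.shape[1]                            # number of items
        item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
        total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
        return (k / (k - 1)) * (1 - item_vars / total_var)

    responses = np.random.default_rng(0).integers(1, 8, size=(99, 4)).astype(float)
    print(round(cronbach_alpha(responses), 2))        # placeholder 7-point items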

  33. Comments from staff • "They enjoyed it, they really liked it, and it really challenged them." • "The students felt this was great and they want to do it again, look at it again." Hence we provided links to the tutorial and gaming components. • Students have come back to say they have been able to find books in the library. • Too generic, particularly for first-year students. The exercise should be more specific to their library needs.

  34. Follow-up evaluation (after 3 months) • N = 43 • Measured knowledge and confidence again

  35. Follow-up evaluation (after 3 months) (2) • Pre-use versus follow-up • Confidence: t (140) = 3.22, r = .26, p < .01 • Knowledge: t (140) = 3.76, r = .30, p < .001 • Post-use versus follow-up • Confidence: t (140) = 2.36, r = .20, p < .05 • Knowledge: t (140) = 0.29, r = .02, p >> .05 • Correlations between correctness and confidence • Pre-use: r = .04, p >> .05 • Post-use: r = .19, p > .05 • Follow-up: r = .41, p < .05
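
The correlations above are reported as r values; for illustration only (the slide does not name the coefficient, and the score arrays below are synthetic placeholders), a Pearson correlation of this kind can be computed with scipy:

    # Hypothetical sketch of the correctness-confidence correlation at one measurement point.
    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(1)
    correctness = rng.uniform(0, 100, size=43)   # assumed percentage-correct scores (N = 43)
    confidence = rng.uniform(0, 100, size=43)    # assumed confidence ratings
    r, p = pearsonr(correctness, confidence)
    print(f"r = {r:.2f}, p = {p:.3f}")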

  36. Discussion • Results from non-first-year evaluation confirmed • However, results on confidence significant this time

  37. General discussion • EPSS well received by student-users and staff • Users found both EPSS components useful • Tutorial component enhanced users’ confidence in their own knowledge

  38. Future developments • EPSS for library classification • Include sound and speech audio • Dynamic adaptability of content; make EPSS more suited to • specific libraries • academic disciplines • Integration with existing library facilities and services and VLEs • EPSS for students in ‘remote’ locations (including ad-hoc spaces)

  39. Evaluation Work and Publications • van Schaik, P., Barker, P. & Famakinwa, O. (2005). Electronic performance support for learning to use academic libraries. Learning and Teaching Conference 2005. University of Teesside. • Barker, P., van Schaik, P. & Famakinwa, O. (2005). Potential roles for electronic performance support systems in libraries. Proceedings of Computer-Based Learning in Science (CBLIS) 2005. University of Zilina, Slovakia. • Also, conference paper presentation at CBLIS 2005 (PGB). • Barker, P., van Schaik, P. & Famakinwa, O. (2006). Designing Learning Objects to Enhance Students’ Performance. Presentation at Learning and Teaching Conference. University of Teesside. • van Schaik, P., Barker, P. & Famakinwa, O. (2006a). Making a case for using electronic performance support systems in academic libraries. Journal of Interactive Learning Research. In press. • van Schaik, P., Barker, P. & Famakinwa, O. (2006b). Potential roles for performance support tools within library systems. The Electronic Library. In press.

  40. Acknowledgements • The authors wish to express their gratitude to the University of Teesside and the Higher Education Academy for financial assistance to support this work • We are also indebted to the staff at the University of Teesside’s Learning and Information Services for their enthusiastic support. We would particularly like to mention Ian Butchart, Sue Myer and Barbara Hull • We are also grateful to Jan Anderson for allowing us to conduct the evaluation study with her students • We also thank Andrew Young for assistance with the coding of the computer games that were used in this study.

  41. External Links • Epsilon Modules • http://sssl-staffweb.tees.ac.uk/u0011128/epss/
