
PGCAPP U6 – Evaluation


Presentation Transcript


  1. PGCAPP U6 – Evaluation Geraldine Jones g.m.jones@bath.ac.uk

  2. Learning Outcomes
  • design an evaluation plan, including formulating an appropriate research question (LO1)
  • appreciate the pros and cons of a range of data collection methods (including survey, interview, Moodle logs) and methods of analysis (LO1)
  • identify and take account of the ethical issues that may arise from your proposed evaluation (LO1)
  • select relevant e-learning research literature to inform your practice in HE (LO2)
  • appreciate the affordances of a range of learning technologies, including a rationale for using a multimedia discourse facilitation tool, the Bath wiki and Moodle activities that support group work (LO3)

  3. Menu
  • Challenges of evaluating e-learning (5)
  • Task – your projects – finding a focus for evaluation (30)
  • Definitions and purposes of evaluation (10)
  • Task – your evaluation questions (10)
  • BREAK
  • Methods of data collection (10)
  • Data analysis (5)
  • Ethics – key concerns for U6 (5)
  • Task – ethics scenario analysis (10)
  • Spotlight on technologies for teaching (20)
  • Next steps & feedback (5)

  4. ‘Learning technologies are not transparent, their processes not obvious and they do not broadcast their utility' (Salmon, 2005) http://www.flickr.com/photos/86096256@N00/431703388

  5. “Without asking hard questions about learning, technology remains an unguided missile” (Ehrmann, 1996a) http://www.flickr.com/photos/24656754@N00/3862379136

  6. Messiness.. It is difficult to separate a specific e-learning intervention from the complex interplay of cultural and social influences on the learning experience. http://www.flickr.com/photos/80764244@N00/4113566073

  7. Task – 1 min plans
  1 – Describe any plans you have for your e-learning activity to the whole group.
  2 – In pairs or threes with similar designs/technologies or context, discuss the following questions:
  • What outcomes are you seeking from your learning activity?
  • What do you really want to know?
  • What aspects of the activity do you want to evaluate?

  8. Where is your focus?
  • pedagogical aims
  • learning and teaching processes
  • particular aspects of the learning environment
  • the technology itself
  • the support surrounding the use of the technology

  9. http://www.wordle.com

  10. A good evaluation will have explicit aims, address questions that are meaningful to these aims and to the intended audience, and use an appropriate mix of methods for gathering, analysing and presenting data.

  11. Purposes of evaluation
  • diagnostic – learning from the potential users; to inform plans to change practice
  • formative – learning from the process; to inform practice
  • summative – learning from the product; to inform others about practice

  12. Action Research

  13. Action Research
  Barrett and Whitehead (1985) ask six questions to help start an inquiry:
  • What is your concern?
  • Why are you concerned?
  • What do you think you could do about it?
  • What kind of evidence could you collect to help you make some judgement about what is happening?
  • How would you collect such evidence?
  • How would you check that your judgement about what has happened is reasonable, fair and accurate?

  14. Your question(s) “If you don't have a question you don't know what to do (to observe or measure), if you do then that tells you how to design the study” (Draper, 1996)

  15. What is the nature of student engagement?
  How does the effectiveness of the approach in learning gains compare with its efficiency in terms of teaching effort (a form of cost-effectiveness)?
  How confident were female students in discussing their own ideas online compared to their male counterparts?
  Does it work? Do students like it? Is it easy to use?

  16. Task – think of a question(s)… A few examples:
  • What do tutors/students think of the use of the wiki for a group work activity, and what are the perceived benefits and limitations?
  • How has the forum discussion activity helped or hindered students in achieving the learning outcomes (for the activity/unit)?
  • What impact has the role play activity had on student motivation?
  • How does an EVS-supported activity promote deep learning?
  • How does a ‘Moodle workshop’-mediated peer assessment activity add to or detract from the student learning experience?
  • How has formative feedback been improved through the use of a group project wiki?
  • How and when do students use a self-test quiz activity to support their learning on the unit?

  17. Methods of Data Collection
  • Surveys – post/pre-task survey (prior experience/motivation) *
  • Focus group/individual interviews *
  • VLE access logs *
  • Reflective diaries *
  • Observation and/or videotaping (e.g. Panopto)
  • A learning test (quiz)
  • Exam (assessment) performance
  Recommended resource: Evaluation Cookbook http://www.icbl.hw.ac.uk/ltdi/cookbook/contents.html
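  Of these methods, VLE access logs are the easiest to over-interpret, so it helps to reduce them to simple counts before drawing any conclusions. The sketch below is illustrative rather than part of the slides: it assumes a Moodle activity log exported as a CSV file named moodle_logs.csv with "Time" and "User full name" columns, which vary between Moodle versions, so check them against your own export.

```python
# Illustrative sketch (not from the slides): summarise a Moodle log export.
# The file name and column names ("Time", "User full name") are assumptions;
# check them against your own Moodle version's CSV export.
import pandas as pd

log = pd.read_csv("moodle_logs.csv", parse_dates=["Time"])

# Accesses per student: a rough indicator of who engaged with the activity.
per_student = log.groupby("User full name").size().sort_values(ascending=False)

# Accesses per week: shows whether use clusters around deadlines or exams.
per_week = log.set_index("Time").resample("W").size()

print(per_student.head(10))
print(per_week)
```

  Counts like these say nothing about the quality of engagement, so they are best triangulated with the surveys or interviews marked * above.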

  18. Example “recipe”
  Project aim: to develop online self-assessment materials for a Y1 UG Biology large-enrolment class.
  Evaluation purpose: to establish student use of and attitudes to online self-assessment materials.
  Methods: access logs, post-activity survey (all students), post-exam focus group (6 students, refreshments provided).

  19. Data analysis
  • Online surveys – http://www.survey.bris.ac.uk/
  • Word clouds – http://www.wordle.net/
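  If wordle.net is unavailable, or you want a repeatable analysis, the free-text comments from an online survey can be turned into a word cloud with a short script. The following is a minimal sketch using the open-source Python wordcloud package, assuming responses have been saved one per line in a hypothetical file survey_comments.txt.

```python
# Illustrative sketch: build a word cloud from free-text survey comments.
# Assumes responses were exported to survey_comments.txt (hypothetical name),
# one response per line; uses the open-source "wordcloud" package.
from wordcloud import WordCloud, STOPWORDS

with open("survey_comments.txt", encoding="utf-8") as f:
    text = f.read()

cloud = WordCloud(width=800, height=400,
                  stopwords=STOPWORDS,            # drop common English words
                  background_color="white").generate(text)
cloud.to_file("survey_wordcloud.png")             # image for the evaluation report
```

  As with access logs, a word cloud only surfaces frequent terms; the interpretation still comes from reading the responses themselves.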

  20. Ethics is easy?
  • Do no harm
  • Be open
  • Be honest
  • Be careful
  Shank (2002) http://www.flickr.com/photos/61137578@N00/474514209

  21. Task • Discuss the evaluation scenarios and identify any ethical issues that could arise.

  22. Key concerns for U6
  • Tutors’ power over students possibly influencing disclosure. What can be done to minimise this?
  • Informed consent. What would you include on a consent form?

  23. “satisfactorily completing an ethics form at the beginning of a study and/or obtaining ethics approval does not mean that ethical issues can be forgotten, rather ethical considerations should form an ongoing part of the research.” Miller & Bell (2002)

  24. Spotlight on technology
  • Mahara for your design briefs
  • Voicethread – multimedia discourse facilitation tool
  • Bath wiki
  • Moodle wikis and forums

  25. Key dates
  • Get feedback on your draft design brief – submit by 12th Dec.
  • Participant presentations, get peer feedback & support – 5-min presentations on 19th Jan (need to move to 26th Jan).
  • Review session – 16th Feb.
  • Final submission – 22nd June.

  26. References
  Cousin, G. (2009) Researching Learning in Higher Education. Oxon: Routledge.
  Miller, T. and Bell, L. (2002) Connecting to what? Issues of access, gatekeeping and “informed” consent. In M. Mauthner, M. Birch, J. Jessop and T. Miller (Eds.), Ethics in Qualitative Research. London: Sage.
  Oliver, M., Harvey, J., Conole, G. and Jones, A. (2007) Evaluation. In G. Conole and M. Oliver (Eds.), Contemporary Perspectives in E-learning Research. London: Routledge.
  Shank, G. D. (2002) Qualitative Research: A Personal Skills Approach. Columbus, OH: Merrill Prentice Hall.
