This document explores the significance of evaluation research in engineering education, emphasizing a systematic approach to assessing program effectiveness, identifying outcomes, and extracting actionable lessons. It then works through a case study of Problem-Based Learning (PBL) tutor training, detailing the program's aims, objectives, and impact. It shows how developing indicators of progress and monitoring the training's appropriateness, efficiency, and sustainability, combined with continuous data collection and dissemination of findings, can improve educational practice and support student learning.
Evaluation Research and Engineering Education
Lesley Jolly, for AaeE
ljolly@bigpond.net.au
ERM wiki at www.aaee-scholar.pbwiki.com
What is evaluation research?
• Periodic assessment of results
• Covers appropriateness, effectiveness, efficiency, impact, and sustainability
• Identifies intended and unintended effects
• Identifies what worked and what didn't
• Provides a level of judgement about the overall worth of the intervention
What has it got to teach us?
• A systematic approach to what we're doing already
• Extracting lessons learned
• A legacy of sustained evaluation frameworks for ongoing data collection
• Arguments against 'popularity contest' course and program evaluation
Step 1: describe the program
• Aims and objectives
  • Need to be clearly articulated
  • Does everyone involved share the same aims?
• Program logic diagram
  • Describes how we think the program produces results
  • May be called a logframe (see the sketch below)
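Where a visual logic diagram isn't to hand, the same idea can be captured as data. Below is a minimal sketch in Python; the stage names and entries are hypothetical illustrations of how tutor training might be assumed to produce results, not content from the slides.

```python
# A minimal sketch of a program logic ("logframe") as a plain data
# structure. All stage names and entries are hypothetical examples.
from dataclasses import dataclass, field

@dataclass
class LogicStage:
    name: str                                   # e.g. "Inputs", "Outcomes"
    entries: list[str] = field(default_factory=list)

# Hypothetical logic model: how PBL tutor training might produce results.
tutor_training_logic = [
    LogicStage("Inputs",     ["trainer time", "workshop materials"]),
    LogicStage("Activities", ["run tutor workshops", "observe tutorials"]),
    LogicStage("Outputs",    ["tutors trained", "feedback sessions held"]),
    LogicStage("Outcomes",   ["tutors scaffold rather than tell",
                              "improved student self-direction"]),
]

# Print the assumed causal chain, stage by stage.
for stage in tutor_training_logic:
    print(f"{stage.name}: {', '.join(stage.entries)}")
```

Writing the logic down in any explicit form makes it easier to check whether everyone involved really shares the same assumptions about how the program works.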
Monitoring facilitates data collection
• Can be process (output) monitoring or impact (short- to medium-term outcome) monitoring
• Need to develop indicators of progress
• Targets may be built into indicators or kept separate, e.g.
  • "In 2010, 50% of 2nd years will have used the new facility for more than 20 hrs"
  • OR: percentage of students using the facility (indicator); 50% in 2010 (target) (see the sketch below)
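As a concrete illustration of keeping the indicator separate from its target, here is a minimal Python sketch using the facility-usage example from the slide. The student IDs, hours, and variable names are invented for illustration only.

```python
# A minimal sketch: the indicator (percentage of students using the
# facility for more than 20 hrs) is computed separately from the
# target (50%). The monitoring data below is entirely made up.
usage_hours = {"s01": 25, "s02": 5, "s03": 32, "s04": 18, "s05": 40}

THRESHOLD_HRS = 20   # part of the indicator's definition
TARGET_PCT = 50.0    # the target, kept separate from the indicator

# Indicator: share of students exceeding the threshold, as a percentage.
indicator = 100 * sum(h > THRESHOLD_HRS for h in usage_hours.values()) / len(usage_hours)

status = "met" if indicator >= TARGET_PCT else "not met"
print(f"Indicator: {indicator:.0f}% of students exceeded {THRESHOLD_HRS} hrs "
      f"(target {TARGET_PCT:.0f}%): {status}")
```

Keeping the two apart means the target can be revised year to year without redefining what is being measured.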
PBL tutor training
• GOAL: provide timely, well-placed, supportive guidance to encourage tutors to scaffold and facilitate student learning
• OBJECTIVES: at the end of training the successful tutor will be able to:
  • articulate a good understanding of the objectives and methods of PBL
  • guide student learning by providing appropriate support and guidance rather than information
  • contribute to curriculum development within a staff team
Evaluation asks formative and summative questions
• Are the questions cohesive and logical?
• Do evaluation questions link to monitoring data?
• Have ethical issues been addressed?
• What mechanisms are in place to gather the learnings generated by evaluation?
• What needs to be retained from this evaluation process in future years?
PBL tutor training
• Appropriateness
  • Did the training model the target behaviour?
  • Has the purpose of the training been achieved?
  • Indicator 1: changes in tutor behaviour indicating deeper knowledge of, and commitment to, PBL (see the sketch below)
  • Indicator 2: changes in student behaviour
• Efficiency
  • Was the time invested by staff good value?
• Sustainability
  • What needs to be retained as core material from year to year to preserve the benefit of the training?
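One way Indicator 1 could be made monitorable is to tally observed tutor moves before and after training. The categories and counts in this Python sketch are invented for illustration; the slides do not prescribe any particular coding scheme.

```python
# A minimal sketch, with invented observation counts, of turning
# "changes in tutor behaviour" into monitorable data: tallies of
# directive vs. facilitative moves before and after training.
pre_training  = {"gives answers": 14, "asks guiding questions": 6}
post_training = {"gives answers": 5,  "asks guiding questions": 17}

def facilitative_share(tally: dict) -> float:
    """Fraction of observed tutor moves that were facilitative."""
    return tally["asks guiding questions"] / sum(tally.values())

change = facilitative_share(post_training) - facilitative_share(pre_training)
print(f"Facilitative moves rose by {change:.0%} of observed tutor turns")
```

A rising facilitative share would be evidence that tutors are scaffolding student learning rather than simply supplying information, which is exactly the target behaviour named in the training objectives.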
Making use of evaluation
• Owen, J. (2006). Program Evaluation. Crows Nest: Allen & Unwin.
Dissemination and reporting
• Findings may be communicated throughout the project to multiple audiences
• Evidence, conclusions, judgements, recommendations
• Different occasions call for different styles
  • Oral briefings, interactive workshops, posters, reports, summaries, papers, conference presentations
• Must be well timed
Developing capacity
• "Process use of evaluation"
• Taking part develops skills
• Taking part sensitises staff to issues
• Improved communication
• Ongoing 360-degree dissemination
• Development of local discourse