
How do we know it works?

Learn how to create an evaluation plan for academic technology projects, identify relevant indicators, and develop data collection methods.



Presentation Transcript


  1. How do we know it works? Evaluating Learning Technology Projects EDUCAUSE Learning Initiative Seminar Clare van den Blink cv36@cornell.edu

  2. Seminar Goals Seminar participants will be able to:
  • Select project goals that can be evaluated.
  • Create an evaluation plan for an academic technology project that is tied to the project's assumptions and strategies.
  • Identify relevant indicators to evaluate project goals.
  • Assess considerations for developing evaluation activities.
  • Identify data collection methods.

  3. Poll Please indicate the type of projects you’re interested in evaluating.

  4. Poll What are the key challenges in fully evaluating your projects?

  5. A Framework
  • Project Goals
  • Focus of Evaluation
  • Evaluation Design Overview
  • Indicators: measures of "success"
  • Data collection: methods, population, procedures
  • Timeline
  • Data Analysis
  • Reporting Findings
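The framework's parts can be captured in a simple data structure so that one plan template carries over from project to project. This is a minimal sketch, not part of the seminar materials; the class and field names are hypothetical, chosen to mirror the framework slide.

```python
from dataclasses import dataclass, field

@dataclass
class EvaluationPlan:
    # Field names mirror the framework slide; the schema itself is illustrative.
    project_goals: list        # all goals of the academic technology project
    focus: list                # the subset of goals selected for evaluation
    indicators: list           # measures of "success" for the focus goals
    methods: list              # e.g. survey, interview, observation
    population: str            # from whom data will be collected
    procedures: str            # how collection will actually be run
    timeline: dict             # phase -> time frame
    analysis: str = ""         # planned data analysis
    reporting: str = ""        # how findings will be reported

# Hypothetical example echoing the polling project discussed later:
plan = EvaluationPlan(
    project_goals=["Use polling questions to encourage critical thinking"],
    focus=["Use polling questions to encourage critical thinking"],
    indicators=["Student perception of engagement"],
    methods=["survey", "interview"],
    population="Students enrolled in the pilot course",
    procedures="End-of-semester survey; volunteer interviews",
    timeline={"planning": "Aug", "data collection": "Dec", "reporting": "Jan"},
)
print(plan.methods)
```

Keeping the plan as structured data makes it easy to check, before the project starts, that every focus goal has at least one indicator and one method attached.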

  6. Introduction How this process was developed… "To evaluate the effectiveness of the technology enhancement and its impact on student learning" …within the constraints of staff, time, and budget limitations.

  7.–10. Importance-Complexity [Chart, built up across four slides: example projects plotted by Staff Effort (low to high) against Complexity of Evaluation (low to high), with matching evaluation methods added at each step]
  • Small project with a technology intervention (low effort, low complexity): survey, interview
  • LMS pilot project informing service decisions (moderate effort and complexity): surveys, interviews, usability testing, technology review
  • Quasi-experimental research on a technology intervention (high effort, high complexity): control group, surveys, interviews, observations

  11. To inform the evaluation What led to the development of the project? Was it based on prior research? What will a literature review contribute to the evaluation? What assumptions underlie the strategies & technologies selected?

  12.–14. Selecting Goals…. Since not all project goals can be evaluated within the timeframe of the project, or may be difficult to measure…
  • How can you SELECT goals that can be evaluated within the scope of the project?
  • How would you PRIORITIZE the goals that are the most critical to evaluate?

  15. Sample Goals: Review examples of goals and identify goals that could be evaluated within the project constraints. EXAMPLE #1 Instructional Goals: 1.) Encourage active participation and critical thinking skills by using video clips of mainstream movies to initiate class discussions. 2.) Encourage student involvement and active learning by creating a mechanism for students to record interviews in the field (part of a class assignment). 3.) Create a repository of student-collected audio interviews for ongoing use in the curriculum. Audio clips will be used to illustrate the diversity of public education experiences. Other Project Goals: 4.) Develop workflow and documentation for student recording of audio interviews and video clip processing. 5.) Choose, create, and provide an archiving mechanism for cataloguing clips. Activity

  16. Sample Goals: Review examples of goals and identify goals that can be evaluated within the project constraints. EXAMPLE #2 Instructional Goals: A.) Students will be able to practice application of fluid therapy, under various conditions, employing a unique computer-based simulation. B.) Students will be able to interpret symptoms presented in a sick dog, select an appropriate treatment, administer fluids, monitor the patient's reaction, and modify the treatment plan accordingly. C.) Case simulation will enable students to experience clinical variability in a manner similar to hands-on practice. Other Project Goals: D.) Simplify the creation of a set of teaching models, or prototypes, that are the basis of the cases. E.) Create a method for generating unique computer-based cases that build from the prototypes. F.) Provide a method for saving case data for comparison. Activity

  17. Questions?

  18. Developing the evaluation plan

  19.–23. Evaluation Process (built up across five slides)
  • Identify project goals.
  • Focus of the evaluation: what goals will be evaluated?
  • Indicators: what type of data to collect?
  • Methods: how will data be collected? (survey, interviews, focus groups, data analysis, observation, case study)
  • Population: from whom will data be collected?

  24.–26. Evaluation Process Process: Select Goals…Focus
  • What goals will be the FOCUS of the evaluation? What is feasible to evaluate in the project's timeframe?
  • Project goals: use a Personal Response System (polling) with questions to encourage critical thinking & student engagement; use PowerPoint presentations with an interactive lecture; implement a Tablet PC for annotating presentations for visual and interactive lectures.
  • Focus of the evaluation: use the Personal Response System (polling) with questions to encourage critical thinking & active participation (student engagement).
  • Identify what INDICATORS can be used to collect data, both indirect and direct measures.

  27. Evaluation Process Evaluation Focus: Example 1 Formative: Since the project is assisting with the development of online modules, a formative evaluation of the modules will be conducted to look at the interface design, navigation, usability, organization and presentation of content, and the usefulness for student learning. The key focus will be on the "functionality" of the module.
  • Interface / navigation / design (not important in phase I)
  • Technology performance: test across browsers, OS, distance, etc.
  • Organization/presentation of content
  • Use of images and illustrations
  • Learning objectives**
  Summative: (part of the overall program evaluation) Since the project is assisting with the development of web-based modules, the summative evaluation will examine the effect on student perception of learning from the implementation of instructional technology in the course. Measures of success may include the student perception of greater ease in learning difficult concepts, and positive feedback about new modules.

  28. Evaluation Process Select Methods Develop data collection METHODS for the indicators, such as surveys, interviews, observations, etc. What method(s) would you select to evaluate the focus area, the "functionality" of the online module? • Surveys • Interviews • Observation • Log analysis • All of the above • Other – post in chat Activity

  29. Evaluation Process Select Methods Develop data collection METHODS for the indicators, such as surveys, interviews, observations, etc. What method(s) would you select to evaluate how well the online module met the "learning objectives"? • Surveys • Interviews • Observation • Log analysis • All of the above • Other – post in chat Activity

  30. Questions?

  31. Evaluation Process Other considerations • Identify the population from which the data will be collected. Can you contact this group? • Identify other data sources, such as logs, documents, etc. • Are there considerations for human subjects research & informed consent on your campus? For example, when using grades as data, what permissions are necessary?

  32. Evaluation Process Timelines: How much time do you have? How much do you need? [Timeline: fall (Aug–Dec) project development and implementation, in parallel with evaluation planning and implementation; Dec–Jan project transition & closeout, followed by evaluation data analysis & reports]

  33.–34. Evaluation Process Evaluation Timeline Develop an initial timeline & staffing effort: where reality meets ideal evaluation methods….

  35. Questions?

  36. Implementing the Plan & Reporting Results

  37. Evaluation Process Implementing Methods Surveys: • Identify or develop questions. • Do survey questions map to indicators? • Survey distribution & associated permissions. Interviews: • Develop interview questions & protocols. • Schedule and conduct interviews. Resources about quantitative and qualitative methods can guide development and implementation of methods and data analysis.

  38. Evaluation Process Analysis & Reporting DATA ANALYSIS What type of analysis will be completed? Quantitative: survey analysis. Qualitative: interview analysis based on interview protocols. Example: "Overall, I am satisfied with the use of instructional technology in this course." Mean = 1.2 (1 Strongly Agree: 82%, 2 Agree: 18%, 3 Neutral: 0, 4 Disagree: 0, 5 Strongly Disagree: 0) "I interviewed Prof. X about her experience with the email simulation….."
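The quantitative side of this slide amounts to summarizing Likert-scale responses. A minimal sketch, using made-up response data chosen to reproduce the 82%/18% split and mean of 1.2 from the example above:

```python
from collections import Counter

# Hypothetical data: 50 responses on a 5-point scale
# (1 = Strongly Agree ... 5 = Strongly Disagree), 41 "1"s and 9 "2"s.
responses = [1] * 41 + [2] * 9

mean = sum(responses) / len(responses)
counts = Counter(responses)
labels = {1: "Strongly Agree", 2: "Agree", 3: "Neutral",
          4: "Disagree", 5: "Strongly Disagree"}

for rating, label in labels.items():
    share = 100 * counts.get(rating, 0) / len(responses)
    print(f"{rating} {label}: {share:.0f}%")
print(f"Mean = {mean:.1f}")
```

The same summary can be produced directly by most survey tools; the point is only that each reported number (distribution and mean) traces back to a simple, checkable computation.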

  39. Evaluation Process Analysis & Reporting

  40. Questions?

  41. Constraints

  42. Evaluation Considerations & Tools Staffing How can this planning all be completed within limited staff hours, while maintaining the INTEGRITY of the evaluation process?

  43. Poll Do you think it is feasible to re-train staff for evaluation?

  44. Evaluation Considerations & Tools Staffing How can this planning all be completed within limited staff hours, while maintaining the INTEGRITY of the evaluation process? • How can staff be trained in this process without having a deep evaluation background? • What staff skills might be adapted? • Other campus resources?

  45. Poll What type of existing skills could be adapted for evaluation?

  46. Evaluation Considerations & Tools Supporting Tools • Have an overall evaluation plan template that can be adapted to other projects. • Informed consent templates. • Use common interview/observation protocols. • Develop question banks for survey questions, e.g. for: • Use of video in course presentations and lectures • Use of online instructional tutorials • Use of presentations in lecture
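A question bank can be as simple as reusable questions keyed by the technology categories the slide names, assembled into a survey per project. This sketch is illustrative only; the category keys and question wording are hypothetical, not taken from the seminar's actual banks.

```python
# Hypothetical question bank keyed by the categories named on the slide.
QUESTION_BANK = {
    "video_in_lectures": [
        "The video clips helped me understand key concepts.",
        "Class discussions of the video clips were valuable.",
    ],
    "online_tutorials": [
        "The online tutorial was easy to navigate.",
        "The tutorial helped me learn difficult material.",
    ],
    "presentations_in_lecture": [
        "The annotated presentations made lectures easier to follow.",
    ],
}

def build_survey(categories):
    """Assemble a survey by reusing questions from the bank."""
    return [q for c in categories for q in QUESTION_BANK.get(c, [])]

survey = build_survey(["video_in_lectures", "online_tutorials"])
print(len(survey))  # 4 questions drawn from two categories
```

Reusing vetted questions this way is one answer to the staffing question above: staff without a deep evaluation background can assemble a sound survey from pre-written, already-reviewed items.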

  47.–48. Evaluation Considerations & Tools question banks

  49. Summary

  50. Consider… • How can this METHODOLOGY be applied to your projects and institution? • How VIABLE is this as an evaluation methodology for your projects? When does a more ROBUST process need to be put in place?
