Planning and Sustaining Evaluation of Instructional Technology Support Programs

Presentation Transcript


  1. Planning and Sustaining Evaluation of Instructional Technology Support Programs Yvonne Belanger, Duke University Joan Falkenberg Getman, Cornell University Lynne O’Brien, Duke University http://dls.cornell.edu http://cit.duke.edu

  2. Our assumption Schools want to know if IT services, organizations & projects are effective, but have limited resources for evaluation. • How can we build a culture of evaluation, so that many people contribute to evaluation? • How can we provide a context for evaluation strategies and results? • How can we conduct evaluation that helps with decision making?

  3. Overview • Key issues in evaluation planning • Early planning for evaluation at Cornell • General approaches to evaluation at Duke • Case studies: Duke iPod project, Duke Faculty Fellows • Resources and templates

  4. Assessment v. Evaluation • Assessment is an ongoing process aimed at understanding and improving student learning. • Evaluation is a judgment or determination of the quality of a performance, product or use of a process against a standard. Did it work in terms of the needs being addressed or the system goal? Article: Differentiating Assessment from Evaluation as Continuous Improvement Tools by Peter Parker, Paul D. Fleming, Steve Beyerlein, Dan Apple, Karl Krumsieg

  5. Why Evaluate? “Research is aimed at truth. Evaluation is aimed at action.” Michael Patton

  6. Cornell University • Private, 4 year, Research 1 • New York State land-grant institution • Partner of the State University of New York • 11 schools: 7 undergraduate and 4 grad/professional • 19,600 FTE students, 1540+ faculty http://www.cornell.edu

  7. Cornell Computing Environment • IT centrally and locally supported • Undergraduate education is campus based • CIT strategic plan encourages selective innovation • President’s “Call to Action” • Provost’s Distributed Learning Initiative

  8. Provost’s Distributed Learning Initiative • Core technologies development • Faculty development and training • Faculty Innovation in Teaching (new) • Lynx Student Assistant Program (new)

  9. http://innovation.cornell.edu

  10. http://lynx.cornell.edu

  11. Key Issues in Planning an Evaluation • Who are the stakeholders and what do they want to know? • What is success? • How will you measure success? • Do you have an evaluation team or partner? • What is the most effective way to report evaluation findings? • Are you allowing for discovery as well as confirmation? • How is the data going to guide decision-making and improvements?

  12. Different Stakeholders, Different Interests • Provost • President • Vice President of Information Technology • Faculty • Students • Deans • Dean of Faculty • Dean of Students • Library • Center for Learning and Teaching • Cornell Adult University • Faculty Advisory Board on Information Technology • IT staff and Helpdesk staff • Executive Budget and Finance Committee

  13. Defining Success • Identify different dimensions or domains for evaluation. • Identify indicators of success in those domains. • Data collection method and source of data will vary with indicators of success.

  14. Measuring Success Project: Student response systems (polling) in large enrollment class. Goals: Improve learning, implement inexpensive, low-maintenance technology with specific functionality, increase student engagement, short learning curve for faculty, adoption of polling by other large enrollment classes. Domains: • Instructional: strategies, learning outcomes • Technology: functionality, reliability… • Student experience: attitude, use of technology… • Faculty experience: attitude, use of technology… • Programmatic impact • Cost
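
One way to make a plan like this concrete is as a simple domain-to-indicator map. The sketch below is plain Python; the domain names come from the slide, but every indicator and data source is invented for illustration only.

    # Hypothetical planning sketch for the polling project described above.
    # Domain names come from the slide; indicators and data sources are illustrative.
    evaluation_plan = {
        "Instructional": {
            "indicators": ["exam performance vs. prior terms", "use of polling-driven discussion"],
            "sources": ["gradebook extracts", "classroom observation"],
        },
        "Technology": {
            "indicators": ["response capture rate", "failures per class session"],
            "sources": ["polling system logs", "helpdesk tickets"],
        },
        "Student experience": {
            "indicators": ["attitude toward polling", "self-reported engagement"],
            "sources": ["end-of-term survey", "focus groups"],
        },
        "Faculty experience": {
            "indicators": ["prep time per class", "intent to reuse"],
            "sources": ["faculty interviews"],
        },
        "Programmatic impact": {
            "indicators": ["adoption by other large-enrollment courses"],
            "sources": ["course technology inventory"],
        },
        "Cost": {
            "indicators": ["cost per student", "support hours consumed"],
            "sources": ["budget records", "staff time tracking"],
        },
    }

    # Print one planning row per domain.
    for domain, plan in evaluation_plan.items():
        print(f"{domain}: indicators={plan['indicators']}; sources={plan['sources']}")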

  15. Balanced View of Success Domains: • Instructional: strategies, learning outcomes • Technology: functionality, reliability… • Student experience: attitude, use of technology… • Faculty experience: attitude, use of technology… • Programmatic impact • Cost Outcomes: • Students: liked it in several ways and self-reported improved learning • Faculty: too much time spent in prep, technology not meeting needs, but still like the idea • IT staff: user support for faculty and facilities is taxing limited staff time • Finance office: clicker replacement and a new projection system are beyond budget Was the project a success?

  16. Evaluation Models and Standards • Scientific inquiry and experimental models Emphasizes values established by research community • Management-oriented models Emphasizes decision-making: Stufflebeam’s CIPP model • Qualitative and Anthropological models Emphasizes discovery of values based on description • Participation-oriented models Emphasizes values being "socially constructed" by the community

  17. Stufflebeam’s CIPP Model Context, Input, Process and Product evaluation • Focus: decision-making • Purpose: facilitate rational and continuing decision-making • Evaluation activity: identify potential alternatives, set up quality control systems
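
As a rough, assumed illustration (not drawn from Stufflebeam's own materials), the four CIPP components can be carried as a single record per project so that no view gets dropped. The class name and example values below are hypothetical.

    from dataclasses import dataclass, field

    @dataclass
    class CIPPEvaluation:
        """Minimal, hypothetical record keeping all four CIPP views together."""
        project: str
        context: list = field(default_factory=list)   # environment & needs
        inputs: list = field(default_factory=list)    # strategies & resources
        process: list = field(default_factory=list)   # implementation observations
        product: list = field(default_factory=list)   # outcomes: quality and significance

    # Invented example values for a generic course-technology pilot.
    pilot = CIPPEvaluation(
        project="Course polling pilot",
        context=["large-enrollment lectures with little interaction"],
        inputs=["loaner clickers", "faculty training workshop"],
        process=["weekly instructor check-ins", "helpdesk ticket review"],
        product=["student engagement survey results", "instructor intent to continue"],
    )
    print(pilot.project, "-", len(pilot.product), "product-level findings recorded")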

  18. Action Research Action research is deliberate, solution-oriented investigation that is group or personally owned and conducted. It is characterized by spiraling cycles of problem identification, systematic data collection, reflection, analysis, data-driven action taken, and, finally, problem redefinition. The linking of the terms "action" and "research" highlights the essential features of this method: trying out ideas in practice as a means of increasing knowledge about and/or improving curriculum, teaching, and learning (Kemmis & McTaggart, 1982).

  19. Permissions and Partners • Check with your institution’s research office for policies on human subject research. • Be creative: put together an evaluation team or partnership, and involve stakeholders for credibility.

  20. Reporting Evaluation Results • Format your information and customize your report to stakeholders so that it meets their interests and style: • Narrative • Video interviews • PowerPoint presentations • Excel spreadsheets • Images and graphical representations of numerical data • Include unexpected outcomes • Use benchmark studies for additional context
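
For the graphical-representation bullet, a short script is often all that is needed to turn survey tallies into a report-ready image. The sketch below assumes matplotlib is available; the survey item and the response counts are made up for illustration.

    import matplotlib.pyplot as plt

    # Invented tallies for one survey item:
    # "The polling system helped me stay engaged in lecture."
    labels = ["Strongly disagree", "Disagree", "Neutral", "Agree", "Strongly agree"]
    counts = [4, 11, 27, 63, 35]

    fig, ax = plt.subplots(figsize=(7, 3))
    ax.barh(labels, counts)                      # horizontal bars, one per response option
    ax.set_xlabel("Number of responses")
    ax.set_title("Student engagement item (example data)")
    fig.tight_layout()
    fig.savefig("engagement_item.png")           # drop the image into a stakeholder report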

  21. Focus on the Intent of Evaluation • Evaluation uses a combination of data to present a comprehensive picture • Return to original purpose of the evaluation and the types of decisions the data will inform. • It is possible for a project or program to have some components that succeeded and others that did not

  22. Duke University • Private, 4 year, Research I • 9 schools: undergrad and professional • 12,000 FTE students, 2,350 faculty

  23. Duke Center for Instructional Technology • Established 1999 in response to a general needs assessment on instructional technology at Duke • Goals: increase faculty and student use of technology, leverage resources, coordinate planning

  24. Duke CIT Context • IT is both central & school-based • Growing interest in distance ed in professional schools • Undergrad ed = campus-based classroom teaching • Strategic plan encourages IT experimentation

  25. Experiments with laptops, Blackboard, PDAs, iPods and other technologies

  26. The big questions • Is the CIT doing a good job? • Do students learn more when they use iPods? • What is the best way to help faculty make good use of technology? • Is Blackboard a success?

  27. Answerable questions • Is CIT making positive changes in the areas identified by the original needs assessment? • Do iPods improve course logistics and increase student access to a rich set of course materials? • Are faculty satisfied with the IT development programs they use? • How widely used is Blackboard, and what new kinds of teaching does it enable?
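
Reframing a big question as an answerable one usually means reducing it to something countable. As a minimal sketch, the snippet below turns "How widely used is Blackboard?" into two adoption rates; the course records are invented, and real figures would come from the LMS and the registrar.

    # Invented course records for illustration only.
    courses = [
        {"id": "BIO101", "enrollment": 320, "has_blackboard_site": True},
        {"id": "HIS210", "enrollment": 45,  "has_blackboard_site": False},
        {"id": "CHE220", "enrollment": 180, "has_blackboard_site": True},
        {"id": "ENG115", "enrollment": 25,  "has_blackboard_site": True},
    ]

    with_site = [c for c in courses if c["has_blackboard_site"]]
    course_rate = len(with_site) / len(courses)
    enrollment_rate = sum(c["enrollment"] for c in with_site) / sum(c["enrollment"] for c in courses)

    print(f"Courses with a Blackboard site: {course_rate:.0%}")
    print(f"Share of enrollments in Blackboard-using courses: {enrollment_rate:.0%}")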

  28. Tools for Structuring Evaluation • CIPP: Context (environment & needs), Input (strategies & resources), Process (monitoring implementation), Product (outcomes, both quality and significance) • Logic Modeling
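
A logic model can start as nothing more than a labeled list of stages. The sketch below is a hypothetical, generic example, not Duke CIT's actual logic model; every entry is a placeholder.

    # Hypothetical, generic logic model; these entries are placeholders.
    logic_model = {
        "inputs":     ["staff time", "grant funding", "loaner devices"],
        "activities": ["faculty workshops", "one-on-one consulting", "pilot projects"],
        "outputs":    ["workshops delivered", "courses using the technology"],
        "outcomes":   ["changed teaching practice", "improved course logistics"],
        "impact":     ["broader adoption of effective technology use"],
    }

    for stage, items in logic_model.items():
        print(f"{stage:>10}: " + "; ".join(items))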

  29. CIPP View of Institutionalized Evaluation (Stufflebeam, OPEN, 2003)

  30. CIPP approach recommends… • Multiple observers and informants • Mining existing information • Multiple procedures for gathering data; cross-check qualitative and quantitative findings • Independent review by stakeholders and outside groups • Feedback from stakeholders • Being appropriately circumspect in generating and reporting conclusions

  31. Faculty Fellows Program Goals: • Faculty development • Department development Program components: • Intensive orientation • Occasional meetings • One-on-one consulting • Showcase presentation

  32. Evaluating the Fellows Program • Stakeholder and staff input to clarify program goals • Developing consistent reporting tools • Distributing effort • Stakeholder review of outcomes • Participant responsibility for disseminating results (Report: Evaluation of Instructional Technology Fellows Program)

  33. Re-envisioning the Fellows • Full week of orientation → 1-2 days + 4 additional short meetings • Single project focus → multiple small-scale activities • Customized individual project → theme-based offering

  34. Duke iPod First-Year Experiment: Project Goals • Technology innovation • Student life, campus community • Academic impact

  35. Distributed 1,599 20 GB iPod devices to first-year students on Aug. 19, 2004

  36. Evaluation Challenges • Baseline information unavailable • Uneven implementation of instructors’ course evaluation plans • How best to capture academic projects outside of CIT purview • Quick start favored experimentation: emergent outcomes rather than predefined goals • Difficulty of demonstrating a correlation between iPod use and improved course outcomes

  37. Focusing the evaluation of academic iPod use • Feasibility of using iPod to support teaching and learning • Improving logistics of course delivery • Enhancing student learning and interest

  38. Sharing preliminary information • Crucial to have an early understanding of project lessons • Matrix of evaluation strategies • Grouping uses into similar cases • Examples: • Summary of iPod projects and their evaluation strategies • Early feedback on uses and lessons learned (available at http://cit.duke.edu/evaluation)
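
The "matrix of evaluation strategies" and "grouping uses into similar cases" steps can be prototyped in a few lines of plain Python. In the sketch below, the course names, use categories, and evaluation strategies are all invented placeholders, not the actual Duke iPod projects.

    from collections import defaultdict

    # Invented course-level iPod uses, each tagged with an evaluation strategy.
    projects = [
        ("Spanish 101",  "audio recording",       "instructor interview"),
        ("Music 120",    "content dissemination", "student survey"),
        ("Chemistry 31", "field recording",       "instructor interview"),
        ("German 102",   "audio recording",       "student survey"),
    ]

    # Group similar uses, then tally which strategies were applied to each group.
    matrix = defaultdict(lambda: defaultdict(int))
    for course, use, strategy in projects:
        matrix[use][strategy] += 1

    for use, strategies in matrix.items():
        print(use, dict(strategies))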

  39. Other Resources & Templates • http://cit.duke.edu/evaluation • Annotated bibliography by Cornell and Duke • Sample CIT reports • CIT Logic Model example and template • http://www.innovation.cornell.edu • How can we all share more information about our activities and learn more from one another’s successes and failures?

  40. Summary • Understand what success is for your efforts • Reframe questions to be answerable • Favor focused rather than comprehensive evaluation • Build a culture of evaluation through a distributed team approach • Bring context and input into evaluation • Take a formative view

  41. Thank You! Yvonne Belanger, Program Evaluator, Duke Center for Instructional Technology, yvonne.belanger@duke.edu • Joan Getman, Assistant Director, Distributed Learning Services, Cornell Information Technologies, jmf4@cornell.edu • Lynne O’Brien, Director, Duke Center for Instructional Technology, lynne.obrien@duke.edu
