
NECC 2007 - SAP126: Program Evaluation Tools and Strategies for Instructional Technology


Presentation Transcript


1. NECC 2007 - SAP126: Program Evaluation Tools and Strategies for Instructional Technology

2. jsun@sun-associates.com
• 978-251-1600 ext. 204
• www.sun-associates.com/necc2007
• This presentation is linked to that site, along with other resources on program evaluation
• In particular, download a copy of the district evaluation workbook available on that page

3. Where Do We Stand?
• Who’s working on an actual project?
  • Current?
  • Anticipated?
• Your expectations for today

4. Workshop Goals
• To review the key elements of effective program evaluation as applied to instructional technology evaluations
• To consider evaluation in the context of some example projects

5. Why Evaluate?
• To fulfill program requirements
  • Evaluation is part of program/project accountability
  • Most other state and federal proposals require an evaluation component
  • Not simply a statement that “we will evaluate,” but actual information on who will evaluate, the evaluation questions, and the methodologies
• Project sustainability
• Generation of new and improved project ideas
• Others?

6. By Definition, Evaluation…
• Is both formative and summative
• Helps clarify project goals, processes, and products
• Should be tied to indicators of success written for your project’s goals
• Is not a “test” or simply a checklist of completed activities
  • Qualitatively, are you achieving your goals?
  • What adjustments can be made to your project to realize greater success?

7. A Three-Phase Evaluation Process (pg 5 in workbook)
• Evaluation Questions
  • Tied to original project goals
• Indicator Rubrics
  • Allow for authentic, qualitative, and holistic evaluation
  • One possible way to represent a rubric is sketched after this slide
• Data Collection
  • Tied to indicators in the rubrics
• Scoring and Reporting
  • Role of the evaluation committee
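As a purely illustrative aid (not part of the original presentation), the following minimal Python sketch shows one way an indicator rubric could be represented, with each scoring level mapped to a descriptive statement. The indicator text and level descriptors are hypothetical; the workbook's actual rubrics are narrative documents:

    # Hypothetical sketch of an indicator rubric: it makes the structure
    # (indicator -> level -> descriptor) concrete.
    rubric = {
        "Teachers integrate technology into daily instruction": {
            4: "Use is routine, student-centered, and tied to curriculum goals.",
            3: "Technology is used regularly, though often teacher-directed.",
            2: "Technology is used occasionally, mostly for drill or presentation.",
            1: "Technology is rarely used in instruction.",
        },
    }

    def describe(indicator, level):
        """Return the descriptor for a given indicator at a scored level."""
        return rubric[indicator][level]

    # Example: the evaluation committee scores this indicator a 3.
    print(describe("Teachers integrate technology into daily instruction", 3))

Because each level carries a full description rather than a bare number, scoring a rubric like this stays qualitative and holistic, which is the point the slide makes.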

8. Who Evaluates?
• Committee of stakeholders (pg 10)
• Outside facilitator?
• Task checklist (pg 6)
• Other issues…
  • Perspective
  • Time-intensive

9. Project Sample

10. An Iterative Process
• Evaluation breaks your vision down into increasingly observable and measurable pieces.

11. Goals Lead to Questions
• What do you want to see happen?
  • These are your goals
• Rephrase goals into questions
• Achieving these goals requires a process that can be measured through a formative evaluation

12. …And Then to Indicators
• What is it that you want to measure?
• What are the conditions of success, and to what degree are those conditions being met?
• By what criteria should performance be judged?
• Where should we look, and what should we look for, to judge performance success?
• What does the range in the quality of performance look like?
• How should different levels of quality be described and distinguished from each other?

13. Indicators should reflect your project’s unique goals and aspirations
• Rooted in the proposed work
• Indicators must reflect your own environment…what constitutes success for you might not for someone else
• Indicators need to be highly descriptive and can include both qualitative and quantitative measures
• You collect data on your indicators (see the sketch after this slide)
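To make the qualitative-plus-quantitative point concrete, here is a small hypothetical Python sketch (not from the presentation) of an indicator record that carries both kinds of evidence. Every name and value below is invented for illustration:

    from dataclasses import dataclass, field

    @dataclass
    class Indicator:
        """One indicator with both kinds of evidence attached."""
        statement: str  # highly descriptive statement of success
        qualitative_evidence: list = field(default_factory=list)   # observations, stories
        quantitative_measures: list = field(default_factory=list)  # counts, survey scores

    ind = Indicator("Students use technology to produce original work across subjects")
    ind.qualitative_evidence.append("Observed a 4th-grade podcast project on local history")
    ind.quantitative_measures.append({"survey_item": "weekly_student_use", "mean": 3.2})

    print(ind.statement)
    print(len(ind.qualitative_evidence), "qualitative item(s);",
          len(ind.quantitative_measures), "quantitative item(s)")

The design choice worth noticing is that both evidence lists hang off the same indicator: you collect data on your indicators, not the other way around.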

14. Try it on a Sample
• Using the Evaluation Logic Map, map your:
  • Project purpose/vision
  • Goals
  • Objectives
  • Actions
• We’ll take 15 minutes for this…and then come back for indicators

15. [Image-only slide; no transcript text]

16. [Image-only slide; no transcript text]

17. Next…Indicators
• Pick one of your intermediate outcomes
• Write a brief statement of what it would LOOK LIKE to achieve ultimate success in this indicator
• What would change (go down) if success were less than ultimate?

18. To Summarize...
• Start with your proposal or technology plan
• Logic map the connections between actions, objectives, and goals
• From your goals/objectives, develop evaluation questions
• Questions lead to indicators
• Indicators are organized into rubrics
• Data collection flows from that rubric (the full chain is sketched after this slide)
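The chain this summary describes (goals to questions to indicators to data collection) can be pictured as nested data. The Python sketch below is an invented example rather than the presenter's material; every goal, question, and indicator string is hypothetical:

    # The evaluation logic chain expressed as nested data, so the
    # dependencies are explicit: goal -> question -> indicator -> data.
    evaluation_plan = {
        "goal": "Improve technology integration in core subjects",
        "questions": [
            {
                "question": "To what extent do teachers integrate technology into instruction?",
                "indicators": [
                    {
                        "indicator": "Frequency and quality of technology-supported lessons",
                        "data_sources": ["classroom observations", "teacher survey", "lesson plans"],
                    }
                ],
            }
        ],
    }

    # Walk the chain top-down, exactly as the slide orders the steps.
    print("Goal:", evaluation_plan["goal"])
    for q in evaluation_plan["questions"]:
        print("  Question:", q["question"])
        for ind in q["indicators"]:
            print("    Indicator:", ind["indicator"])
            print("    Collect:", ", ".join(ind["data_sources"]))

Reading the structure top-down mirrors planning; reading it bottom-up shows why every data source should trace back to a goal.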

19. Evidence/Data Collection
• Classroom observation, interviews, and work-product review
  • What are teachers doing on a day-to-day basis to address student needs?
• Focus groups and surveys
  • Measuring teacher satisfaction
• Triangulation with data from administrators and staff
  • Do other groups confirm that teachers are being served?

20. Data Collection Basics
• Review existing data
  • Current technology plan
  • Curriculum
  • District/school improvement plans
• Sample questions on the webpage for this presentation
• www.sun-associates.com/eval/sample

21. Surveys
• Creating good surveys
  • Length
  • Differentiation (teachers, staff, parents, community, etc.)
  • Quantitative data
  • Attitudinal data
  • Timing/response rates (getting returns!); a response-rate sketch follows this slide
• www.sun-associates.com/eval/samples/samplesurv.html
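Since the slide stresses differentiating audiences and getting returns, here is a small Python sketch (all counts invented, not from the presentation) that computes response rates per stakeholder group and flags the groups that need follow-up:

    # Surveys distributed and returned, per stakeholder group (made-up counts).
    distributed = {"teachers": 120, "staff": 40, "parents": 600, "community": 200}
    returned = {"teachers": 95, "staff": 31, "parents": 180, "community": 42}

    for group, sent in distributed.items():
        rate = returned[group] / sent
        flag = "  <-- follow up" if rate < 0.5 else ""
        print(f"{group:10s} {returned[group]:4d}/{sent:<4d} = {rate:6.1%}{flag}")

A low return from parents or community members, for example, would argue for a second distribution window or a paper alternative before drawing conclusions from the survey data.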

22. Online Survey Tools
• VIVED
• Profiler
• LoTi
• Zoomerang
• SurveyMonkey.com

23. Survey Issues
• Online surveys produce high response rates
• Easy to report and analyze data
• Potential for abuse
• Depends on access to connectivity

24. Focus Groups/Interviews
• Teachers
• Parents
• Students
• Administrators
• Other stakeholders

25. Classroom Observations
• Using an observation template
• Using outside observers

26. Other Data Elements?
• Artifact analysis
  • A rubric for analyzing teacher and student work?
• Solicitation of teacher/parent/student stories
  • This is a way to gather truly qualitative data
  • What does the community say about the use and impact of technology?

27. Dissemination
• Compile the report
• Determine how to share the report
  • School committee presentation
  • Press releases
  • Community meetings

28. Conclusion
• Build evaluation into your technology planning effort
• Remember, not all evaluation is quantitative
• You cannot evaluate what you are not looking for, so it is important to develop expectations of what constitutes good technology integration

29. More Information
• jsun@sun-associates.com
• 978-251-1600 ext. 204
• www.sun-associates.com/evaluation
• www.edtechevaluation.com

30. jsun@sun-associates.com
• 978-251-1600 ext. 204
• www.sun-associates.com/necc2007
• This presentation will be linked to that site
