
Data Collection Techniques


Presentation Transcript


  1. Data Collection Techniques For Technology Evaluation and Planning

  2. Contact Information • jsun@sun-associates.com • 978-251-1600 ext. 204 • www.edtechevaluation.com • This presentation will be linked to that site (on the Tools page)

  3. Where Do We Stand? • Who’s working on an actual project? • Current? • Anticipated? • Your expectations for today

  4. Objectives • To review the key elements of effective program evaluation as applied to technology evaluations • Understanding the role of data collection in an overall evaluation process • Reviewing various data collection strategies

  5. Why Evaluate? • To fulfill program requirements • NCLB and hence Title IID carry evaluation requirements • One of the seven program requirements for NY Title IID Competitive Grants • “Each grantee will be required to develop “process and accountability measures” to evaluate the extent to which activities funded are effective in (1) integrating technology into curricula and instruction; (2) increasing the ability of teachers to teach; and (3) enabling students to meet challenging State standards. Records relating to these “process and accountability measures” are to be made available on request to the NYS Education Department (or its agents).”

  6. Project evaluation is also required as an overall part of each proposal… • “Describe the plan for evaluating the effectiveness of the competitive grant project. The plan should include clear benchmarks and timelines to monitor progress toward specific objectives and outcome measures to assess impact on student learning and achievement. It must address the extent to which activities funded are effective in (1) integrating technology into curricula and instruction; (2) increasing the ability of teachers to teach; and (3) enabling students to meet challenging State standards.” • 10% of the points…10% of the budget?

  7. A Framework for Review

  8. Evaluation • Helps clarify project goals, processes, products • Must be tied to indicators of success written for your project’s goals • Not a “test” or checklist of completed activities • Qualitatively, are you achieving your goals? • What adjustments can be made to your project to realize greater success?

  9. The Basic Process • Evaluation Questions • Tied to original project goals • Performance Rubrics • Allow for authentic, qualitative, and holistic evaluation • Data Collection • Tied to indicators in the rubrics • Scoring and Reporting • Role of this committee (the evaluation committee)
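To make that chain concrete, here is a minimal sketch of how an evaluation committee might record the links from each goal to its question, indicators, and data sources. This is illustrative only; the structure and field names are assumptions, not anything from the presentation, and the sample entry paraphrases the goals and questions on slides 15 and 16.

```python
# Illustrative sketch: one record per evaluation question, tying it back to
# a goal and forward to the indicators and data sources it depends on.
from dataclasses import dataclass, field

@dataclass
class EvaluationQuestion:
    goal: str                       # the original project goal
    question: str                   # that goal rephrased as a question
    indicators: list = field(default_factory=list)   # rubric indicators
    data_sources: list = field(default_factory=list) # surveys, interviews...

plan = [
    EvaluationQuestion(
        goal="Improve student achievement through authentic science learning",
        question=("Has the project developed technology-enhanced learning "
                  "experiences that improved student mastery of inquiry skills?"),
        indicators=["Units align with the 5-8 science curriculum"],
        data_sources=["student work review", "classroom observation"],
    ),
]

for q in plan:
    print(q.question, "->", ", ".join(q.data_sources))
```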

  10. Who Evaluates? • Committee of stakeholders (pg 13) • Outside facilitator? • Data collection specialists? • Task checklist (pg 11)

  11. Data Collection vs. Evaluation • Evaluation is more than data collection • Evaluation is about… • Creating questions • Creating indicators • Collecting data • Analyzing and using data • Data collection occurs within the context of a broader evaluation effort

  12. Evaluation Starts with Goals • Evaluation should be rooted in your goals for how you are going to use or integrate technology • A logic map can help highlight the connections between your project’s purpose, goals, and actions • And actions form the basis for data collection! • pg 15

  13. Example Project Logic Map
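The example on this slide was a diagram. As a stand-in, here is a hypothetical logic map in outline form, loosely based on the sample science-project goals that appear on slides 15 and 16; every label below is a reconstruction in spirit, not the actual map.

```python
# Hypothetical logic map: purpose at the top, goals branching from it, and
# measurable actions hanging off each goal. Labels are illustrative only.
logic_map = {
    "purpose": "Use technology to improve science teaching and learning",
    "goals": {
        "Improve student achievement in science": [
            "Develop technology-enhanced river-study units",
            "Review and score student inquiry work",
        ],
        "Improve teacher technology integration": [
            "Offer professional development on universal design "
            "and integration strategies",
        ],
    },
}

# Actions form the basis for data collection (slide 12), so list them out.
for goal, actions in logic_map["goals"].items():
    for action in actions:
        print(f"{goal}  ->  {action}")
```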

  14. Goals Lead to Questions • What do you want to see happen? • These are your goals • Rephrase goals into questions • Achieving these goals requires a process that can be measured through a formative evaluation

  15. We Start with Goals… • To improve student achievement through their participation in authentic and meaningful science learning experiences. • To provide advanced science and technology learning opportunities to all students regardless of learning styles or abilities. • To produce high quality science and technology curriculum in which the integration of technology provides “added value” to teaching and learning activities. • To increase students’ knowledge of the Connecticut River’s history and geology, and to gain and understanding its past, present and possible future environmental issues.

  16. …and move to questions • Has the project developed technology-enhanced science learning experiences that have been instrumental in improving student mastery of the Skills of Inquiry, understanding of the history/geology/ecology of the Connecticut River, and of the 5-8 science curriculum in general? • Has the project offered teacher professional development that has resulted in improved teacher understanding of universal design principles and technology integration strategies?

  17. …And Then to Indicators • What is it that you want to measure? • Whether the projects have enhanced learning • The relationship between the units and • The selected curriculum • The process by which they were developed • Increases in teacher technology skills (in relation to particular standards) • Whether the professional development model met with its design expectations • Collaborative and sustainable • Involves multiple subjects and administrators

  18. Indicators should reflect your project’s unique goals and aspirations • Rooted in proposed work • Indicators must be indicative of your unique environment...what constitutes success for you might not for someone else • Indicators need to be highly descriptive and can include both qualitative and quantitative measures • You collect data on your indicators

  19. Evidence? • Classroom observation, interviews, and work-product review • What are teachers doing on a day-to-day basis to address student needs? • Focus groups and surveys • Measuring teacher satisfaction • Triangulation with data from administrators and staff • Do other groups confirm that teachers are being served?

  20. Data Collection • Review Existing Data • Current technology plan • Curriculum • District/school improvement plans • Others?

  21. Tools and Techniques • Surveys • Interviews • Observations • Artifact Analysis

  22. Surveys • Online vs. Paper • Is there sufficient connectivity? • Doesn’t have to be at the classroom level • Often works best if people complete the instruments all at the same time • Same goes for paper surveys • Online surveys provide immediate data • Results arrive as spreadsheets that can be exported to a variety of programs for analysis
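Because online tools hand back spreadsheet exports, even a few lines of code can turn a raw export into a quick tally. A minimal sketch, assuming a CSV export with hypothetical column names ("school", "q1"):

```python
# Sketch: tally one multiple-choice item from a survey export, broken out
# by school. The file name and column names are placeholders.
import csv
from collections import Counter

tally = Counter()
with open("survey_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        tally[(row["school"], row["q1"])] += 1

for (school, answer), count in sorted(tally.items()):
    print(f"{school}: {answer} = {count}")
```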

  23. Surveys • Online • VIVED • Profiler • LoTi • Zoomerang • SurveyMonkey.com

  24. Make Your Own! • www.sun-associates.com/neccsurv.html • Based on a CGI script on your webserver • Outputs to a text file, readable by Excel • Works with yes/no, choose from a list, and free text input (no branching) • www.sun-associates.com/surveyws/surveys.html
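The script behind the URLs above is not reproduced here; the following is a minimal sketch in Python of the same idea, a CGI handler that appends each submission to a text file Excel can open. The field names are placeholders, and it handles only a simple GET-method form (no branching, matching the slide).

```python
#!/usr/bin/env python3
# Minimal sketch of the "make your own" idea, not the actual script at the
# URLs above. Field names (school, q1, comments) are placeholders. Appends
# one CSV row per submission; the web server needs write access to the file.
import csv
import os
from urllib.parse import parse_qs

FIELDS = ["school", "q1", "comments"]          # your survey's input names

form = parse_qs(os.environ.get("QUERY_STRING", ""))
row = [form.get(name, [""])[0] for name in FIELDS]   # first value, or ""

with open("responses.csv", "a", newline="") as out:
    csv.writer(out).writerow(row)              # text file, readable by Excel

# CGI response: header, blank line, then the page body.
print("Content-Type: text/html")
print()
print("<p>Thank you! Your response has been recorded.</p>")
```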

  25. Survey Tips • Keep them short (under 10 minutes) • Avoid huge long checklists • Allow for text comments • Support anonymity • But allow for categorical identifications -- school, job function, grade, etc.

  26. Coordinate and support survey administration • Avoid the “mailbox stuffer” • Work with building leaders • Provide clear deadlines

  27. Three Big Points • Surveys alone mean nothing • TRIANGULATE! • 100% response rate is virtually impossible • On the other hand, nearly 100% is very possible if you follow our tips! • Share the data • No one wants to fill in forms for no purpose
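One practical way to push toward that near-100% rate is to track returns per building while the survey is still open, so building leaders (slide 26) can follow up before the deadline. A sketch with made-up roster numbers:

```python
# Sketch: response rates per school; roster and response counts are
# placeholders you would pull from your own records and survey export.
roster = {"East Elementary": 42, "West Middle": 55}     # staff per school
responses = {"East Elementary": 39, "West Middle": 31}  # returns so far

for school, total in roster.items():
    rate = responses.get(school, 0) / total
    print(f"{school}: {rate:.0%} responded")
```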

  28. Interviews • Serve to back up and triangulate survey data • Less anonymous than surveys • Mixed blessing... • Allows for immediate follow-up of interesting findings

  29. Interviewing Tips • As homogeneous as feasible • By grade, job function, etc. • Be attentive to power structures • Don’t mix principals with teachers; tech coordinators with teachers; central office staff with principals; etc.

  30. Use outside interviewers • People will explain things to us (because they have to!) • We avoid the power structure issues • We’ve done this before • Structure and focus the interviews • Use a well-thought-out, well-designed protocol • Only diverge after you’ve covered the basic questions

  31. Three Big Points • Create protocols after you’ve seen survey data • Homogeneity and power • Use outsiders to conduct your interviews

  32. Observations • The third leg of your data triangle • Surveys - Interviews - Observations • Familiar yet different • You’ve done this before...but not quite • Progressively less “objective” than surveys and interviews

  33. Observation Tips • Ensure that teachers understand the point and focus of the observations • You’re evaluating a project, not individuals!! • Sample • You can’t “see” everything • So think about your sample • You can learn as much from an empty classroom as an active one • Look at the physical arrangement of the room • Student materials • How is this room being used?

  34. Outside observers are necessary unless you simply want to confirm what you already know • Avoid turning observations into a “technology showcase” • Showcases have their place -- mostly for accumulating and reviewing “artifacts” • But the point of observations is to take a snapshot of the typical school and teacher

  35. Three Big Points • Observe the place as well as the people • Observations are not intended to record the ideal...rather, the typical • Use outside observers

  36. Artifact Analysis • Reviewing “stuff” • Lesson plans • Teacher materials • Student work • Create an artifact rubric • Not the same as your project evaluation indicator rubric
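As an illustration only (the criteria and level names below are invented, not from the presentation), an artifact rubric can be as simple as a fixed set of criteria scored against a shared set of performance levels, kept separate from the project-level indicator rubric:

```python
# Hypothetical artifact rubric; criteria and levels are illustrative.
LEVELS = ["absent", "developing", "proficient", "exemplary"]
CRITERIA = [
    "technology adds value to the activity",
    "alignment with curriculum standards",
    "evidence of student inquiry",
]

def score_artifact(ratings: dict) -> None:
    """Print one line per criterion; ratings maps criterion -> level."""
    for criterion in CRITERIA:
        level = ratings.get(criterion, "absent")
        assert level in LEVELS, f"unknown level: {level}"
        print(f"{criterion}: {level}")

score_artifact({
    "technology adds value to the activity": "proficient",
    "evidence of student inquiry": "developing",
})
```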

  37. 10 Tips for Data Collection • Challenge your assumptions • But also don’t waste time by asking the obvious • Cast a wide net • It’s all about stakeholders • Dig deep • Try to collect the data that can’t easily be observed or counted

  38. Use confirming sources • Triangulate! Surveys alone do nothing. • Have multiple writers • Stakeholders and different perspectives • Think before you collect • Choose questions carefully and with regard to what you really expect to find

  39. Set (reasonable) expectations for participation • Time and effort • Forget about mailbox surveys • They usually cost more time than they’re worth • Report back • Don’t be a data collection black hole!

  40. It’s a process, not an event! • It does little good to collect data once and then never again • Data collection is part of a long-term process of review and reflection • Even if the immediate goal is only to get “numbers” for the state forms

  41. Dissemination • Compile the report • Determine how to share the report • School committee presentation • Press releases • Community meetings

  42. Conclusion • Build evaluation into your technology planning effort • Remember, not all evaluation is quantitative • You cannot evaluate what you are not looking for, so it’s important to — • Develop expectations of what constitutes good technology integration

  43. Data collection is not evaluation • Rather, it’s an important component • Data must be collected and analyzed within the context of goal-focused project indicators

  44. More Information • jsun@sun-associates.com • 978-251-1600 ext. 204 • www.sun-associates.com/evaluation • www.edtechevaluation.com
