
Evaluation Framework, Design and Data Collection (Part 1)



Presentation Transcript


  1. Evaluation Framework, Design and Data Collection (Part 1)

  2. Evaluation Questions and Construction of Evaluation Framework • Evaluation Questions—A Review • Formative • Summative • Question: From WHERE do we derive our Evaluation Questions?

  3. Evaluation Framework • What are the key constructs that will be measured? • What changes am I hoping to find? • What do I need to know about the strategies and content used in professional development? • Who has the information I need to collect? • How will I gather the information I need from each source? • What tools or processes will I use to gather the information I need?

  4. Evaluation Framework Components • Program Goals • Measurable Objectives • Information/Data Needed • Data Source • Data Collection Strategy • Data Analysis Plan • Timeline
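
A hedged sketch of how these components can be laid out row by row: the field names below mirror the bullet list on this slide, while the example values are hypothetical, loosely modeled on the MD Online IEP survey described later in the deck.

```python
# Illustrative sketch only: the framework components from this slide as a
# structured record. Example values are hypothetical, not taken from the deck.
from dataclasses import dataclass
from typing import List

@dataclass
class FrameworkRow:
    program_goal: str
    measurable_objective: str
    information_needed: str
    data_source: str
    data_collection_strategy: str
    data_analysis_plan: str
    timeline: str

framework: List[FrameworkRow] = [
    FrameworkRow(
        program_goal="Increase the quality of IEPs developed by end users",
        measurable_objective="Most users rate the Wizards as helpful",  # hypothetical target
        information_needed="User perceptions and frequency of Wizard use",
        data_source="IEP end users",
        data_collection_strategy="Annual online survey",
        data_analysis_plan="Compare agreement rates across survey years",
        timeline="Each spring",
    ),
]

for row in framework:
    print(f"{row.program_goal} -> {row.measurable_objective}")
```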

  5. Creation of Evaluation Framework Question: What do we consult to develop the Evaluation Framework?

  6. Putting the Pieces Together • Theory of Change • Logic Model • Evaluation Framework

  7. Questions to ask of your Theory of Change: • What key concepts will be measured? • How will those key concepts be measured? • How will data from those measures be analyzed to construct the answer to your evaluation questions?

  8. Evaluation Design: • Experimental • Quasi-experimental • Descriptive • Naturalistic • Case studies • Mixed-method

  9. Communicating Results • Allow your data to tell your story • Guiding questions or goals provide the basis of your narrative (NOT the survey) • Be clear about what was studied • Be concise! (Always include an executive summary) • Be objective!

  10. Creating Tables • Write table titles that report exactly what is in your table • Label every column and every row • Avoid using too many numbers in the table • Report group sizes • Report whole-number percentages • Present data in some sort of order (e.g., low to high)
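
A minimal sketch of these guidelines applied with pandas; the responses and group labels below are hypothetical, and the point is only the mechanics: report group sizes, round to whole-number percentages, label every row and column, and order the rows.

```python
# Illustrative sketch only: summarizing hypothetical survey responses into a
# table that follows the guidelines on this slide.
import pandas as pd

responses = pd.DataFrame({
    "group": ["Trained", "Trained", "Untrained", "Untrained", "Trained"],
    "agrees": [True, True, False, True, False],   # hypothetical responses
})

summary = (
    responses.groupby("group")["agrees"]
    .agg(n="size", percent_agree="mean")           # keep group sizes next to rates
    .assign(percent_agree=lambda d: (d["percent_agree"] * 100).round().astype(int))
    .sort_values("percent_agree")                  # present rows in order (low to high)
    .rename(columns={"n": "Group size (n)",
                     "percent_agree": "% agree or strongly agree"})
)
summary.index.name = "Training status"             # label every row and column
print(summary)
```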

  11. Quantitative Data: Example

  12. Spring 2008 Survey of MD Online IEP Users • Training Perceptions • Trained vs. Untrained • User Perceptions Related to the Student Compass Wizards • Frequency of Use of the Student Compass Wizards • User Perceptions Related to the Searchable VSC • Goal: The Student Compass Wizards within the Online IEP tool will increase the quality of IEPs developed by IEP End Users.

  13. Summary of Findings: Spring 2007 to Spring 2008 • The percentage who agree or strongly agree that the content of the Wizards was helpful in the IEP development process • The percentage indicating that they always or sometimes use the *Goal Wizard when creating students’ goals • *Wizard usage dropped for the Present Levels and Accommodations Wizards; these Wizards were released at a later date and were not accessible until mid-year.

  14. Summary of Findings: Spring 2007 to Spring 2008 • The percentage who agree or strongly agree that the searchable VSC increases access to the general education curriculum for students with disabilities • The percentage who agree or strongly agree that the searchable VSC promotes alignment of students’ goals to the general education curriculum

  15. Summary • There was a significant increase from 2007 to 2008 in the percentage of end users who see value in the Student Compass Wizards. • The majority of MD OIEP users see value in the Student Compass Wizards, but may need further PD to ensure proper implementation and to increase both frequency of use and the perceived value of the Wizards. • Users’ perceived level of training strongly affects how frequently they use the Wizards. • The searchable VSC is a highly regarded component of MD OIEP that gives teachers an efficient way to scaffold instruction so that students with disabilities have better access to the general education curriculum.

  16. Tamara Otto Research and Evaluation Center for Technology in Education tamaraotto@jhu.edu
