
Van Hise Elementary School

Van Hise Elementary School. Review of Data School Improvement Process March 3, 2009. Why use data? How should we use it?. Data isn’t meant to replace our knowledge, experience, insights, and intuitions.



Presentation Transcript


  1. Van Hise Elementary School Review of Data School Improvement Process March 3, 2009

  2. Why use data? How should we use it? • Data isn’t meant to replace our knowledge, experience, insights, and intuitions. • Data complements each of these, helping us avoid “blindspots” or generalizations that need a more sophisticated understanding. • Data is best used as a source of information that leads to reflection. Numbers are numbers, but their meanings are determined through reflective analysis and thoughtful discussion.

  3. How will we respond to the data we review today? • As we approach each data source, consider your state of mind: what assumptions do you bring to the data? What predictions are you making? • After reviewing sets of data, ask yourself: • What important points seem to “pop out”? • What are some of the patterns and trends that emerge? • What seems surprising or unexpected? • Then consider the information that’s missing: what other information should be gathered? In what directions do we need to examine the data in greater detail or from another perspective?

  4. …and remember: As we examine the data, there are two tendencies that sometimes occur: • To focus on only the negative or the needs that are apparent and to ignore strengths and positive “assets” in the school. • To be offended or get defensive about data that points out needs, challenges, or concerns.

  5. How has the overall enrollment changed across time? 2008-09 marks the highest enrollment level (previous high level was 329 in 1999)

  6. What do we know about our students? ENROLLMENT BY LOW INCOME AT VHES: 20% (2008-09)

  7. How does VHES’s level of economically disadvantaged students compare to the District and State?

  8. This Year’s Enrollment by Low Income: MMSD, September 2008

  9. This Year’s Enrollment by Low Income: MMSD, September 2008

  10. Data on our students… RACIAL/ETHNIC DIVERSITY AT VHES, 2008-09 (chart values: 63%, 22%, 9%, 6%)

  11. How does VHES’s diversity of students compare to the District and State? 2007-08 School Year

  12. How did VHES’s student needs compare to the District? 2007-08

  13. Other data you may want to look at later: • Mobility rates (whole school and disaggregated by student groups) • Home factors – number of parents in household and highest education level

  14. What do we know about how our students are engaged? ATTENDANCE RATES FOR ALL STUDENTS: VHES AND MMSD ELEMENTARY SCHOOLS All student groups were above the 94% goal last year.

  15. Another indicator of engagement: • Behavior-related data: both suspension data and office referral data.

  16. As we turn to measures of learning, we will begin with the SAGE Report Data: 16 objectives met the 80% standard; 7 objectives were below 80%.

  17. How have our students performed on the PLAA over time? How does this compare to the District average?

  18. How have our students performed on the PMA over time? How does this compare to the District average?

  19. Adequate Yearly Progress Annual Measurable Objectives % Proficient/Advanced

  20. When we consider the “high stakes” test for reading, how did our students perform? How do our Proficiency/Advanced levels compare to the District and State? WKCE – Reading, 2007: Proficiency/Advanced %. AYP criterion that determines a school’s status in reading: 74%. VHES results: 3rd – 91.8%, 4th – 84.6%, 5th – 97.7%.

  21. When we consider the “high stakes” test for reading, how did our students perform? How do our Proficiency/Advanced levels compare to peer schools? WKCE – Reading, 2007: Proficiency/Advanced %

  22. When we compare the students that we could instructionally impact (FAY) to Wisconsin schools with similar levels of economic disadvantage, how did we do in bringing our students up to proficiency in reading?

  23. When we compare the students that we could instructionally impact (FAY) to Dane County schools with similar levels of economic disadvantage, how did we do in bringing our students up to proficiency in reading?

  24. Looking at the Six-Trait Writing Sample results, how have our third graders performed over time?

  25. Looking at the Six-Trait Writing Sample results, how did our third graders compare to the district average?

  26. Looking at the Six-Trait Writing Sample results, how have our fifth graders performed over time?

  27. Looking at the Six-Trait Writing Sample results, how did our fifth graders compare to the district average?

  28. When we consider the “high stakes” test for mathematics, how did our students perform? How do our Proficiency/Advanced levels compare to the District and State? WKCE – Mathematics, 2007: Proficiency/Advanced %. AYP criterion that determines a school’s status in math: 58%.

  29. When we consider the “high stakes” test for mathematics, how did our students perform? How do our Proficiency/Advanced levels compare to peer schools? WKCE – Math, 2007: Proficiency/Advanced %

  30. When we compare the students that we could instructionally impact (FAY) to Wisconsin schools with similar levels of economic disadvantage, how did we do in bringing our students up to proficiency in math?

  31. When we compare the students that we could instructionally impact (FAY) to Dane County schools with similar levels of economic disadvantage, how did we do in bringing our students up to proficiency in math?

  32. Improvement-based school performance measures

  33. Value added measures • Extra WKCE points gained by students at a school on average relative to observably similar students across district • Value added of +3 means students gained 3 points more than the district average • Value added of -3 means students gained 3 points less than the district average

  34. Understanding VA Data • Average student gain on WKCE relative to district average, with adjustments for: • Shape of the test score scale • Gender, race, disability, low-income status, language, parents’ education • Mid-year (November) testing • Patterns in gains from one year to the next
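The value-added idea in slides 33-34 can be sketched in a few lines of code: a school's value added is the average WKCE point gain of its students minus the average gain across the district. This is a minimal illustration only; the actual MMSD model also adjusts for the test score scale, student demographics, and testing timing, which are omitted here, and all numbers below are hypothetical.

```python
# Minimal sketch of the value-added measure described above.
# Value added = average student gain at the school minus the
# average gain across the district. Real models add regression
# adjustments (demographics, test scale); this sketch does not.

def value_added(school_gains, district_gains):
    """Average WKCE point gain at the school relative to the district."""
    school_avg = sum(school_gains) / len(school_gains)
    district_avg = sum(district_gains) / len(district_gains)
    return school_avg - district_avg

# Hypothetical year-over-year WKCE point gains:
school_gains = [12, 15, 10, 13]    # school average gain = 12.5
district_gains = [10, 9, 11, 10]   # district average gain = 10.0

print(value_added(school_gains, district_gains))  # 2.5
```

A result of +2.5 reads exactly as the slide describes: on average, this school's students gained 2.5 more WKCE points than the district average; a negative result would mean they gained less.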

  35. Value Added and Proficiency

  36. Other assessments to look at in the future: Science and Social Studies Test Results (WKCE, Grade 4); other report card information; Six-Trait Writing Results.

  37. VHES’s Special Education Information includes: Placement/Referral Data; “Risk Factor” Ratio; Least Restrictive Environment.

  38. When it comes to measurements of relationships… School Climate Survey Responses from all students (grades 3-5), parents, and all staff. Comparisons to District, to previous year, and internally between demographic groups.
