
Bridging the Gap in Hudson


Presentation Transcript


  1. Bridging the Gap in Hudson: September 2007 - June 2008

  2. July • The district-wide student achievement team was introduced to Focus Monitoring. • Each building was asked to develop a report on its demographic information and student achievement.

  3. August • The district-wide Student Achievement Team met with facilitators from the State Department of Education. • They discussed the Focus Monitoring process and the expectations for the two-year cycle. • Members of the district-wide Student Achievement Team provided an overview of the Focus Monitoring process to staff at a full staff meeting in their own buildings at the beginning of the 2007-08 school year.

  4. Early Fall • NECAP: Each building conducted information sessions to explain the breakdown of the test at each level. • Staff discussed test-prep strategies and how they should be included in all content areas, in addition to practicing the NECAP released items. NECAP Test-Taking Tips were also provided for each level. • Guidance and administration met with all test proctors to review the NECAP manual and test security. Active proctoring was discussed as a group, and all proctors were encouraged to circulate throughout the testing session. • Administration and special education teachers attended Alt. training.

  5. Fall • Met with the building-level data team to review students who were not proficient on the prior NECAP. • Reviewed the accommodations recommended for each individual student. • Reviewed all previous assessments for each individual student. • Reviewed current supports for individual students. • Identified who delivered the support for each student.

  6. Fall • Met with the building-level data team to review data-driven decision making through the use of data-driven dialogue. • Instructed teams on what data-driven decision making is and how to use data-driven dialogue when planning for students and their needs.

  7. Fall • Met with each grade-level team to review materials: • Class profiles for the class taught and measured by NECAP, so each teacher could examine their current program and flag any areas of concern. • Individual profile sheets for each student, identifying strengths and weaknesses from NECAP by subcategory. • A Subcategories Summary Sheet to help cluster support groups.

  8. Throughout the Year • Each building utilized its building-level data team to focus the work. • Teachers continued to practice test questions throughout the year; strategies were incorporated with goal setting. • Updated staff on the progress the district assessment team had made through Focus Monitoring. • Reviewed all materials from Performance Tracker with each grade level so teachers could identify the predictions, observations, and inferences each team came up with when reviewing the NECAP results.

  9. Share Best Practice with Staff • Reviewed best-practice theory and how it applied to staff. • Reviewed what research identifies as the components of effective scheduling practice.

  10. Focus Monitoring Assessment Teams • Each building utilized a building-level data team. • The district assessment team also met regularly to work on root cause analysis. • The following themes and questions emerged:

  11. Curriculum • Questions emerged regarding special education delivery of services and curriculum. • There is a need to create a common language and common curriculum aligned to the GLEs/GSEs at each grade level. • There is a need to develop course outcomes at each grade level to ensure vertical and horizontal alignment of our curriculum.

  12. Instruction • Consistent implementation of research-based instructional strategies is needed (RBT). • Deficits in understanding vocabulary were noted. • A universal and consistent implementation model is needed (awareness and documentation). • There is a lack of a clear definition and understanding of differentiated instruction. • Is an RTI approach used consistently within the district? • Increase instructional time; there are too many interruptions. • Are SMART (Specific, Measurable, Attainable, Results-based, and Time-bound) goals used in the instructional planning process?

  13. Leadership • Communication and decision making are areas of concern. • Look at the communication system in place. • Are the buildings using the same information systems to review student performance? • Does attendance or attitude play a part in our scores? • More input is needed for decision making: bottom-up rather than top-down. • A culture change is needed to sustain this effort.

  14. Data Collection • Gates-MacGinitie Reading (elementary only) • Everyday Math assessments • Math screeners (addition/subtraction, multiplication/division) • Writing prompts • NECAP • Pyramid Interventions surveys • Attendance rate • Free/reduced lunch rate

  15. IEP Evaluation Process • Inform the large group about the DINI/FM process. • Help focus how the IEP is developed (goal-curriculum link). • Development of a professional learning community. • Increase collaboration between general education and special education. • Look at progress monitoring. • Bring consistency to the building and district goal-writing process. • Align IEPs with the DINI plan.

  16. Scheduling • Set up next year's schedule to allow more blocks of time, so teachers can schedule a 90-minute block. • Set up a specialists' schedule in which all classes at a grade level attend specials together, allowing for collaboration. • Make sure the primary students go to specials in the p.m., so reading instruction can take place in the a.m. • Set up solid 90-minute blocks, with support time identified outside the interrupted time. • Set up next year's schedule to allow more blocks of time for teachers to schedule Language Arts. • Shortened homeroom by fifteen minutes. • Added a half hour to our classes. • Set up some extended periods of time to allow for more individualized direct instruction and review. • Continue to work on a master schedule to allow for consistency within the Language Arts block.

  17. Next Steps • All elementary schools will utilize an uninterrupted 90-minute block for reading. • The middle school will continue to look for ways to accrue more time. • Adopt a new research-driven reading series focused on differentiation of instruction: Scott Foresman Reading Street for elementary and Read 180 for middle school. • The NWEA MAP assessment will be implemented throughout the district.

  18. NWEA MAP Assessment • This project will provide the district with a comprehensive formative and summative evaluation system for students in grades 1-12 in reading/language arts and mathematics. The district may choose to add a science assessment module and target specific grades at a future date.

  19. NWEA MAP Assessment • Measures student growth and achievement based on the district's and the state's frameworks. • MAP provides vertical alignment for accurate determination of each student's instructional level and academic growth history. • MAP testing will be scheduled twice next year. • Grades 1-12 will participate in the pilot next year.

  20. The district will measure the success of this project with the following SMART (Specific, Measurable, Attainable, Results-based, and Time-bound) metrics: • An end-of-year review of the assessment program's implementation and of the effects of assessment results on curriculum adaptation, professional development, and instruction as they impact student growth.

  21. NWEA (MAP Testing) Planning and Implementation

  22. CRF (Class Roster File) • Names of all students • ID numbers of all students • Names of all teachers who will be able to access student results • Reports are generated according to these files.
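As a rough illustration of the roster data the slide lists, a CRF could be assembled as a simple delimited file. This is only a hypothetical sketch: the column names and sample rows below are invented for illustration and are not NWEA's actual file specification.

```python
import csv
import io

# Hypothetical CRF (Class Roster File) layout. The real NWEA specification
# defines its own required columns; this sketch only mirrors the three kinds
# of data the slide lists: student names, student IDs, and the teachers who
# will be able to access each student's results.
rows = [
    {"student_name": "Jane Doe", "student_id": "100001", "teacher_name": "Mr. Smith"},
    {"student_name": "John Roe", "student_id": "100002", "teacher_name": "Ms. Jones"},
]

# Write the roster to an in-memory CSV; reports would then be generated
# according to this file, as the slide notes.
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=["student_name", "student_id", "teacher_name"])
writer.writeheader()
writer.writerows(rows)
crf_text = buffer.getvalue()
print(crf_text)
```

In practice the district's student information system would export this file rather than it being built by hand; the point is simply that the CRF ties each student ID to the teachers authorized to view that student's results.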

  23. Should we pilot at one school to see how it goes and work out the quirks, then test in the spring? • Is the test window district-wide, or does each school have its own window? (Three weeks is identified as the test window.) • How will testing affect students, other testing, and NECAP? This may be too much for the kids. Will having just finished MAP testing hurt our NECAP scores?
