
New York State Professional Development Grant


Presentation Transcript


  1. New York State Professional Development Grant Taking Advantage of Capacity: Salvaging Evaluations and Providing Models of Effective Practice Presenters: Matt Giugno – SIG/SPDG Project Director Wilma Jozwiak – SIG Statewide Coordinator Laura Payne-Bourcy – SIG/SPDG Project Evaluator

  2. SIG & SPDG Service Regions [Map of New York State service regions: New York City, Long Island, Hudson Valley, East, Mid-State, Mid-West, and West, with New York City subdivided into the Bronx, Manhattan, Queens, Brooklyn East, and Brooklyn West (including Staten Island); dated 7/14/06]

  3. One Type of New York

  4. Another Type of New York

  5. Our Purpose Today: • Talk about the realities that intrude on best plans, and what we did to address our realities in the SIG Grant • Talk about how we relied on existing capacity to salvage outcomes • Talk about how our SIG experience informed our decisions in developing our SPDG project

  6. The NYS SIG Initiative goals were to: • Reduce the achievement gap between special and general education students in high and low need schools. • Reduce or eliminate the disproportionality of language and ethnic minority students in classification and placement practices.

  7. New York State SIG Organizational Model [Organizational chart relating the State Improvement Grant to: LEAs; Special Education Training and Resource Centers (SETRCs); Regional School Support Centers (RSSCs); Institutions of Higher Education (IHEs); the Higher Education Support Center (HESC); and Special Education Quality Assurance (SEQA); with VESID providing resources, TA, and oversight]

  8. Two Areas of Intended Impact: NYS SIG resources and partnerships were designed to address the needs of: • in-service teachers and • higher education faculty preparing pre-service teachers

  9. Changing Landscape During the five years of SIG, evaluation design and methodologies had to respond to: • External shifts/expectations/needs (OSEP) and • Concurrent internal programmatic changes/shifts (VESID) Some of these shifts were anticipated and were worked into the original design, and some were not….

  10. Challenges . . . • Practice: Large geographic distances between targeted schools served by SIG Teams. • Roles: Shifts in thinking about responsibilities and roles of SIG Teams and RSSC/SETRC partners. • Partnerships: Difficulty engaging parent organizations. • Programming: Introduction of new program components.

  11. Challenges . . . • Time needed to embed change: Grants initially intended for two years were extended to three and four. • Logistics of grant awards: Changes to NYC district administrative structures. • Reporting: Institution of new achievement reporting mechanisms.

  12. Challenges . . . • Alignment: Degree of ‘match’ of project goals to State Performance Plan indicators. • Accountability/Rigor Part 1: Development of Federal Performance Measures. • Accountability/Rigor Part 2: ‘Collective call’ to utilize scientific and evidence-based practice.

  13. Changing expectations require a response that is both programmatic and evaluative in nature.

  14. The NY SIG Responsive Model: Life in a changing landscape . . .

  15. Post-SIG analysis: Where did we still struggle? New York State used the experience and challenges of five years of SIG to develop retrospective SIG program and evaluation design activities. Some of those challenges and experiences included:

  16. • Limited opportunities for programmatic response to changing OSEP requirements due to the stage of program implementation (year 4 of 5). • ‘Newness’ of the SPP and subsequent lack of data (particularly involving IEPs, student outcomes, transition, etc.). • Identifying impact on schools and students in NYC amidst changes in the NYC educational structure. • Identifying the evidence base of said activities “after the fact”.

  17. Responding to Changing Expectations: OSEP requirements for performance measures

  18. New requirements for evidence • Identify and collect evidence of: • replication of scientific and evidence-based practices and the proportion of personnel using these practices, • sustainability of efforts, and • alignment with the State Performance Plan.

  19. Program Responses: Performance Measures SPDG Mind Map 1. Effective Practice: more on that later…. 2. Replication: professional development, matching, regional & state facilitation. 3. Sustainability: mentoring, symposia, clearinghouse. 4. SPP: more on that soon…….

  20. Evaluation Responses: Performance Measures One solution was the incorporation of data collection into program design and activities (co-design). • Exploration of Evidence-Based Practices: • Regional focus groups. • Analysis of worksheet products.

  21. Replication • Nomination forms: description of practice, evidence of effectiveness. • Validation protocols: practice exploration, collection of evidentiary data for student outcomes, educator practice, school outcomes. • Regional and state facilitator Documentation & Reporting criteria. • Implementation tracking system (to be developed). • Participant interviews (to be conducted).
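
  The implementation tracking system above was still to be developed at the time of this presentation. As a purely illustrative sketch (not the project's actual design), one record in such a system might capture the replication evidence OSEP asks for, including the proportion of personnel using a practice; every field name here is an assumption:

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class PracticeImplementationRecord:
    """Hypothetical tracking record: one validated practice at one target site."""
    site_id: str                # assumed site identifier
    practice_name: str          # validated scientific/evidence-based practice
    spp_indicators: List[int]   # SPP indicators (1-20) the practice addresses
    start_date: date
    personnel_using: int = 0    # staff observed using the practice
    personnel_total: int = 0    # all instructional staff at the site
    evidence_notes: List[str] = field(default_factory=list)

    def proportion_using(self) -> float:
        """Proportion of personnel using the practice (an OSEP-style measure)."""
        return self.personnel_using / self.personnel_total if self.personnel_total else 0.0
```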

  22. More Evaluation Responses • Sustainability • Targeted site pre and post survey (to be developed). • Effective Practice site post survey (to be developed).

  23. Program Responses: SPP All of the objectives and activities undertaken as part of SPDG will be carried out under the framework of the SPP. The relevant project phases (as per the developed program logic model and mind map) include:

  24. Selection • Target site selection (based on lack of SPP attainment). • Site matching (effective practice sites matched based on expertise in specific SPDG goal areas and SPP indicators for which the target sites have demonstrated need).
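
  As a rough illustration of the matching step just described, suppose each target site is selected for the indicators where it has not met SPP targets and is paired with the effective practice site whose documented expertise overlaps those indicators most. All site names and data below are hypothetical, and the greedy overlap rule is our assumption, not the project's actual procedure:

```python
# Hypothetical sketch of site matching: pair each target site (selected for
# lack of SPP attainment) with the effective-practice site whose expertise
# covers the most indicators of demonstrated need.
target_needs = {                 # target site -> SPP indicators not attained
    "Target District A": {1, 2, 4},
    "Target District B": {3, 5, 6},
}
mentor_expertise = {             # effective-practice site -> indicator expertise
    "Mentor School X": {1, 2},
    "Mentor School Y": {3, 4, 5, 6},
}

def match_sites(needs, expertise):
    """Greedy match on the size of the indicator overlap."""
    matches = {}
    for site, unmet in needs.items():
        best = max(expertise, key=lambda m: len(expertise[m] & unmet))
        matches[site] = (best, sorted(expertise[best] & unmet))
    return matches

print(match_sites(target_needs, mentor_expertise))
# {'Target District A': ('Mentor School X', [1, 2]),
#  'Target District B': ('Mentor School Y', [3, 5, 6])}
```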

  25. Implementation • Professional development and ongoing technical assistance: • individualized to target site needs, and • as per validated scientific and evidence-based practice aligned to the SPP. • Documentation: more to come…..

  26. Evaluation Responses: SPP Documentation: Capturing outcomes via NYSED Quality Improvement Process (QIP) reporting using SPP as the core framework for analysis and reporting.

  27. SPP Indicators • SPP indicators considered for initial data collection: • graduation and dropout, • achievement, • suspension and expulsion, and • placement (least restrictive environment) • (SPP Indicators 1-6 of 20).

  28. Not all of these indicators were markers in the original SIG Project design. Why these six SPP indicators? Because NYS: • is currently collecting baseline information for all districts for SPP 1-6, • is able to make a determination as to which schools are not meeting standards, and • has identified which schools need further assistance.

  29. Evaluation Procedures • SPP data analyzed annually for participating targeted sites. • Provisional outcome data plan: • Year One: baseline and treatment data for SPP 1-6. • Year Two: baseline SPP 1 – 20, treatment 1-6. • Year Three and beyond: baseline and treatment SPP 1 – 20.
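
  The provisional plan on the slide above maps directly to a small lookup. A minimal sketch, assuming exactly the year boundaries and indicator ranges stated on the slide (the helper name is ours):

```python
def indicators_to_collect(project_year: int):
    """Return (baseline, treatment) SPP indicator sets for a project year,
    per the provisional outcome data plan."""
    core = set(range(1, 7))    # SPP 1-6: graduation/dropout, achievement,
                               # suspension/expulsion, LRE placement
    full = set(range(1, 21))   # SPP 1-20
    if project_year == 1:
        return core, core      # Year One: baseline and treatment for SPP 1-6
    if project_year == 2:
        return full, core      # Year Two: baseline SPP 1-20, treatment SPP 1-6
    return full, full          # Year Three and beyond: both for SPP 1-20

baseline, treatment = indicators_to_collect(2)
```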

  30. Capturing Impact of SIG in New York City: A Retrospective We needed to better understand SIG work in NYC, including: • technical assistance conducted, • collaboration between UFTTC and targeted schools, • impact on systemic reform of schools, • parental involvement, and • student outcomes.

  31. System Complexity: number of schools involved + SIG work woven into broader UFTTC improvement efforts = how to tease out impact?

  32. Evaluation Design Effort: • Review UFTTC SIG documentation, • Review NYBOE Quality Review Reports for select sites, • Interview UFTTC SIG Coordinator, • Interview key UFTTC Field Liaisons, and • Interview key UFTTC site staff. Effect: • Interview select school staff at some sites, and • Review available performance data.

  33. Evaluation Framework • Implementation • Describe the work. • Explore involvement of school leaders. • Analyze teacher responses. • Consider salience and prioritization of approaches. • Describe continuous improvement and mid-course corrections.

  34. Analysis of Process and Planning • Communication: teachers and administrators. • Curriculum and instruction. • School policies and/or school functioning. • Barriers. Analysis of Impacts • Student outcomes. • School outcomes.

  35. Evolution of Effort: Identifying and Implementing Effective Practices New directions from the state and federal levels require building educator capacity (skills and knowledge) needed to implement scientifically based or evidence-based practices for children with disabilities.

  36. Some Advantages: • Evaluation redesign activities provided opportunities for reflection for schools, resulting in increased capacity. • The need for redesign in SIG underscored the importance of “contingency planning” at the outset of the SPDG.

  37. The Segue to SPDG How we responded to our lessons in developing our SPDG: Supporting Successful Strategies to Achieve Improved Results (the S3TAIR Project)

  38. Taking Advantage of Lessons Learned Lesson: Difficulty in identifying data supporting impact • Response: Build identification of effective practice requirements into grant applications and site selection processes. Lesson: Availability (or lack) of data resulted in changes to cohort size • Response: Embed strategies for collection of data directly from schools and/or regions, and strengthen district reporting requirements.

  39. Lesson: Demands for greater accountability and research to practice implementation • Response: Build capacity identification and utilization into the project plan, taking advantage of existing mandates for data collection and analysis • NCLB • SPP • NYS Contract for Excellence

  40. Lesson: Experience from our Special Education Quality Assurance field work showed that high need, low performing districts consistently lacked effective practices in one or more of the following areas: • reading instruction/literacy acquisition, • positive academic and behavior interventions and supports, and • implementation of effective special education programs and services. • Response: These areas will be targeted for intervention in the NYS SPDG S3TAIR Project.

  41. Lesson: Even with skilled coaching, districts don’t always identify the most effective strategies • Response: Support will be provided in: • analyzing data and identifying key issues, • channeling identified school improvement activities to evidence-based interventions, and • implementing and evaluating evidence-based practice, including effective implementation practices (National Implementation Research Network).

  42. Lesson: Our high risk districts need models they can identify with to move from research to practice. We knew that: • High need districts want an implementation model in the state and in communities with similar characteristics. • Good examples exist of districts whose school improvement efforts have resulted in sustainable outcomes. • Districts doing good things are often too busy making it happen to talk much about it. • Response: Design of the NYS SPDG S3TAIR Project.

  43. Elements of S3TAIR • Identify and provide small grants to districts whose evidence based effective practices have resulted in good outcomes for students with disabilities: • District funding will support collaboration with S3TAIR Regional Field Facilitators to document the practices • District funding will also support mentor relationships with targeted high need districts • Quality Indicator tools for implementation are currently under development by the VESID Special Education Training and Resource Center network

  44. Fund school improvement efforts for targeted high need districts • Support field-based regional staff (Regional Field Facilitators) who will: • Support relationship development with Effective Practice Mentor Schools; • Collaborate with other VESID-funded TA networks to provide professional development as appropriate; and • Document the school improvement experience of the funded districts for the Clearinghouse.

  45. Continue our partnership with preservice preparation programs through the Higher Education Support Center/Task Force for Quality Inclusive Schooling: • Regional Task Force groups will identify promising practices regionally and nominate them for statewide validation. • IHEs will focus efforts on increasing preservice and inservice capacity in the identified areas. • The HESC will continue to support Partnership Grants for IHE/LEA collaboration on school improvement.
