
Early Warning Systems Early Adopters’ Survey


Presentation Transcript


  1. Early Warning Systems Early Adopters’ Survey
  Everyone Graduates Center, Johns Hopkins University
  Interpretive Summary: Highlights of the EWS Early Adopters Learning and Sharing Summit Survey, George W. Bush Institute, Dallas, Texas, November 5-6, 2013

  2. EWS Early Adopters’ Survey
  • Source? Two-thirds of participants in the EWS Summit responded.
  • Who responded? A wide range of school and district leaders, technical assistance (TA) deliverers and data coaches, community organization representatives, and researchers.
  • How long have they had an EWS? 20% (6-10 years), 60% (3-5 years), 20% (0-2 years).

  3. EWS Early Adopters’ Survey
  Of those who reported, about:
  • 10 percent used data only once a year.
  • Nearly 20 percent used data each term or several times a term.
  • 10 percent used data monthly.
  • Nearly 20 percent used data weekly.
  • More than 25 percent reported “other” (these respondents typically worked with multiple schools, with differing frequencies of use).

  4. EWS Early Adopters’ Survey
  Who determined indicators and thresholds? Districts, based on district data or district interpretation of research (more than half), and external evaluators (a quarter).
  What are the primary indicators?
  • Attendance – nearly 100%
  • Course data – nearly 100%
  • Discipline – over 80%
  • Test data – over 70%
  • Student status data and demographics – nearly 70%
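
  To make the indicators-and-thresholds idea concrete, the sketch below shows one minimal way a district might flag students against locally set cutoffs. It is illustrative only: the field names and threshold values (attendance below 90 percent, any failed course, more than two discipline incidents) are hypothetical assumptions, not figures from the survey.

    from dataclasses import dataclass

    # Hypothetical cutoffs; the survey notes that districts set these locally,
    # based on district data or their interpretation of the research.
    ATTENDANCE_CUTOFF = 0.90      # flag if attendance rate falls below 90%
    COURSE_FAILURE_CUTOFF = 1     # flag on any failed course
    DISCIPLINE_CUTOFF = 2         # flag if incidents exceed 2

    @dataclass
    class StudentRecord:
        student_id: str
        attendance_rate: float        # share of days attended, 0.0-1.0
        courses_failed: int
        discipline_incidents: int

    def ews_flags(record: StudentRecord) -> list:
        """Return the early-warning indicators this student trips."""
        flags = []
        if record.attendance_rate < ATTENDANCE_CUTOFF:
            flags.append("attendance")
        if record.courses_failed >= COURSE_FAILURE_CUTOFF:
            flags.append("course performance")
        if record.discipline_incidents > DISCIPLINE_CUTOFF:
            flags.append("discipline")
        return flags

    # Example: low attendance plus one failed course trips two indicators.
    student = StudentRecord("S-001", attendance_rate=0.86,
                            courses_failed=1, discipline_incidents=0)
    print(ews_flags(student))     # ['attendance', 'course performance']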

  5. EWS Early Adopters’ Survey
  • Approximately one third rarely, sometimes, or often have difficulty accessing data in the system.
  • Access is highest for principals and school administrators (70 percent), followed by teachers (50 percent).
  • A growth area is access for social workers, external support partners, and community organizations (20 to 30 percent).
  • Across categories, the share of actual users runs roughly 10 percentage points below the share with access.
  • Only about one third use the EWS to understand patterns in performance trends, while more use it to pinpoint individual students’ struggles.

  6. EWS Early Adopters – Major Uses of EWS
  • Strategic point of intervention for students (over 80 percent).
  • Flags students who have not otherwise been identified (over 70 percent).
  • Affirms/challenges adult expectations (about 50 percent).

  7. EWS Early Adopters’ Survey
  • In descending order from 75 to 50 percent, respondents report integrating the EWS with academic assessments, social-emotional indicators, college readiness measures, and benchmark testing.
  • Approximately one third report using the EWS to understand performance trends, and about half do this to some extent.
  • A lower percentage (20 to 40 percent, in ascending order) report using the EWS for profiling student strengths, relating success to the Common Core, or connecting it to student work and professional learning community data.

  8. EWS Early Adopters – Growth Areas?
  • One third report a few or multiple days of training, and one half report a few hours or less.
  • One third report one or two follow-ups of ongoing coaching or facilitation, and equal percentages (15 percent each) report several follow-ups or none.
  • Providers of training and coaching span a range of internal and external partners.

  9. EWS Early Adopters’ Selected Comments
  • “Amazing how similar the drive is across the country.”
  • “Need a design system to parallel a data system from top down to K.”
  • “Indicators are consistent, but thresholds will vary locally; implement a system with flexibility yet fidelity.”
  • “Roll data out from the superintendent.”
  • “Roll it out as a learning process, free to make mistakes, not tied to accountability right away.”
  • “Develop a culture of data.”
  • “Bring life to data – think people rather than numbers.”
  • “Build cultures of trust rather than defensiveness.”

  10. EWS Early Adopters’ Selected Comments
  • “Align goals for the classroom, school, district and state.”
  • “Involve all levels of schools from the beginning.”
  • “Build relationships across every level, and consensus to build systems.”
  • “Have teams of data coaches and get feedback from schools while teaching how to use data.”
  • “Train principals and other school leaders.”
  • “Train teachers in EWS – they have the heart for it, but haven’t been trained to use the data.”
  • “People in classrooms need support rather than isolation; give time in the day.”
  • “Change practices and beliefs will follow.”

  11. Have All Been Able to Stay the EWS Course? The quick response is, “No, not yet.” In some states and districts, College and Career Readiness eclipsed EWS. In some states, finances, hardware issues, and poor software interfaces, especially between district student information systems and state systems, dealt emerging state EWS a (temporary) setback. At the same time, there appears to be continued development of district systems, along with a second generation of state systems that have learned from the first.

  12. Challenges
  • Variations in terminology (definitions) and reporting lead to different assessments of the degree of implementation.
  • Confusion over the purposes for collecting student data: attendance, behavior, and course-passing/credit accrual; assessment and accountability; and financial reporting to states (official counts for funding).
  • Differences among the states in philosophies, beliefs, hardware, software, and interfaces with district information systems.
  • Districts of varying sizes have varying capacities – technical, financial, and human.

  13. Challenges at the School Level
  • Accurate, complete, and timely data entry
  • A coach, guide, or other person to initially analyze data
  • Teams to interpret and act on data
  • Scheduling to support team meetings
  • Frequency of team meetings
  • Building a culture of data use for rapid identification, rapid response, rapid interventions, rapid monitoring, and rapid modification

  14. Sources of Information
  • “On Track for Success: The Use of EWS Indicator and Intervention Systems to Build a Grad Nation,” November 2011, http://www.every1graduates.org, and subsequent discussions
  • “Learning What it Takes,” http://www.every1graduates.org
  • “Early Warning Systems (EWS) Early Adopters’ Learning & Sharing Summit,” November 5-6, 2013 (George W. Bush Institute, Dallas, TX, sponsor), http://www.every1graduates.org/early-warning-systems-early-adopters-learning-conference/
