
Building and Deploying an Early Warning System in Wisconsin


Presentation Transcript


  1. Building and Deploying an Early Warning System in Wisconsin Jared Knowles Research Analyst Wisconsin Department of Public Instruction Statewide PBIS Network Conference Wisconsin Dells, Wisconsin August 20th, 2013

  2. Agenda • Principles for a Dropout Early Warning System • Building a Statewide DEWS • Piloting a DEWS • Learning from the Pilot

  3. DEWS Refresher • The DEWS score is calculated using a combination of demographic and student outcome measures to improve accuracy • Inputs include attendance, disciplinary events, assessment scores, and student mobility • Risk is calculated individually for each student • Students are classified as at risk if their score crosses a threshold set by DPI; districts can use this threshold or ignore it

  4. DEWS Refresher • DPI's early warning system is called the Dropout Early Warning System, or DEWS • DEWS provides a score from 0 to 100 for current 6th, 7th, and 8th graders • The score represents the rate at which students in previous cohorts who were similar to the current student graduated • A score of 75 means that 75% of prior students with similar characteristics graduated on time
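To make the score concrete, here is a minimal sketch in R (the language the system is written in, per the technical slides later) of mapping a predicted on-time graduation probability to a 0-100 score. The data and single-predictor model are invented for illustration, not DPI's production code.

    # Fit a model on a simulated prior cohort, then read off the predicted
    # on-time graduation probability for a hypothetical current student.
    set.seed(1)
    prior <- data.frame(attendance = runif(500, 0.6, 1))
    prior$graduated <- rbinom(500, 1, prior$attendance)

    fit <- glm(graduated ~ attendance, data = prior,
               family = binomial(link = "probit"))

    # A hypothetical current 7th grader with 85% attendance
    new_student <- data.frame(attendance = 0.85)
    score <- round(100 * predict(fit, new_student, type = "response"))
    score  # the share of similar prior students who graduated on time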

  5. DPI’s System is in Development • More than 60% of students who eventually do not graduate after 4 years of high school can be identified with current data before the start of 7th grade • DPI is working to improve this through better techniques that allow students to be identified earlier and with more accuracy • The system will continually improve with better data, better mathematical models, and more real-time results

  6. Classification

  7. Project Plan • DEWS was developed during the 2012-13 school year • Pilot group of 34 schools identified in early 2013 • Pilot materials delivered electronically in mid-April 2013, along with a request to participate in a follow-up survey • Interpretative guide • Student reports for all current 7th graders • School report • School roster • Pilot materials mimic WISEdash; the final version is scheduled for a September 2013 rollout in WISEdash

  8. Awareness and Communications • Title I Coordinators • Accountability Trainers • Statewide PBIS Network • CESA Support Network • SSEDAC • School Administrators Alliance • School Counselors Association • WERAC • National Forum on Education Statistics • REL Midwest • Partners at WCER • Department of Children and Families • Members of WISEexplore

  9. DEWS Process [Diagram: local knowledge (teacher/program context, parent input, special circumstances) and state data (assessments, demographics, attendance, disciplinary events, mobility, location) combine to inform intervention strategies and student risk identification]

  10. Pilot Reports

  11. DEWS Pilot Schools • Washington and Lincoln Mid.; Kenosha • Gilmore and Starbuck Mid.; Racine • Deer Creek Inter.; St. Francis • Franklin and Edison Mid.; Janesville • Aldrich Middle; Beloit • Toki and James Wright Mid.; MMSD • Riverview Elementary; Silver Lake • Oconto Mid.; Oconto • Menominee Indian Mid.; Menominee Indian • James Williams Mid.; Rhinelander • Riverview Mid.; Barron • Cumberland Mid.; Cumberland • Lac du Flambeau Elem. • Lancaster Middle; Lancaster • Tomah Middle; Tomah • River Valley Middle; River Valley • Spring Hill Middle; Wisc. Dells • Waupaca Middle; Waupaca • Roosevelt Middle; Appleton • Webster Stanley Middle; Oshkosh • L.B. Clarke Middle; Two Rivers • Random Lake Mid.; Random Lake • Washington and Edison Mid.; Green Bay • D.C. Everest Mid.; D.C. Everest • Colfax Elementary; Colfax • Bloomer Middle; Bloomer • DeLong Middle; Eau Claire

  12. Pilot Reports - Student

  13. Pilot Reports - School

  14. Survey Results • Survey sought to identify the utility of the DEWS reports relative to existing early warning system / identification measures. It asked about: • usefulness of the DEWS report • usefulness of the interpretation guides • desire to have DEWS available • WISEdash usage • likelihood of using WISEdash if DEWS is included

  15. Survey Summary • 18 of the 34 participating pilot schools have responded to the survey so far (52.9%) • 15 of the respondents indicated they “fully reviewed” the results • 3 schools have been interviewed with 3 more interviews scheduled • 5 schools said staff reviewed the reports individually, 11 said staff reviewed them in a group working together

  16. DEWS Overall Valuable!

  17. DEWS Identifies Students Missed

  18. DEWS Does Not Miss Many Students

  19. Student reports are most positive element

  20. The Student Roster is Valuable

  21. School reports are well liked

  22. Most respondents expect DEWS to be used at least annually

  23. Annual Delivery Before School Year is Strongly Preferred

  24. DEWS can drive WISEdash usage

  25. Principals and Student Services Staff Must Have WISEdash access

  26. DEWS Beyond Fall 2013 DEWS as it exists is just a start. Several extensions may be desired: • Deeper WISEdash integration? • Communication and professional development to raise awareness and inform interventions? • Extend coverage to earlier and later grades? • Increase accuracy? • Add college enrollment as a secondary warning?

  27. All data is fictitious and for demonstration purposes only.

  28. Student Overview All data is fictitious and for demonstration purposes only.

  29. Get More Information All data is fictitious and for demonstration purposes only.

  30. Mobility History All data is fictitious and for demonstration purposes only.

  31. Detailed Assessment History All data is fictitious and for demonstration purposes only.

  32. EWS in a Multi-Level System of Support (MLSS or RtI)

  33. Current and Future Partners Current: • Title I Coordinators • Accountability Trainers • Statewide PBIS Network • CESA Support Network • SSEDAC Future: • WCER / VARC • RtI Center • Pupil Services State Organizations • DPI Divisions and Teams?

  34. In the Works Research grant with WCER, proposed for funding through the Institute of Education Sciences (IES), to explore DEWS usage and dropout prevention strategies • Goal is to provide districts an answer to the “What now?” question that DEWS poses • Grant submission in September, notification by February 1, 2014

  35. Extending Grades

  36. Increase Accuracy with New Datasets Inputs: • ISES / WSAS / SBAC • Attendance • Discipline • Mobility • Interventions Outputs: • Student-specific identification • WISEdash dashboard • RtI module • Local analysis [Slide arranges these along a timeline from now to later]

  37. Model Types Models Tried: • Probit (winner) • Logit • HLM • k-nearest neighbors (kNN) • Gradient Boosted Machine • Random Forests Models Not Yet Tried: • Cubist • Support Vector Machines • Multivariate Adaptive Regression Splines • Discriminant Analysis • Neural networks • Bayesian Model Averaging Currently a manual process; automation is the next step
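As a concrete illustration of comparing model classes, here is a minimal R sketch that fits two of the candidates above on simulated data and compares held-out accuracy; it is an illustrative toy, not the actual DEWS selection run.

    # Simulate a cohort with two predictors and a graduation outcome
    set.seed(42)
    n <- 1000
    d <- data.frame(attend = runif(n), discipline = rpois(n, 1))
    d$grad <- rbinom(n, 1, pnorm(-0.5 + 2 * d$attend - 0.4 * d$discipline))
    train <- d[1:700, ]
    test  <- d[701:1000, ]

    probit <- glm(grad ~ attend + discipline, data = train,
                  family = binomial(link = "probit"))
    logit  <- glm(grad ~ attend + discipline, data = train,
                  family = binomial(link = "logit"))

    # Held-out classification accuracy at a 0.5 threshold
    acc <- function(m) mean((predict(m, test, type = "response") > 0.5) == test$grad)
    c(probit = acc(probit), logit = acc(logit))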

  38. Questions and Contact Info • E-mail: jared.knowles@dpi.wi.gov • Web: www.jaredknowles.com • Code: github.com/jknowles • Twitter: @jknowles

  39. Let’s Get Technical

  40. DPI DEWS Features • Free and Open Source Platform • Fully modular • Empirically Derived • Flexible • Extensible

  41. Free and Open Source • A key feature of the DPI DEWS is that it is built on free and open-source technologies • It is a series of five modules: • Data import • Data recoding / cleaning • Model selection • Prediction • Data export • It has some prerequisites to work

  42. Technologies • The EWS is written in the R open-source statistical computing language • It is a series of modular scripts, each performing a basic function; not every module is necessary in every environment • Each module expects data in a certain format and returns data in a specific format • The system is currently specific to Wisconsin, but improvements made during the pilot phase should make it easier to generalize
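A driver for such a modular pipeline might look like the sketch below; the file names are hypothetical, since the actual scripts are Wisconsin-specific.

    # Hypothetical driver chaining the five modules listed earlier; each
    # script reads the prior module's output in an agreed-upon format.
    source("01_data_import.R")   # pull records from the data warehouse
    source("02_recode_clean.R")  # apply business rules, standardize fields
    source("03_model_select.R")  # search variables and model classes
    source("04_predict.R")       # score current 6th-8th graders
    source("05_export.R")        # write results for WISEdash delivery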

  43. Modules

  44. Data Import All data is fictitious and for demonstration purposes only.

  45. Data Import • Extract raw data from an Oracle data warehouse • The extract needs all records for a cohort of students from grade 7 through graduation • The extract is reused to get data on current grade 7 students for prediction
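A sketch of such an extract using the RODBC package follows; the DSN, table, and column names are invented for illustration.

    library(RODBC)

    # Connect via an ODBC data source name configured for the Oracle warehouse
    con <- odbcConnect("dpi_warehouse")

    # Pull the full longitudinal record for one cohort (hypothetical schema)
    cohort <- sqlQuery(con, "
      SELECT student_id, grade, school_year, attendance_rate,
             discipline_events, school_moves, assessment_score, graduated
      FROM   student_records
      WHERE  cohort_start_year = 2006")

    odbcClose(con)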

  46. Data Recoding and Cleaning • Data recoding is the only place where decisions are forced on the statistical model • Administrative records need to be reshaped to fit the statistical procedures • Business rules need to be in place to enforce standardization of fields • Example: FRL is coded as “F”, “R”, “N”, “A”, “P”; this needs to be reduced to “F” and “N”, or to “F”, “R”, and “N” • Use business rules from the Strategic Data Project • Enforce some rules to make the statistical model easier to fit (grouping categories to increase cell size)
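The FRL example can be sketched as a simple recode in R; note that where the "A" and "P" codes map is an assumption here, not the official business rule.

    # Raw FRL codes as described above
    frl_raw <- c("F", "R", "N", "A", "P", "R", "N")

    # Collapse five codes to two levels. Mapping "A" and "P" to "F" is an
    # assumed rule for illustration; the actual rule follows Strategic
    # Data Project guidance.
    frl <- ifelse(frl_raw %in% c("F", "R", "A", "P"), "F", "N")
    frl <- factor(frl, levels = c("N", "F"))
    table(frl)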

  47. Inputs and Outputs All data is fictitious and for demonstration purposes only.

  48. Model Selection • Fit a basic statistical model regressing an indicator of whether or not students graduated on a subset of the 7th grade data • More variables are added to the model, and the prediction rate of each successive model is evaluated on a test set of data • When all variables have been exhausted, or the best possible prediction rate has been achieved, the process stops • This is repeated for other classes of models / functional forms until the best model from the best of each class is identified
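The variable-addition loop might look like the following R sketch (simulated data, probit only; the real process also iterates over model classes).

    set.seed(7)
    n <- 1000
    d <- data.frame(attend = runif(n), moves = rpois(n, 1), score = rnorm(n))
    d$grad <- rbinom(n, 1, pnorm(-0.3 + 1.5 * d$attend - 0.3 * d$moves + 0.5 * d$score))
    train <- d[1:700, ]
    test  <- d[701:1000, ]

    vars     <- c("attend", "moves", "score")
    chosen   <- character(0)
    best_acc <- 0
    repeat {
      candidates <- setdiff(vars, chosen)
      if (length(candidates) == 0) break     # all variables exhausted
      accs <- sapply(candidates, function(v) {
        f <- reformulate(c(chosen, v), response = "grad")
        m <- glm(f, data = train, family = binomial(link = "probit"))
        mean((predict(m, test, type = "response") > 0.5) == test$grad)
      })
      if (max(accs) <= best_acc) break       # no improvement on test set: stop
      best_acc <- max(accs)
      chosen   <- c(chosen, names(which.max(accs)))
    }
    chosen
    best_acc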

  49. Model Selection • Depending on the data available, the factors included in the model will change, as will their weight in predicting the outcome • The system is flexible to this, so it can expand as new data comes online and as more longitudinal data becomes available on cohorts • For now, in Wisconsin, for two cohorts, these factors seem to matter: • Assessments • Attendance • Mobility • Discipline • School of attendance

  50. ROC Curve Receiver Operating Characteristic (ROC): a plot of the true positive rate against the false positive rate across classification thresholds, summarizing how well a binary classifier separates signal from noise. http://en.wikipedia.org/wiki/Receiver_operating_characteristic
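For example, the curve and the area under it can be computed in R with the pROC package on illustrative scores.

    library(pROC)

    set.seed(3)
    truth  <- rbinom(200, 1, 0.5)        # actual graduation outcomes
    scores <- truth * 0.3 + runif(200)   # noisy scores correlated with truth

    r <- roc(truth, scores)
    plot(r)   # true positive rate vs false positive rate across thresholds
    auc(r)    # area under the curve: 0.5 is chance, 1.0 is perfect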
