
Making MAP More Meaningful Winter 2008


Presentation Transcript


  1. Making MAP More Meaningful, Winter 2008 David Dreher, Project Coordinator, Office of Accountability, Highline Public Schools

  2. Overview • Recap of Predicting 2008 WASL • Examining 2008 WASL Predictions • Moving Forward to 2009

  3. RECAP WERA Spring 08 Presentation • Predictions Released November 2007 • A “best guess” about each student’s performance on the upcoming WASL based on prior MAP and/or WASL performance • Intended Uses • Provide building staff with a level of risk for not meeting WASL standard. • School- and District- level 2008 WASL “forecasts” • Theory: Putting MAP scores in context with WASL scores will make MAP more meaningful.

  4. Example of Projection and Prediction: 7th Grade Student in Reading

  5. WASL Prediction Range • Constructed using the standard error of measurement (SEM) values reported in the 2001 WASL Technical Reports. • Predicted Range = Predicted WASL Score +/- SEM
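The range construction on this slide can be sketched in a few lines of Python. The SEM value below is an illustrative placeholder, not an actual figure from the 2001 WASL Technical Reports:

```python
def prediction_range(predicted_score, sem):
    """Predicted WASL range = predicted score +/- one SEM."""
    return (predicted_score - sem, predicted_score + sem)

# Hypothetical student: predicted WASL score of 403, illustrative SEM of 7
low, high = prediction_range(403.0, 7.0)
print(low, high)  # 396.0 410.0
```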

  6. Interpreting Predictions • If the prediction range is: • Entirely below 400 (e.g., 380-396): the student has less than a 20% chance of meeting standard on the WASL this spring unless we accelerate their learning. • Straddling 400 (e.g., 396-410): the student has essentially a coin-flip chance of meeting standard, even if their prediction is above 400. • Entirely above 400 (e.g., 408-424): the student has more than an 80% chance of meeting standard in the spring, IF they continue to progress.
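The three interpretation rules above amount to classifying a prediction range against the 400 cut score. A minimal sketch (the band labels are my own shorthand, not district terminology):

```python
def risk_band(low, high, cut=400.0):
    """Classify a WASL prediction range against the cut score."""
    if high < cut:
        return "below"     # entirely below 400: <20% chance of meeting standard
    if low >= cut:
        return "above"     # entirely above 400: >80% chance, if progress continues
    return "straddle"      # range straddles 400: roughly a coin flip

print(risk_band(380, 396))  # below
print(risk_band(396, 410))  # straddle
print(risk_band(408, 424))  # above
```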

  7. NWEA’s MAP/WASL Alignment Study, Released January 2008

  8. NWEA’s MAP/WASL Alignment Study, Released January 2008

  9. Spring 2008: WASL happened . . . Late Summer 2008: WASL results arrive!

  10. Examining the Predictions • Reliability Analysis • Repeated the “Backward Look” analysis • “Within Group Look” • Analysis of “Exceptional” Performances • Predicted Level Analysis

  11. “Backward Look”: Math • % = Actual Met / Predicted to Meet • Predicted to Meet = predicted WASL score of 400 or better.

  12. “Backward Look”: Reading • % = Actual Met / Predicted to Meet • Predicted to Meet = predicted WASL score of 400 or better.
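The “Backward Look” percentage on these two slides can be computed directly from student records. The records below are invented for illustration:

```python
# Each record: (predicted WASL score, actually met standard on the WASL)
students = [(412.0, True), (405.0, False), (398.0, True), (420.0, True)]

# Predicted to Meet = predicted WASL score of 400 or better
predicted_to_meet = [s for s in students if s[0] >= 400.0]
actual_met = sum(1 for _, met in predicted_to_meet if met)

pct = 100.0 * actual_met / len(predicted_to_meet)
print(round(pct, 1))  # 66.7
```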

  13. “Within Group”: Math % = Actual Met Within Group / Total Number In Group

  14. “Within Group”: Reading % = Actual Met Within Group / Total Number In Group
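The “Within Group” percentage differs from the “Backward Look” by using each group’s total as the denominator. A sketch with hypothetical prediction-band groups:

```python
from collections import defaultdict

# Each record: (prediction group, actually met standard); data is hypothetical
records = [("above", True), ("above", True), ("straddle", True),
           ("straddle", False), ("below", False)]

total = defaultdict(int)
met = defaultdict(int)
for group, passed in records:
    total[group] += 1
    met[group] += int(passed)

# % = Actual Met Within Group / Total Number In Group
pct = {g: 100.0 * met[g] / total[g] for g in total}
print(pct)  # {'above': 100.0, 'straddle': 50.0, 'below': 0.0}
```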

  15. Questions/Comments • Procedures for making predictions • Results of reliability analyses • What about our theory behind doing this? • “Putting MAP scores in context with WASL scores will make MAP more meaningful.”

  16. “Meaningful”: Depends on Who You Ask • My experience talking with the people who work directly with the kids suggests that they are not impressed by the strength of our ability to assess their students’ risk levels. • “You don’t need a weatherman to tell you which way the wind blows” – Bob Dylan

  17. Data Creators (Office of Accountability) and Data Users (Principals, Coaches, Teachers) What would be more “Meaningful”? • Information that would help determine whether things done to help kids pass the WASL worked

  18. But . . .I don’t really know what schools are doing to try to help kids pass the WASL. . . So . . .how can I find out?

  19. “Exceptional” Performance Analysis • Please see handout • Objective: Start conversations that would increase the flow of information from data users back to us in Accountability • What are your observations of the data?

  20. Expectations for “Above Level” • Students were receiving interventions designed to address skills/knowledge deficits • Students were receiving interventions designed to familiarize them with WASL format • Students benefited from actions taken by the school to improve the WASL testing environment • Your ideas?

  21. Expectations for “Below Level” • Students were ELL or SPED • Students were chronically absent or highly mobile • Students did not take the WASL seriously • Your ideas?

  22. Moving Forward • Predictions simplified: Use BSI designations only • Raising awareness and understanding of NWEA’s Alignment Study • Increase understanding of NWEA goals and how to interpret goal-level results • Investigate the possible use of MAP data in evaluation of interventions, initiatives, and programs

  23. Data Creators (Office of Accountability) and Data Users (Principals, Coaches, Teachers): Continue to Solicit Input

  24. Contact Information David Dreher, Project Coordinator Office of Accountability Highline Public Schools www.hsd401.org 206-433-2334

  25. What they said . . .

  26. What is MAP? • Measures of Academic Progress • Developed by the Northwest Evaluation Association • Norm-referenced, computerized, and adaptive assessment • Performance is reported as a RIT score • The RIT Scale • Uses individual item difficulty values to estimate student achievement • A RIT score has the same meaning regardless of grade level • Equal-interval scale • Highline Public Schools • Three testing windows per year (Fall, Winter, Spring) • Tests students in math and reading • Tests students in grades 3-10

  27. The Needs of the Data User • Building staff were saying things like . . . • “How can we use MAP data to help us make decisions?” • “How do MAP and WASL performance compare?” • “I want to know what a student’s history is with MAP.” • “What is a RIT score?” • “Giving me a RIT score is like telling me the temperature in Celsius!”

  28. Making The Predictions • Snooped and found the best indicators of WASL success • Applied linear regression models to generate WASL scores for each student • Examined the predicted WASL scores

  29. Projecting MAP to Spring • For the models with “Projected MAP” as one of the factors, each student’s Spring 2008 MAP performance was projected. • The amount of expected growth added to a student’s Highest MAP score came from NWEA’s Growth Study
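The projection step above is an additive growth adjustment. A sketch, where the growth value is a hypothetical stand-in for a figure from NWEA’s Growth Study:

```python
def project_spring_map(highest_map, expected_growth):
    """Projected Spring MAP = highest prior MAP score + expected growth."""
    return highest_map + expected_growth

# Highest prior RIT of 214 plus an illustrative expected growth of 4.5
projected = project_spring_map(214.0, 4.5)
print(projected)  # 218.5
```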

  30. Snooping (Reading): R-Values

  31. Snooping (Math): R-Values

  32. What we learned by snooping. . . • Correlations were generally good. • Reading R-value range: 0.711 - 0.835 • Math R-value range: 0.603 - 0.921 • Correlations in math were stronger than in reading. • “Highest MAP” consistently correlated better than any single MAP score. • Correlations were generally strongest when Highest MAP and WASL 2006 factors were combined.

  33. Regression Models For students with both MAP and 2006 WASL scores (~95%): WASL 2007 = b0 + b1*Highest MAP + b2*WASL 2006 For students who had only MAP score(s) (~3%): WASL 2007 = b0 + b1*Highest MAP For students who had only a WASL 2006 score (~2%): WASL 2007 = b0 + b1*WASL 2006 Where: Highest MAP = the student’s highest MAP score from the Fall 2006, Winter 2007, or Spring 2007 windows (typically Spring 2007). WASL 2006 = the student’s raw score from the Spring 2007 testing of the 2006 WASL.
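The single-factor model above (WASL = b0 + b1*Highest MAP) can be fit with closed-form ordinary least squares; a minimal sketch, with invented score pairs. The two-factor model would extend this to multiple regression with a second predictor:

```python
def fit_simple(xs, ys):
    """Ordinary least squares for y = b0 + b1 * x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b1 = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
          / sum((x - mx) ** 2 for x in xs))
    b0 = my - b1 * mx
    return b0, b1

# Hypothetical (Highest MAP, WASL 2007) training pairs
pairs = [(198.0, 382.0), (205.0, 396.0), (212.0, 404.0), (220.0, 419.0)]
b0, b1 = fit_simple([x for x, _ in pairs], [y for _, y in pairs])

# Predicted WASL score for a student with Highest MAP of 208
predicted = b0 + b1 * 208.0
```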

  34. Prediction Models For students with both MAP and 2007 WASL scores: WASL 2008 = b0 + b1*Projected MAP + b2*WASL 2007 For students with only MAP score(s): WASL 2008 = b0 + b1*Projected MAP For students with only a WASL 2007 score: WASL 2008 = b0 + b1*WASL 2007 Where: Projected MAP = projected Spring 2008 MAP score based on the student’s highest MAP score from the Winter 2007, Spring 2007, or Fall 2007 windows. WASL 2007 = the student’s raw score from the Spring 2007 WASL testing.
