Form Effects on the Estimation of Students’ Progress in Oral Reading Fluency using CBM


  1. Form Effects on the Estimation of Students’ Progress in Oral Reading Fluency using CBM David J. Francis, University of Houston Kristi L. Santi, UT - Houston Chris Barr, University of Houston CRESST September 8, 2005

  2. Overview • Curriculum Based Measurement (CBM) to Monitor Student Progress and Inform Instruction • Methods • Results • Conclusions

  3. Background • Report of the National Reading Panel (NRP, 2000) highlighted the importance of instruction and assessment in five domains of reading and related skills • Phonemic awareness • Phonics • Fluency • Vocabulary • Comprehension

  4. Background • No Child Left Behind (NCLB) and Reading First (RF) are based on the NRP model of reading acquisition and mastery • RF emphasizes • The five domains • The three-tier model of instruction, prevention, and intervention • The four purposes of assessment in guiding instruction

  5. Purposes of Assessment • Reading First describes four purposes for assessment in the five domains: • Screening • Diagnosis • Progress Monitoring • Outcome • All in the service of guiding instruction

  6. Progress Monitoring • Monitor student progress toward year-end goals • Provide teachers regular feedback on students’ rate of skill acquisition • Identify students needing modification to current instruction based on low rate of skill acquisition

  7. Progress Monitoring • Essential characteristics • Administer on a regular basis • Brief and easy to administer in the classroom • Provide scores on a constant metric • Predictive of end of year outcomes • Free from measurement artifacts such as practice effects and form effects • CBM has been proposed as having these properties

  8. What is CBM? • Students read connected text for a fixed duration of time, typically one minute • Oral reading fluency, expressed as words correct per minute (WCPM), is computed and charted as a measure of growth in reading rate • Reading materials range from basal readers to pre-packaged texts
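To make the WCPM metric concrete, here is a minimal sketch of the computation; the function name and the example numbers are illustrative, not taken from DIBELS materials:

```python
def wcpm(words_attempted: int, errors: int, seconds: int = 60) -> float:
    """Words correct per minute: words read correctly, scaled to a one-minute rate."""
    correct = words_attempted - errors
    return correct * 60.0 / seconds

# Hypothetical example: a student reads 72 words in 60 seconds with 5 errors.
print(wcpm(72, 5))  # 67.0 WCPM
```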

  9. DIBELS • Developed by Good and Kaminski • A CBM measure of early reading skills using one-minute probes • Included in this study because • A large number of stories is available for fluency assessment • The developers have worked to equate stories for “readability” • It is ubiquitous in RF for progress-monitoring (PM) assessment

  10. Many Strengths • Quick, easy assessment • One-minute probe given once a week • Teacher-friendly format • Easy-to-follow directions • Instructionally relevant information • Within-grade evaluation of student growth

  11. Why might we expect form effects? • Story construction • Readability formulas are not perfect • Difficult to precisely control text features that affect fluency • Lack of attention to scaling • Stories have been pre-equated for text features • No attempt to empirically equate forms • Assumption that WCPM provides a constant scale

  12. Purpose of Current Study • Examine form effects on DIBELS Oral Reading Fluency (DORF) at a single time point in grade 2 • Examine form effects on inferences about growth in DORF over 6 weeks in grade 2

  13. Methods

  14. Setting and Participants • Two schools in the Houston Independent School District (HISD) • 134 students • 85 from school A • 49 from school B • 69 females • 65 males • Ethnically diverse student populations

  15. Measures • DORF passages (n = 29) • Six passages were randomly selected from the pool of 29 • Average Spache readability index = 2.65 • Range: 2.6 to 2.7 • Average Degrees of Reading Power readability index = 45.67 • Range: 44 to 46 • Scale: 0 (easy) to 100 (difficult)

  16. Procedures • Three research assistants administered the probes to all students once every two weeks • Inter-rater reliability of .85 was established prior to the start of the study • Passages were administered according to the guidelines provided in the DIBELS manual • Story order was randomly assigned (one of six orders) • Three stories were read at baseline • One story was read at each of waves 2-4
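The transcript does not say how the .85 was computed; as one plausible sketch, a Pearson correlation between two raters' independent WCPM scores for the same readings is a common choice (the paired scores below are hypothetical):

```python
import numpy as np

# Hypothetical paired WCPM scores from two raters scoring the same readings.
rater_a = np.array([48, 62, 55, 71, 66, 59])
rater_b = np.array([50, 60, 57, 70, 68, 58])

# Pearson correlation as one simple index of inter-rater reliability.
r = np.corrcoef(rater_a, rater_b)[0, 1]
print(round(r, 2))
```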

  17. Random Assignment of Students to Passages • Each student read three passages at baseline • The design allows estimation of story, order, and story-by-order effects

  18. Despite randomization of students to six groups, group differences in fluency were apparent at baseline • Using a measure of fluency from the Texas Primary Reading Inventory (TPRI), the six groups differed in mean fluency • F(5,118) = 3.98, p < .002 • Means ranged from 47 to 80 WCPM across the 6 groups
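As an illustrative sketch of the kind of one-way test behind the reported F statistic (the data below are simulated, not the study's; the group means are set apart by construction):

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(1)
# Simulated TPRI fluency scores for six groups of ~22 students each;
# group means span roughly the 47-80 WCPM range reported on the slide.
groups = [rng.normal(mu, 12, 22) for mu in (47, 55, 60, 66, 72, 80)]

f_stat, p_value = f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```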

  19. Subsequent analyses used TPRI fluency as a covariate • When TPRI fluency is covaried, the groups do not differ on any particular form/story. • Note that we are not saying the DIBELS stories are equal, only that, for any given story, the groups did not differ in performance after controlling for TPRI fluency.

  20. Data Analysis • Analyzed oral reading fluency using a mixed-model approach to repeated-measures analysis of variance in SAS PROC MIXED • Fixed effects: • TPRI_Fluency(TPRI_story) • DIBELS_Story (1-6) • DIBELS_Order (1, 2, 3) • DIBELS_Story by Order • Random effects: • Story correlations (by order)
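The analysis itself was run in SAS PROC MIXED. As a rough illustration only, a comparable model can be sketched in Python with statsmodels' MixedLM; the data here are simulated, the nesting of the TPRI covariate and the by-order story correlations are simplified to a random student intercept, so this is an approximation of the model structure, not a reproduction of it:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated long-format data standing in for the study's: one row per
# student-by-story observation at baseline (three stories per student).
rng = np.random.default_rng(2)
rows = []
for sid in range(134):
    tpri = rng.normal(65, 15)
    for order, story in enumerate(rng.choice(6, size=3, replace=False), start=1):
        rows.append({"student_id": sid, "tpri_fluency": tpri,
                     "story": story + 1, "order": order,
                     "wcpm": tpri + 2 * story + rng.normal(0, 8)})
df = pd.DataFrame(rows)

# Fixed effects: TPRI fluency covariate, story, order, and story-by-order;
# a random intercept per student approximates the repeated-measures structure.
model = smf.mixedlm("wcpm ~ tpri_fluency + C(story) * C(order)",
                    data=df, groups="student_id")
print(model.fit().summary())
```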

  21. Results

  22. Descriptive Data

  23. Tests of Fixed Effects

  24. Pairwise Differences in LS Means

  25. What about rate of growth? • The real interest in DORF passages is in estimating the rate of skill acquisition • Typical practice • Test students every 2 weeks • Compute a best-fitting straight line through the data (see the sketch below) • Students with low rates are targeted for intervention or adjustments to instruction
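As a minimal sketch of this typical slope computation (the names and scores are illustrative), the best-fitting straight line is an ordinary least-squares fit of WCPM on time, and its slope is the estimated growth rate:

```python
import numpy as np

def growth_rate(weeks: np.ndarray, wcpm: np.ndarray) -> float:
    """OLS slope of WCPM on time: estimated words-per-minute gained per week."""
    slope, _intercept = np.polyfit(weeks, wcpm, deg=1)
    return float(slope)

# Hypothetical student: four probes, one every two weeks.
weeks = np.array([0, 2, 4, 6])
scores = np.array([55, 58, 62, 63])
print(growth_rate(weeks, scores))  # 1.4 WCPM gained per week
```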

  26. Descriptive Data over 4 Waves

  27. Tests of Fixed Effects

  28. Estimated Growth Rates

  29. LSMean Fluency by Wave and Group

  30. Conclusions

  31. Conclusions • Form effects in PM assessments must be addressed if teachers are to: • Form valid inferences about student progress • Target the right students for intervention and supplemental instruction • The problem is not one of reliability, i.e., low correlations between alternate forms • The problem is one of inconsistency in scaling across forms

  32. Conclusions (cont.) • These form effects adversely affect the reliability and validity of slope estimates. • The problem is not unique to DIBELS, nor to CBM, but it has been ignored in this literature. • CBM was chosen for this study because of its popularity for PM assessment. • The CBM literature implies that fluency (WCPM) inherently provides a constant scale.

  33. For WCPM to provide a constant scale, forms must be parallel • Because forms are not parallel in practice, a more viable solution is to remove “form effects” through scaling of the raw ORF scores • We have to develop a scale score that takes “form difficulty” into account • One potential solution is equipercentile equating (sketched below)
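As a rough sketch of the idea, not the authors' implementation, equipercentile equating maps a raw score on one form to the score holding the same percentile rank on a reference form; the data below are simulated:

```python
import numpy as np

def equipercentile_equate(score, form_scores, ref_scores):
    """Map a raw score on one form to the reference-form score holding the
    same percentile rank (a simplification of operational equating)."""
    pct = np.mean(form_scores <= score) * 100     # percentile rank on this form
    return float(np.percentile(ref_scores, pct))  # same percentile on reference

# Simulated example: form B runs about 8 WCPM harder than reference form A.
rng = np.random.default_rng(3)
form_a = rng.normal(70, 15, 500)   # reference ("standard") form
form_b = form_a - 8                # harder alternate form
print(equipercentile_equate(60.0, form_b, form_a))  # ~68: adjusted upward
```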

  34. Progress Monitoring • Solution is to empirically equate forms and develop a scale score metric that factors out form differences • Because of the large number of forms in use, we propose a “FEDEX” model that equates all forms to a single standard form based on percentiles
