
  1. How Do I Know If They Are Getting It?: Measuring Student Responsiveness to Reading and Writing Instruction. Dr. Michael Faggella-Luby, Dr. Natalie Olinghouse, Dr. Michael Coyne

  2. Research: Conduct school-based research on developing and evaluating evidence-based practices in literacy, behavior supports, and assessment. Translating Research to Practice: Support schools, districts, and states in adopting, implementing, and sustaining evidence-based practices.

  3. Overview • Introduction • Results from CT Reading Summit • RtI (SRBI) • Assessment • Early Reading • Adolescent Reading • Writing • Summary & Discussion

  4. CT Reading Gap

  5. CT Reading Gap

  6. Overview of Response to Intervention (RtI)

  7. Original logic: Public health & disease prevention (Larson, 1994) • Tertiary (FEW): Reduce complications, intensity, and severity of current cases • Secondary (SOME): Reduce current cases of students with literacy difficulties • Primary (ALL): Reduce new cases of students with literacy difficulties

  8. RtI: Defining Features

  9. RtI Applications Across Domains of Student Success

  10. Four Purposes for Assessment Screening – Assessments that are administered to determine which children are at risk for reading/writing difficulty and who will need additional intervention. Diagnosis – Assessments that help teachers plan instruction by providing in-depth information about students’ skills and instructional needs. Progress Monitoring – Assessments that determine if instruction or intervention is enabling students to make adequate progress. Evaluation – Assessments that provide a bottom-line evaluation of the effectiveness of the reading/writing program.

  11. Early Reading Assessment

  12. Oral Reading Fluency CBM: Curriculum-Based Measurement (http://dibels.uoregon.edu) (http://aimsweb.com) Count the number of words read correctly while a student reads aloud from grade-level text for 1 minute. “Because oral reading fluency reflects the complex orchestration of many different reading skills, it can be used in an elegant and reliable way to characterize overall reading expertise.” (Fuchs, Fuchs, Hosp, & Jenkins, 2002) Measures of oral reading fluency are highly correlated with reading comprehension in the primary grades.

  13. Total Words Read: 89 • Errors: 4 • Words Read Correctly: 85
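A minimal sketch of this scoring arithmetic in Python (the function name and the 1-minute default are illustrative; they are not part of the DIBELS or AIMSweb toolkits):

```python
def words_correct_per_minute(total_words_read: int, errors: int,
                             minutes: float = 1.0) -> float:
    """Score an ORF probe: words read correctly equals total
    words read minus errors; dividing by elapsed minutes gives
    WCPM (a no-op for the standard 1-minute probe)."""
    return (total_words_read - errors) / minutes

# The example from the slide: 89 total words, 4 errors -> 85 WCPM.
print(words_correct_per_minute(89, 4))  # 85.0
```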

  14. Oral Reading Fluency Relationship between reading fluency and comprehension: He had never seen dogs fight as these w___ c____ fought, and his first ex___ t__t him an unf___able l__n. It is true, it was a vi__ ex____, else he would not have lived to pr__it by it. Curly was the v___. They were camped near the log store, where she, in her friend__ way, made ad___ to a husky dog the size of a full-__ wolf, the ___not half so large as __he. __ere was no w__ing, only a leap in like a flash, a met__ clip of teeth , a leap out equal__ swift, and Curly’s face was ripped open from eye to jaw. (London)

  15. Oral Reading Fluency A student who does not read fluently • Even if she has good understanding, she will have difficulty with reading comprehension • If she also has difficulty with understanding, she will have even more difficulty with reading comprehension A student who does read fluently • If she has good understanding, her reading comprehension will be good • If she has difficulty with understanding, she will have difficulty with reading comprehension

  16. ORF: Normative Data

  17. ORF: Benchmark Goals

  18. ORF: Screening • 50% Low Risk (>77) • 30% Some Risk (53–76) • 20% High Risk (<53)
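A minimal sketch of a screening decision rule built on these cut points (the cut scores are the ones on this slide; the function name and label strings are illustrative, and the slide leaves a score of exactly 77 unassigned, so this sketch treats it as Some Risk):

```python
def orf_risk_category(wcpm: int) -> str:
    """Classify a screening score against the slide's cut points:
    >77 Low Risk, 53-76 Some Risk, <53 High Risk. A score of
    exactly 77 falls between the stated bands and is treated
    as Some Risk here."""
    if wcpm > 77:
        return "Low Risk"      # core instruction only
    if wcpm >= 53:
        return "Some Risk"     # supplemental support
    return "High Risk"         # intensive intervention

print(orf_risk_category(85))  # Low Risk
print(orf_risk_category(60))  # Some Risk
print(orf_risk_category(40))  # High Risk
```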

  19. ORF: Screening

  20. ORF: Screening

  21. ORF: Progress Monitoring [Chart: student progress-monitoring data plotted against an aimline, with a change in intervention marked]
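A minimal sketch of the aimline logic behind a chart like this one: the aimline interpolates from the baseline score to the goal, and each monitoring score is checked against it. The trigger used here (three consecutive points below the aimline prompt a change in intervention) is a common decision-rule convention, not something specified on the slide, and all names and numbers are illustrative:

```python
def aimline(baseline: float, goal: float, weeks: int) -> list:
    """Expected score for each week, interpolating linearly
    from the baseline to the end-of-period goal."""
    slope = (goal - baseline) / weeks
    return [baseline + slope * w for w in range(weeks + 1)]

def needs_intervention_change(scores, expected, run: int = 3) -> bool:
    """Flag an intervention change when `run` consecutive
    observed scores fall below the aimline."""
    below = 0
    for observed, target in zip(scores, expected):
        below = below + 1 if observed < target else 0
        if below >= run:
            return True
    return False

expected = aimline(baseline=40, goal=76, weeks=18)
weekly_wcpm = [40, 42, 41, 43, 42, 44]   # hypothetical monitoring data
print(needs_intervention_change(weekly_wcpm, expected[1:]))  # True
```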

  22. ORF: Evaluation First Grade Reading Outcomes Before School Changes

  23. ORF: Evaluation First Grade Reading Outcomes After School Changes

  24. Benefits & Limitations

  25. ORF: Summary • Screening – yes • Progress monitoring – yes, for code-based skills • Diagnosis – no • Evaluation – yes, but more for internal evaluation

  26. How do you make ORF useful? • Coordinate administration at a school-wide level • Assess all students 3 times per year; assess students who are at risk more often • Organize and manage data at the building level • Supplement ORF with other diagnostic measures • Use consistent data-based decision rules to make instructional decisions (fail-safe procedures) • Screening (who gets intervention) • Progress monitoring (how to intensify intervention)

  27. Adolescent Reading Assessment

  28. Gates-MacGinitie Reading Tests • Definition • Group-administered vocabulary and reading comprehension achievement assessment • Characteristics • Word meaning & passage-level comprehension • Norm-referenced (PR, GE, etc.) • Developmentally appropriate measures from K through adult • Two forms (S & T) for pre- and posttesting • 55 minutes (most levels) • Multiple scoring options

  29. Benefits & Limitations

  30. Summary • Screening – yes, but must be administered in a timely and reliable manner • Progress monitoring – no • Diagnosis – yes, with regard to adding reading achievement information • Evaluation – yes, allowing norm comparisons and grade-equivalent scores

  31. How do you make the Gates-MacGinitie useful? • Must be scored and used to make instructional placement decisions in a timely manner • Data should be clearly organized and easily summarized for student grouping and instructional decision making • Data should be available to classroom teachers and a school-wide data team • Ensure standardized administration and scoring so that results are reliable and valid

  32. Cloze • Definition • A timed sentence-level reading comprehension measure in which every nth word is removed • Example: AIMSweb Maze • 3-minute individual OR group administration • Standardized and normed* • Multiple choice for easy scoring • Includes grade-level Fall, Winter, and Spring benchmarks

  33. Example Maze
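The example image for this slide did not survive extraction. As a rough sketch of how a maze passage is constructed (every 7th word is replaced by a choice among the correct word and two distractors; the passage and the distractor strategy are illustrative and do not reproduce AIMSweb's actual item-construction rules):

```python
import random

def build_maze(text: str, n: int = 7, seed: int = 0):
    """Replace every nth word of `text` with a multiple-choice
    item: the correct word plus two distractors drawn from the
    original passage (commercial maze measures use more
    controlled distractor selection than this)."""
    rng = random.Random(seed)
    words = text.split()
    pool = list(words)            # untouched copy for distractors
    answer_key = []
    for i in range(n - 1, len(words), n):
        correct = pool[i]
        distractors = rng.sample([w for w in pool if w != correct], 2)
        choices = [correct] + distractors
        rng.shuffle(choices)
        answer_key.append((i, correct))
        words[i] = "(" + " / ".join(choices) + ")"
    return " ".join(words), answer_key

passage = ("The dog ran across the wide field and jumped over the "
           "old fence before resting in the cool shade of a tree")
maze_text, key = build_maze(passage)
print(maze_text)
```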

  34. Benefits & Limitations

  35. Summary • Screening – yes, but perhaps not in universal assessment • Progress monitoring – yes, but creating a usable system for interpretation/presentation is essential • Diagnosis – maybe, with regard to adding additional reading ability information • Evaluation – yes, allowing norm comparisons and an overall growth picture

  36. How do you make CBM Maze useful? • Decide what kind of information you hope to glean from the measures, whom to assess, and how often • Organizing materials, the administration schedule, data collection, scoring, and interpretation is essential • Data should be available to classroom teachers and a school-wide data team in a timely manner

  37. KU Descriptive Study Measures (Assessment Area / Measure)
  • Alphabetics: Woodcock Language Proficiency Battery-Revised (WLPB-R)
    • Decoding: WLPB-R Word Attack
    • Word identification: WLPB-R Word Identification
  • Fluency
    • Pace/Rate: Test of Word Reading Efficiency (TOWRE); Phonetic Decoding Efficiency (TOWRE)
    • Accuracy: Gray Oral Reading Tests-4 (GORT-4)
  • Vocabulary
    • Expressive: Peabody Picture Vocabulary Test III (PPVT-III)
    • Reading: WLPB-R Reading Vocabulary subtest
  • Comprehension
    • Reading comprehension: WLPB-R Passage Comprehension subtest; Gray Oral Reading Tests-4 (GORT-4)
    • Listening comprehension: WLPB-R Listening Comprehension subtest
  • The Learner
    • Motivation: Motivation for Reading Questionnaire (MRQ)
    • Hope: The Hope Scale
    • Achievement: Kansas State Assessment (KSA), Reading subtest

  38. Reading Component Profile [Chart: mean standard scores (scale 70–115) for Proficient (∆) vs. ASRS (◊) students across four components: Alphabetics (Word ID, Word Attack), Fluency (Rate, Accuracy, SWE, PDE), Vocabulary (PPVT, WLPB-R Reading Vocabulary, Listening Comprehension), and Comprehension (Passage Comprehension, Reading Comprehension). Scores from WLPB-R, GORT, TOWRE, and PPVT subtests; * statistically different]

  39. Writing Assessments

  40. Writing Assessments – Overview
  • Text level
    • Objective measures (CBM, research): number of words written; number of correctly spelled words; correct word sequences (see the sketch after this slide)
    • Subjective measures (portfolio, norm-referenced tests, large-scale assessments, research): holistic; analytic (e.g., 6 Traits); primary trait
  • Sentence/word level (norm-referenced tests, research, large-scale assessments)
    • Word spelling; editing; sentence fluency (syntax + production); sentence combining; vocabulary
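A minimal sketch of the three objective CBM counts listed under text-level measures above. Real correct-word-sequence (CWS) scoring also judges grammar, capitalization, and punctuation; this version only checks spelling against a word list, and the function name and sample dictionary are illustrative:

```python
def score_writing_sample(text: str, dictionary: set) -> dict:
    """Compute three objective writing-CBM counts: total words
    written, correctly spelled words, and correct word sequences
    (simplified here to adjacent pairs of correctly spelled words)."""
    words = text.split()
    spelled_ok = [w.strip(".,!?").lower() in dictionary for w in words]
    cws = sum(1 for a, b in zip(spelled_ok, spelled_ok[1:]) if a and b)
    return {
        "words_written": len(words),
        "correctly_spelled": sum(spelled_ok),
        "correct_word_sequences": cws,
    }

vocab = {"the", "dog", "ran", "fast", "and", "barked"}
print(score_writing_sample("The dog ran fazt and barked.", vocab))
# {'words_written': 6, 'correctly_spelled': 5, 'correct_word_sequences': 3}
```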

  41. Benefits & Limitations of Different Writing Measures

  42. Benefits & Limitations of Different Writing Measures

  43. Benefits & Limitations of Different Writing Measures

  44. Norm-Referenced Tests of Writing • Test of Written Language (TOWL-3) • Woodcock-Johnson III (WJ III) Writing Cluster • Wechsler Individual Achievement Test-2 (WIAT-2) Written Language Composite

  45. Typical Characteristics of Norm-Referenced Tests of Writing

  46. Benefits & Limitations of Norm-Referenced Tests of Writing

  47. Summary • Screening – no, tests can be lengthy to administer and score • Progress monitoring – no, tests can be lengthy to administer and score • Diagnosis – yes, the primary use of norm-referenced writing tests • Evaluation – no, tests can be lengthy to administer and score

  48. How do you make norm-referenced writing tests useful? • Choose an instrument that provides the type of information you need; different writing tests measure different skills • Ensure standardized administration and scoring so that results are reliable and valid • Carefully follow the scoring guidelines in the manual • If multiple people are scoring, provide training sessions • Include other assessments of writing to provide a complete picture of a student’s writing abilities

  49. Large-Scale Writing Assessments • 49 states currently have direct writing assessments • Grades 3-5: 37 states • Grades 6-8: 40 states • Grades 9-12: 36 states • Typical format • Oral, written, or pictorial prompt that introduces a topic for the written response • Most often a ‘stand-alone’ test; if combined with a subject area, writing quality is often not measured

  50. Typical Characteristics of Large-Scale Writing Assessments
