Effective Formative Evaluation: The Underlying Logic… - PowerPoint PPT Presentation


  1. Formative Assessment: Specific Tools to Measure Student Academic Skills and Behavior. Jim Wright, www.interventioncentral.org

  2. What is the relevant academic or behavioral outcome measure to be tracked? • Is the focus the core curriculum or system, subgroups of underperforming learners, or individual struggling students? • What method(s) should be used to measure the target academic skill or behavior? • What goal(s) are set for improvement? • How does the school check up on progress toward the goal(s)? Effective Formative Evaluation: The Underlying Logic…

  3. Use Time & Resources Efficiently By Collecting Information Only on ‘Things That Are Alterable’ “…Time should be spent thinking about things that the intervention team can influence through instruction, consultation, related services, or adjustments to the student’s program. These are things that are alterable. …Beware of statements about cognitive processes that shift the focus from the curriculum and may even encourage questionable educational practice. They can also promote writing off a student because of the rationale that the student’s insufficient performance is due to a limited and fixed potential.” p. 359 Source: Howell, K. W., Hosp, J. L., & Kurns, S. (2008). Best practices in curriculum-based evaluation. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 349-362). Bethesda, MD: National Association of School Psychologists.

  4. School Instructional Time: The Irreplaceable Resource “In the average school system, there are 330 minutes in the instructional day, 1,650 minutes in the instructional week, and 56,700 minutes in the instructional year. Except in unusual circumstances, these are the only minutes we have to provide effective services for students. The number of years we have to apply these minutes is fixed. Therefore, each minute counts and schools cannot afford to support inefficient models of service delivery.” p. 177 Source: Batsche, G. M., Castillo, J. M., Dixon, D. N., & Forde, S. (2008). Best practices in problem analysis. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 177-193). Bethesda, MD: National Association of School Psychologists.

  5. Summative data is static information that provides a fixed ‘snapshot’ of the student’s academic performance or behaviors at a particular point in time. School records are one source of data that is often summative in nature—frequently referred to as archival data. Attendance data and office disciplinary referrals are two examples of archival records, data that is routinely collected on all students. In contrast to archival data, background information is collected specifically on the target student. Examples of background information are teacher interviews and student interest surveys, each of which can shed light on a student’s academic or behavioral strengths and weaknesses. Like archival data, background information is usually summative, providing a measurement of the student at a single point in time.

  6. Formative assessment measures are those that can be administered or collected frequently—for example, on a weekly or even daily basis. These measures provide a flow of regularly updated information (progress monitoring) about the student’s progress in the identified area(s) of academic or behavioral concern. Formative data provide a ‘moving picture’ of the student; the data unfold through time to tell the story of that student’s response to various classroom instructional and behavior management strategies. Examples of measures that provide formative data are Curriculum-Based Measurement probes in oral reading fluency and Daily Behavior Report Cards.

  7. Formative Assessment Defined “Formative assessment [in academics] refers to the gathering and use of information about students’ ongoing learning by both teachers and students to modify teaching and learning activities. … Today…there are compelling research results indicating that the practice of formative assessment may be the most significant single factor in raising the academic achievement of all students—and especially that of lower-achieving students.” p. 7 Source: Harlen, W. (2003). Enhancing inquiry through formative assessment. San Francisco, CA: Exploratorium. Retrieved on September 17, 2008, from http://www.exploratorium.edu/ifi/resources/harlen_monograph.pdf

  8. Formative Assessment: Essential Questions… 1. What is the relevant academic or behavioral outcome measure to be tracked? Problems identified for formative assessment should be: • Important to school stakeholders. • Measurable & observable. • Stated positively as ‘replacement behaviors’ or goal statements rather than as general negative concerns (Batsche et al., 2008). • Based on a minimum of inference (T. Christ, 2008). Sources: Batsche, G. M., Castillo, J. M., Dixon, D. N., & Forde, S. (2008). Best practices in problem analysis. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 177-193). Bethesda, MD: National Association of School Psychologists. Christ, T. (2008). Best practices in problem analysis. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 159-176). Bethesda, MD: National Association of School Psychologists.

  9. Academic or Behavioral Targets Are Stated as ‘Replacement Behaviors’ “The implementation of successful interventions begins with accurate problem identification. Traditionally, the student problem was stated as a broad, general concern (e.g., impulsive, aggressive, reading below grade level) that a teacher identified. In a competency-based approach, however, the problem identification is stated in terms of the desired replacement behaviors that will increase the student’s probability of successful adaptation to the task demands of the academic setting.” p. 178 Source: Batsche, G. M., Castillo, J. M., Dixon, D. N., & Forde, S. (2008). Best practices in problem analysis. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 177-193).

  10. Inference: Moving Beyond the Margins of the ‘Known’ “An inference is a tentative conclusion without direct or conclusive support from available data. All hypotheses are, by definition, inferences. It is critical that problem analysts make distinctions between what is known and what is inferred or hypothesized….Low-level inferences should be exhausted prior to the use of high-level inferences.” p. 161 Source: Christ, T. (2008). Best practices in problem analysis. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 159-176).

  11. Examples of High- vs. Low-Inference Hypotheses. The results of grade-wide benchmarking in reading show that a target 2nd-grade student can read aloud at approximately half the rate of the median child in the grade. • High-Inference Hypothesis: The student has an auditory processing issue that prevents success in reading; the student requires a multisensory approach to reading instruction to address reading deficits. • Low-Inference Hypothesis: The student needs to build reading fluency skills to become more proficient in decoding.

  12. Adopting a Low-Inference Model of Reading Skills • 5 Big Ideas in Beginning Reading • Phonemic Awareness • Alphabetic Principle • Fluency with Text • Vocabulary • Comprehension Source: Big ideas in beginning reading. University of Oregon. Retrieved September 23, 2007, from http://reading.uoregon.edu/index.php

  13. Formative Assessment: Essential Questions… 2. Is the focus the core curriculum or system, subgroups of underperforming learners, or individual struggling students? Apply the ‘80-15-5’ Rule (T. Christ, 2008): • If fewer than 80% of students are successfully meeting academic or behavioral goals, the formative assessment focus is on the core curriculum and general student population. • If no more than 15% of students are not successful in meeting academic or behavioral goals, the formative assessment focus is on small-group ‘treatments’ or interventions. • If no more than 5% of students are not successful in meeting academic or behavioral goals, the formative assessment focus is on the individual student. Source: Christ, T. (2008). Best practices in problem analysis. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 159-176). Bethesda, MD: National Association of School Psychologists.
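As a quick illustration, the ‘80-15-5’ decision rule above can be expressed as a short routine. This is a hypothetical sketch: the thresholds come from the slide, while the function name and return labels are invented for illustration.

```python
def assessment_focus(percent_meeting_goals: float) -> str:
    """Apply the '80-15-5' rule (T. Christ, 2008): decide where formative
    assessment should focus, given the percentage of students currently
    meeting academic or behavioral goals."""
    percent_struggling = 100.0 - percent_meeting_goals
    if percent_meeting_goals < 80.0:
        # Fewer than 80% successful: the core curriculum itself is the concern.
        return "core curriculum / general student population"
    elif percent_struggling > 5.0:
        # More than 5% (up to ~15%) unsuccessful: small-group interventions.
        return "small-group interventions"
    else:
        # 5% or fewer unsuccessful: focus on the individual student.
        return "individual student"

print(assessment_focus(72.0))  # core curriculum / general student population
print(assessment_focus(88.0))  # small-group interventions
print(assessment_focus(97.0))  # individual student
```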

  14. RTI Literacy: Assessment & Progress-Monitoring To measure student ‘response to instruction/intervention’ effectively, the RTI Literacy model measures students’ reading performance and progress on schedules matched to each student’s risk profile and intervention Tier membership. • Benchmarking/Universal Screening. All children in a grade level are assessed at least 3 times per year on a common collection of literacy assessments. • Strategic Monitoring. Students placed in Tier 2 (supplemental) reading groups are assessed 1-2 times per month to gauge their progress with this intervention. • Intensive Monitoring. Students who participate in an intensive, individualized Tier 3 reading intervention are assessed at least once per week. Source: Burns, M. K., & Gibbons, K. A. (2008). Implementing response-to-intervention in elementary and secondary schools: Procedures to assure scientific-based practices. New York: Routledge.

  15. Using Local Norms in Coordination with Benchmark Data

  16. LOCAL NORMS EXAMPLE: Baylor Elementary School, Grade 4 Norms: Correctly Read Words Per Minute, Book 4-1 (Sample Size: 23 Students). Raw data: 31, 34, 34, 39, 41, 43, 52, 55, 59, 61, 68, 71, 74, 75, 85, 89, 102, 108, 112, 115, 118, 118, 131. Twenty-three 4th-grade students were administered oral reading fluency Curriculum-Based Measurement passages at the 4th-grade level in their school. • In their current number form, these data are not easy to interpret. • So the school converts them into a visual display—a box-plot—to show the distribution of scores and to convert the scores to percentile form. • When Billy, a struggling reader, is screened in CBM reading fluency, he shows a SIGNIFICANT skill gap when compared to his grade peers.

  17. Group Norms Converted to Box-Plot [chart: box-plot of the January benchmarking data for Baylor Elementary School, Grade 4, Correctly Read Words, Book 4-1; scale 0-160 CRW]. Low value = 31; 1st quartile = 43; median (2nd quartile) = 71; 3rd quartile = 108; high value = 131. Billy = 19 CRW per minute; national reading norm = 112 CRW per minute. Source: Tindal, G., Hasbrouck, J., & Jones, C. (2005). Oral reading fluency: 90 years of measurement (Technical report #33). Eugene, OR: University of Oregon.
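The quartile cut points on this slide can be recomputed directly from the 23 raw benchmark scores. A minimal sketch using Python's standard library; the 'exclusive' quantile method happens to reproduce the slide's cut points exactly for this sample:

```python
import statistics

# January benchmarking: correctly read words per minute, Book 4-1 (n = 23)
scores = [31, 34, 34, 39, 41, 43, 52, 55, 59, 61, 68, 71,
          74, 75, 85, 89, 102, 108, 112, 115, 118, 118, 131]

# Quartiles of the local norm group.
q1, median, q3 = statistics.quantiles(scores, n=4, method="exclusive")
print(f"1st quartile = {q1}, median = {median}, 3rd quartile = {q3}")
# 1st quartile = 43.0, median = 71.0, 3rd quartile = 108.0

# Billy's screening score falls below every score in the local norm group.
billy = 19
rank = sum(s < billy for s in scores)
print(f"Billy ({billy} CRW/min) outperforms {rank} of {len(scores)} grade peers")
```

Because Billy's 19 CRW/min sits below the entire local distribution, the box-plot makes his skill gap visible at a glance.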

  18. Team Activity: Formative Assessment and Your Schools • At your tables, discuss: • What kinds of formative measures your schools tend to collect most often. • How ‘ready’ your schools are to collect, interpret, and act on formative assessment data.

  19. Formative Assessment: Essential Questions… 3. What method(s) should be used to measure the target academic skill or behavior? Formative assessment methods should be as direct a measure as possible of the problem or issue being evaluated. These assessment methods can: • Consist of General Outcome Measures or Specific Sub-Skill Mastery Measures • Include existing (‘extant’) data from the school system Curriculum-Based Measurement (CBM) is widely used to track basic student academic skills. Daily Behavior Report Cards (DBRCs) are increasingly used as one source of formative behavioral data. Source: Burns, M. K., & Gibbons, K. A. (2008). Implementing response-to-intervention in elementary and secondary schools: Procedures to assure scientific-based practices. New York: Routledge.

  20. Formal Tests: Only One Source of Student Assessment Information “Tests are often overused and misunderstood in and out of the field of school psychology. When necessary, analog [i.e., test] observations can be used to test relevant hypotheses within controlled conditions. Testing is a highly standardized form of observation. …The only reason to administer a test is to answer well-specified questions and examine well-specified hypotheses. It is best practice to identify and make explicit the most relevant questions before assessment begins. …The process of assessment should follow these questions. The questions should not follow assessment.” p. 170 Source: Christ, T. (2008). Best practices in problem analysis. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 159-176). Bethesda, MD: National Association of School Psychologists.

  21. Making Use of Existing (‘Extant’) Data

  22. Extant (Existing) Data (Chafouleas et al., 2007) • Definition: Information that is collected by schools as a matter of course. • Extant data comes in two forms: • Performance summaries (e.g., class grades, teacher summary comments on report cards, state test scores). • Student work products (e.g., research papers, math homework, PowerPoint presentation). Source: Chafouleas, S., Riley-Tillman, T.C., & Sugai, G. (2007). School-based behavioral assessment: Informing intervention and instruction. New York: Guilford Press.

  23. Advantages of Using Extant Data (Chafouleas et al., 2007) • Information already exists and is easy to access. • Students will not show ‘reactive’ effects when data is collected, as the information collected is part of the normal routine of schools. • Extant data is ‘relevant’ to school data consumers (such as classroom teachers, administrators, and members of problem-solving teams). Source: Chafouleas, S., Riley-Tillman, T.C., & Sugai, G. (2007). School-based behavioral assessment: Informing intervention and instruction. New York: Guilford Press.

  24. Drawbacks of Using Extant Data (Chafouleas et al., 2007) • Time is required to collate and summarize the data (e.g., summarizing a week’s worth of disciplinary office referrals). • The data may be limited and not reveal the full dimension of the student’s presenting problem(s). • There is no guarantee that school staff are consistent and accurate in how they collect the data (e.g., grading policies can vary across classrooms; instructors may have differing expectations regarding what types of assignments are given a formal grade; standards may fluctuate across teachers for filling out disciplinary referrals). • Little research has been done on the ‘psychometric adequacy’ of extant data sources. Source: Chafouleas, S., Riley-Tillman, T.C., & Sugai, G. (2007). School-based behavioral assessment: Informing intervention and instruction. New York: Guilford Press.

  25. Curriculum-Based Measurement: Assessing Basic Academic Skills

  26. Curriculum-Based Measurement: Advantages as a Set of Tools to Monitor RTI/Academic Cases • Aligns with curriculum goals and materials • Is reliable and valid (has ‘technical adequacy’) • Is criterion-referenced: sets specific performance levels for specific tasks • Uses standard procedures to prepare materials, administer, and score • Samples student performance to give objective, observable ‘low-inference’ information about student performance • Has decision rules to help educators to interpret student data and make appropriate instructional decisions • Is efficient to implement in schools (e.g., training can be done quickly; the measures are brief and feasible for classrooms, etc.) • Provides data that can be converted into visual displays for ease of communication Source: Hosp, M.K., Hosp, J. L., & Howell, K. W. (2007). The ABCs of CBM. New York: Guilford.

  27. CBM Student Reading Samples: What Difference Does Fluency Make? • 3rd Grade: 19 Words Per Minute • 3rd Grade: 70 Words Per Minute • 3rd Grade: 98 Words Per Minute
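Oral reading fluency scores like those above are conventionally reported as correctly read words per minute: words attempted minus errors, prorated to a one-minute rate when the timing differs from 60 seconds. A minimal sketch of that arithmetic (the function name and sample numbers are illustrative, not from the slides):

```python
def correctly_read_words_per_minute(words_attempted: int,
                                    errors: int,
                                    seconds: float = 60.0) -> float:
    """Score a CBM oral reading fluency probe: words read correctly,
    prorated to a one-minute rate."""
    correct = words_attempted - errors
    return correct * 60.0 / seconds

# A student attempts 104 words in 60 seconds with 6 errors:
print(correctly_read_words_per_minute(104, 6))  # 98.0
# The same accuracy over a 45-second partial probe prorates upward:
print(round(correctly_read_words_per_minute(78, 4, 45.0), 1))  # 98.7
```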

  28. CBM techniques have been developed to assess: • Phonemic awareness skills • Reading fluency • Reading comprehension • Early math skills • Math computation • Math applications & concepts • Writing • Spelling

  29. CBM Math Measures: Selected Sources • AimsWeb (http://www.aimsweb.com) • Easy CBM (http://www.easycbm.com) • iSteep (http://www.isteep.com) • EdCheckup (http://www.edcheckup.com) • Intervention Central (http://www.interventioncentral.org)

  30. Measuring General vs. Specific Academic Outcomes • General Outcome Measures: Track the student’s increasing proficiency on general curriculum goals such as reading fluency. Example: CBM-Oral Reading Fluency (Hintze et al., 2006). • Specific Sub-Skill Mastery Measures: Track short-term student academic progress with clear criteria for mastery (Burns & Gibbons, 2008). Example: Letter Identification. Sources: Burns, M. K., & Gibbons, K. A. (2008). Implementing response-to-intervention in elementary and secondary schools: Procedures to assure scientific-based practices. New York: Routledge. Hintze, J. M., Christ, T. J., & Methe, S. A. (2006). Curriculum-based assessment. Psychology in the Schools, 43, 45-56.

  31. Monitoring Student Academic Behaviors: Daily Behavior Report Cards

  32. Daily Behavior Report Cards (DBRCs) Are… • brief forms containing student behavior-rating items. The teacher typically rates the student daily (or even more frequently) on the DBRC. The results can be graphed to document student response to an intervention.

  33. Daily Behavior Report Cards Can Monitor… • Hyperactivity • On-Task Behavior (Attention) • Work Completion • Organization Skills • Compliance With Adult Requests • Ability to Interact Appropriately With Peers

  34. Daily Behavior Report Card: Daily Version [sample form: Jim Blalock, May 5, Mrs. Williams, Rm 108]

  35. Daily Behavior Report Card: Weekly Version [sample form: Jim Blalock, Mrs. Williams, Rm 108; daily entries dated 05/05/07 through 05/09/07 with ratings of 40, 0, 60, 60, and 50]

  36. Daily Behavior Report Card: Chart
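To graph DBRC results as described above, a week of daily ratings can be reduced to one summary point per behavior item and plotted over time. A small sketch with hypothetical ratings on a 0-100 scale (all item names and numbers are invented for illustration):

```python
# Hypothetical week of Daily Behavior Report Card ratings (0-100 scale),
# one rating per behavior item per school day.
dbrc_week = {
    "On-task behavior":  [40, 55, 60, 60, 70],
    "Work completion":   [50, 50, 65, 70, 75],
    "Peer interactions": [60, 60, 55, 70, 80],
}

# Weekly average per item, for plotting as one point on a progress graph.
weekly_points = {item: sum(vals) / len(vals) for item, vals in dbrc_week.items()}
for item, avg in sorted(weekly_points.items()):
    print(f"{item}: weekly mean rating = {avg:.0f}")
```

Charting one such point per week produces the ‘moving picture’ of the student's response to intervention that formative assessment is meant to provide.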

  37. Formative Assessment: Essential Questions… 4. What goal(s) are set for improvement? Goals are defined at the system, group, or individual student level. Goal statements: • Are worded in measurable, observable terms. • Include a timeline for achieving those goals. • Are tied to the formative assessment methods used to monitor progress toward the goal(s).

  38. IEP Goal Statements for CBA/CBM

  39. Writing CBM Goals in Student IEPs (Wright, 1992) Source: Wright, J. (1992). Curriculum-based measurement: A manual for teachers. Retrieved on September 4, 2008, from http://www.jimwrightonline.com/pdfdocs/cbaManual.pdf

  42. IEP Goals for CBA/CBM: Reading. In [number of weeks until Annual Review], when given a randomly selected passage from [level and name of reading series] for 1 minute, Student will read aloud at [number] correctly read words with no more than [number] decoding errors.

  43. IEP Goals for CBA/CBM: Written Expression. In [number of weeks until Annual Review], when given a story starter or topic sentence and 3 minutes in which to write, Student will write a total of: [number] words, or [number] correctly spelled words, or [number] correct word/writing sequences.

  44. IEP Goals for CBA/CBM: Spelling. In [number of weeks until Annual Review], when dictated randomly selected words from [level and name of spelling series or description of spelling word list] for 2 minutes, Student will write [number of correct letter sequences].
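Each bracketed field in the goal templates above is filled in by the IEP team. A sketch of how the reading template becomes a concrete goal statement; every value below is hypothetical and not drawn from an actual IEP:

```python
# Reading goal template from the slides, with named placeholder fields.
READING_GOAL = ("In {weeks} weeks, when given a randomly selected passage "
                "from {series} for 1 minute, {student} will read aloud at "
                "least {crw} correctly read words with no more than "
                "{errors} decoding errors.")

# All field values are illustrative only.
goal = READING_GOAL.format(weeks=30,
                           series="the district's 4th-grade reading series",
                           student="the student",
                           crw=92,
                           errors=3)
print(goal)
```

Keeping the template separate from the filled-in values makes it easy to tie the same goal wording to the CBM probes used to monitor progress toward it.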

  45. Interpreting Data: The Power of Visual Display

  46. Sample Peer Tutoring Chart

  48. Single-Subject (Applied) Research Designs “Single-case designs evolved because of the need to understand patterns of individual behavior in response to independent variables, and more practically, to examine intervention effectiveness. Design use can be flexible, described as a process of response-guided experimentation…, providing a mechanism for documenting attempts to live up to legal mandates for students who are not responding to routine instructional methods.” p. 71 Source: Barnett, D. W., Daly, E. J., Jones, K. M., & Lentz, F.E. (2004). Response to intervention: Empirically based special service decisions from single-case designs of increasing and decreasing intensity. Journal of Special Education, 38, 66-79.

  49. 3 17 1 20 1 27 1 13 4 14 2 10 2 3 3 3 3 10 3 24 3 31 4 7 2 24 4 11 2 28 2 7 2 14 1 31 3 7 4 18 3 14 3 21 3 28 1 17 4 4 1 24 Jared: Intervention Phase 1: Weeks 1-6 X X F 3/7 82 CRW Th 2/27 79 CRW W 1/29 77 CRW Th 2/13 75 CRW M 2/3 75 CRW W 1/22 71 CRW