slide1

Formative Assessment Overview: Specific Assessment Tools to Measure Student Literacy Skills and Behavior
Jim Wright
www.interventioncentral.org

Effective Formative Evaluation: The Underlying Logic…

  • What is the relevant academic or behavioral outcome measure to be tracked?
  • Is the focus the core curriculum or system, subgroups of underperforming learners, or individual struggling students?
  • What method(s) should be used to measure the target academic skill or behavior?
  • What goal(s) are set for improvement?
  • How does the school check up on progress toward the goal(s)?

slide3

Summative data is static information that provides a fixed ‘snapshot’ of the student’s academic performance or behaviors at a particular point in time. School records are one source of data that is often summative in nature—frequently referred to as archival data. Attendance data and office disciplinary referrals are two examples of archival records, data that is routinely collected on all students. In contrast to archival data, background information is collected specifically on the target student. Examples of background information are teacher interviews and student interest surveys, each of which can shed light on a student’s academic or behavioral strengths and weaknesses. Like archival data, background information is usually summative, providing a measurement of the student at a single point in time.

slide4

Formative assessment measures are those that can be administered or collected frequently—for example, on a weekly or even daily basis. These measures provide a flow of regularly updated information (progress monitoring) about the student’s progress in the identified area(s) of academic or behavioral concern. Formative data provide a ‘moving picture’ of the student; the data unfold through time to tell the story of that student’s response to various classroom instructional and behavior management strategies. Examples of measures that provide formative data are Curriculum-Based Measurement probes in oral reading fluency and Daily Behavior Report Cards.

Formative Assessment Defined

“Formative assessment [in academics] refers to the gathering and use of information about students’ ongoing learning by both teachers and students to modify teaching and learning activities. …. Today…there are compelling research results indicating that the practice of formative assessment may be the most significant single factor in raising the academic achievement of all students—and especially that of lower-achieving students.” p. 7

Source: Harlen, W. (2003). Enhancing inquiry through formative assessment. San Francisco, CA: Exploratorium. Retrieved on September 17, 2008, from http://www.exploratorium.edu/ifi/resources/harlen_monograph.pdf

Academic or Behavioral Targets Are Stated as ‘Replacement Behaviors’

“A problem solution is defined as one or more changes to the instruction, curriculum, or environment that function(s) to reduce or eliminate a problem.” p. 159

Source: Christ, T. (2008). Best practices in problem analysis. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 159-176).

School Instructional Time: The Irreplaceable Resource

“In the average school system, there are 330 minutes in the instructional day, 1,650 minutes in the instructional week, and 56,700 minutes in the instructional year. Except in unusual circumstances, these are the only minutes we have to provide effective services for students. The number of years we have to apply these minutes is fixed. Therefore, each minute counts and schools cannot afford to support inefficient models of service delivery.” p. 177

Source: Batsche, G. M., Castillo, J. M., Dixon, D. N., & Forde, S. (2008). Best practices in problem analysis. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 177-193).

Formative Assessment: Essential Questions…

1. What is the relevant academic or behavioral outcome measure to be tracked?

Problems identified for formative assessment should be:

  • Important to school stakeholders.
  • Measurable & observable.
  • Stated positively as ‘replacement behaviors’ or goal statements rather than as general negative concerns (Batsche et al., 2008).
  • Based on a minimum of inference (Christ, 2008).

Sources: Batsche, G. M., Castillo, J. M., Dixon, D. N., & Forde, S. (2008). Best practices in problem analysis. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 177-193). Christ, T. (2008). Best practices in problem analysis. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 159-176).

Academic or Behavioral Targets Are Stated as ‘Replacement Behaviors’

“The implementation of successful interventions begins with accurate problem identification. Traditionally, the student problem was stated as a broad, general concern (e.g., impulsive, aggressive, reading below grade level) that a teacher identified. In a competency-based approach, however, the problem identification is stated in terms of the desired replacement behaviors that will increase the student’s probability of successful adaptation to the task demands of the academic setting.” p. 178

Source: Batsche, G. M., Castillo, J. M., Dixon, D. N., & Forde, S. (2008). Best practices in problem analysis. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 177-193).

Inference: Moving Beyond the Margins of the ‘Known’

“An inference is a tentative conclusion without direct or conclusive support from available data. All hypotheses are, by definition, inferences. It is critical that problem analysts make distinctions between what is known and what is inferred or hypothesized….Low-level inferences should be exhausted prior to the use of high-level inferences.” p. 161

Source: Christ, T. (2008). Best practices in problem analysis. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 159-176).

Examples of High vs. Low Inference Hypotheses

The results of grade-wide benchmarking in reading show that a target 2nd-grade student can read aloud at approximately half the rate of the median child in the grade.

  • Low-Inference Hypothesis. The student needs to build reading fluency skills to become more proficient in decoding. [Diagram: this hypothesis rests mostly on what is known from the benchmarking data, with little unknown.]
  • High-Inference Hypothesis. The student has an auditory processing issue that prevents success in reading. The student requires a multisensory approach to reading instruction to address reading deficits. [Diagram: this hypothesis rests largely on unknowns beyond the benchmarking data.]

Adopting a Low-Inference Model of Reading Skills
  • 5 Big Ideas in Beginning Reading
    • Phonemic Awareness
    • Alphabetic Principle
    • Fluency with Text
    • Vocabulary
    • Comprehension

Source: Big ideas in beginning reading. University of Oregon. Retrieved September 23, 2007, from http://reading.uoregon.edu/index.php

Formative Assessment: Essential Questions…

2. Is the focus the core curriculum or system, subgroups of underperforming learners, or individual struggling students?

Apply the ‘80-15-5’ Rule (Christ, 2008):

  • If fewer than 80% of students are successfully meeting academic or behavioral goals, the formative assessment focus is on the core curriculum and general student population.
  • If no more than 15% of students are not successful in meeting academic or behavioral goals, the formative assessment focus is on small-group ‘treatments’ or interventions.
  • If no more than 5% of students are not successful in meeting academic or behavioral goals, the formative assessment focus is on the individual student.

Source: Christ, T. (2008). Best practices in problem analysis. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 159-176).
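The 80-15-5 decision rule can be sketched as a small function. This is an illustrative reading of the rule only; the function name, return labels, and exact threshold comparisons are assumptions, not part of the published source:

```python
def assessment_focus(pct_meeting_goals: float) -> str:
    """Map the percent of students meeting goals to a formative
    assessment focus, following the '80-15-5' rule (Christ, 2008).

    Simplified reading of the rule: below 80% successful -> examine
    the core curriculum; otherwise, up to ~15% unsuccessful -> small
    groups; up to ~5% unsuccessful -> the individual student.
    """
    pct_not_meeting = 100.0 - pct_meeting_goals
    if pct_meeting_goals < 80.0:
        return "core curriculum / general student population"
    elif pct_not_meeting > 5.0:
        return "small-group interventions"
    return "individual student"
```

For example, a screening in which 88% of students meet the benchmark would direct attention to small-group interventions for the remaining 12%.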

slide15

Baylor Elementary School: Grade Norms: Correctly Read Words Per Min: Sample Size: 23 Students

Group Norms: Correctly Read Words Per Min: Book 4-1: Raw Data

31 34 34 39 41 43 52 55 59 61 68 71 74 75 85 89 102 108 112 115 118 118 131

LOCAL NORMS EXAMPLE: Twenty-three 4th-grade students were administered oral reading fluency Curriculum-Based Measurement passages at the 4th-grade level in their school.

  • In their current number form, these data are not easy to interpret.
  • So the school converts them into a visual display—a box-plot—to show the distribution of scores and to convert the scores to percentile form.
  • When Billy, a struggling reader, is screened in CBM reading fluency, he shows a SIGNIFICANT skill gap when compared to his grade peers.
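The local norms can be computed directly from the 23 raw scores. A minimal Python sketch using exclusive-halves quartiles, which reproduce the median and quartile values shown on the box-plot slide:

```python
# Raw January benchmarking scores (correctly read words per minute):
scores = sorted([31, 34, 34, 39, 41, 43, 52, 55, 59, 61, 68, 71,
                 74, 75, 85, 89, 102, 108, 112, 115, 118, 118, 131])
n = len(scores)                       # 23 students

median = scores[n // 2]               # middle score for odd n -> 71
lower = scores[: n // 2]              # 11 scores below the median
upper = scores[n // 2 + 1 :]          # 11 scores above the median
q1 = lower[len(lower) // 2]           # 1st quartile -> 43
q3 = upper[len(upper) // 2]           # 3rd quartile -> 108

def percentile_rank(score: float) -> float:
    """Percent of the local norm group scoring below a given score."""
    return 100.0 * sum(s < score for s in scores) / n

# Billy's screening score of 19 CRW falls below every peer score:
billy_pct = percentile_rank(19)       # -> 0.0
```

The percentile-rank helper makes the skill gap concrete: Billy's 19 CRW per minute places him below the entire local distribution.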
slide16

[Box-plot of the group norms (Correctly Read Words-Book 4-1, January benchmarking, 23 students): Low Value = 31; 1st Quartile = 43; Median (2nd Quartile) = 71; 3rd Quartile = 108; Hi Value = 131. Billy’s score of 19 CRW per minute falls below the entire distribution. National reading norm: 112 CRW per minute.]

Source: Tindal, G., Hasbrouck, J., & Jones, C. (2005). Oral reading fluency: 90 years of measurement (Technical report #33). Eugene, OR: University of Oregon.

slide17

Team Activity: Formative Assessment and Your Schools

  • At your tables, discuss:
  • What kinds of formative measures your schools tend to collect most often.
  • How ‘ready’ your schools are to collect, interpret, and act on formative assessment data.
Formative Assessment: Essential Questions…

3. What method(s) should be used to measure the target academic skill or behavior?

Formative assessment methods should be as direct a measure as possible of the problem or issue being evaluated. These assessment methods can:

  • Consist of General Outcome Measures or Specific Sub-Skill Mastery Measures
  • Include existing (‘extant’) data from the school system

Curriculum-Based Measurement (CBM) is widely used to track basic student academic skills. Daily Behavior Report Cards (DBRCs) are increasingly used as one source of formative behavioral data.

Source: Burns, M. K., & Gibbons, K. A. (2008). Implementing response-to-intervention in elementary and secondary schools: Procedures to assure scientific-based practices. New York: Routledge.

Extant (Existing) Data (Chafouleas et al., 2007)
  • Definition: Information that is collected by schools as a matter of course.
  • Extant data comes in two forms:
    • Performance summaries (e.g., class grades, teacher summary comments on report cards, state test scores).
    • Student work products (e.g., research papers, math homework, PowerPoint presentations).

Source: Chafouleas, S., Riley-Tillman, T.C., & Sugai, G. (2007). School-based behavioral assessment: Informing intervention and instruction. New York: Guilford Press.

Advantages of Using Extant Data (Chafouleas et al., 2007)
  • Information already exists and is easy to access.
  • Students will not show ‘reactive’ effects when data is collected, as the information collected is part of the normal routine of schools.
  • Extant data is ‘relevant’ to school data consumers (such as classroom teachers, administrators, and members of problem-solving teams).

Source: Chafouleas, S., Riley-Tillman, T.C., & Sugai, G. (2007). School-based behavioral assessment: Informing intervention and instruction. New York: Guilford Press.

Drawbacks of Using Extant Data (Chafouleas et al., 2007)
  • Time is required to collate and summarize the data (e.g., summarizing a week’s worth of disciplinary office referrals).
  • The data may be limited and not reveal the full dimension of the student’s presenting problem(s).
  • There is no guarantee that school staff are consistent and accurate in how they collect the data (e.g., grading policies can vary across classrooms; instructors may have differing expectations regarding what types of assignments are given a formal grade; standards may fluctuate across teachers for filling out disciplinary referrals).
  • Little research has been done on the ‘psychometric adequacy’ of extant data sources.

Source: Chafouleas, S., Riley-Tillman, T.C., & Sugai, G. (2007). School-based behavioral assessment: Informing intervention and instruction. New York: Guilford Press.

Grades & Other Teacher Performance Summary Data (Chafouleas et al., 2007)
  • Teacher test and quiz grades can be useful as a supplemental method for monitoring the impact of student behavioral interventions.
  • Other data about student academic performance (e.g., homework completion, homework grades, etc.) can also be tracked and graphed to judge intervention effectiveness.

Source: Chafouleas, S., Riley-Tillman, T.C., & Sugai, G. (2007). School-based behavioral assessment: Informing intervention and instruction. New York: Guilford Press.

slide25

[Graph: Marc Ripley. Performance summary data charted at two-week intervals: 2-Wk (9/23/07), 4-Wk (10/07/07), 6-Wk (10/21/07), 8-Wk (11/03/07), 10-Wk (11/20/07), 12-Wk (12/05/07). From Chafouleas et al., 2007.]

Source: Chafouleas, S., Riley-Tillman, T.C., & Sugai, G. (2007). School-based behavioral assessment: Informing intervention and instruction. New York: Guilford Press.

Academic Measures Can Serve As Indicators of Improved Student Behavior

Academic measures (e.g., grades, CBM data) can be useful as part of the progress-monitoring ‘portfolio’ of data collected on a student because:

  • Students with problem behaviors often struggle academically, so tracking academics as a target is justified in its own right.
  • Improved academic performance generally correlates with reduced behavioral problems.
  • Individualized interventions for misbehaving students frequently contain academic components (as the behavior problems can emerge in response to chronic academic deficits). Academic progress-monitoring data helps the school to track the effectiveness of the academic interventions.
Commercial Tests: Limitations
  • Compare child to ‘national’ average rather than to class or school peers
  • Have unknown overlap with student curriculum, classroom content
  • Can be given only infrequently
  • Are not sensitive to short-term student gains in academic skills
Curriculum-Based Evaluation: Definition

“Whereas standardized commercial achievement tests measure broad curriculum areas and/or skills, CBE measures specific skills that are presently being taught in the classroom, usually in basic skills. Several approaches to CBE have been developed. Four common characteristics exist across these models:

  • The measurement procedures assess students directly using the materials in which they are being instructed. This involves sampling items from the curriculum.
  • Administration of each measure is generally brief in duration (typically 1-5 mins.)
  • The design is structured such that frequent and repeated measurement is possible and measures are sensitive to change.
  • Data are usually displayed graphically to allow monitoring of student performance.”

SOURCE: CAST Website: http://www.cast.org/publications/ncac/ncac_curriculumbe.html


slide34

Curriculum-Based Measurement/Assessment: Defining Characteristics:

  • Assesses preselected objectives from local curriculum
  • Has standardized directions for administration
  • Is timed, yielding fluency, accuracy scores
  • Uses objective, standardized, ‘quick’ guidelines for scoring
  • Permits charting and teacher feedback

Source: Wright, J. (1992). Curriculum-based measurement: A manual for teachers. Retrieved on September 4, 2008, from http://www.jimwrightonline.com/pdfdocs/cbaManual.pdf
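As a concrete illustration of the ‘timed, yielding fluency, accuracy scores’ characteristic, here is a hypothetical scoring helper; the function name and return format are my own, not taken from the CBM manual:

```python
def score_reading_probe(words_attempted: int, errors: int,
                        minutes: float = 1.0) -> dict:
    """Score a timed CBM oral reading probe (illustrative sketch).

    Yields the two score types named above: fluency (correctly read
    words per minute) and accuracy (percent of attempted words read
    correctly).
    """
    correct = words_attempted - errors
    return {
        "crw_per_min": correct / minutes,
        "accuracy_pct": 100.0 * correct / words_attempted,
    }

# A student who attempts 85 words in 1 minute with 5 errors:
result = score_reading_probe(85, 5)   # 80 CRW/min, ~94% accuracy
```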

CBM Student Reading Samples: What Difference Does Fluency Make?
  • 3rd Grade: 19 Words Per Minute
  • 3rd Grade: 70 Words Per Minute
  • 3rd Grade: 98 Words Per Minute
slide36

CBM Techniques have been developed to assess:

  • Reading fluency
  • Reading comprehension
  • Math computation
  • Writing
  • Spelling
  • Phonemic awareness skills
  • Early math skills
Measuring General vs. Specific Academic Outcomes
  • General Outcome Measures: Track the student’s increasing proficiency on general curriculum goals such as reading fluency. An example is CBM-Oral Reading Fluency (Hintze et al., 2006).
  • Specific Sub-Skill Mastery Measures: Track short-term student academic progress with clear criteria for mastery (Burns & Gibbons, 2008). An example is Letter Identification.

Sources: Burns, M. K., & Gibbons, K. A. (2008). Implementing response-to-intervention in elementary and secondary schools: Procedures to assure scientific-based practices. New York: Routledge. Hintze, J. M., Christ, T. J., & Methe, S. A. (2006). Curriculum-based assessment. Psychology in the Schools, 43, 45-56.

Assessing Basic Academic Skills: Curriculum-Based Measurement

Reading: These 3 measures all proved ‘adequate predictors’ of student performance on reading content tasks:

  • Reading aloud (Oral Reading Fluency): Passages from content-area texts: 1 minute.
  • Maze task (every 7th item replaced with multiple choice/answer plus 2 distracters): Passages from content-area texts: 2 minutes.
  • Vocabulary matching: 10 vocabulary items and 12 definitions (including 2 distracters): 10 minutes.

Source: Espin, C. A., & Tindal, G. (1998). Curriculum-based measurement for secondary students. In M. R. Shinn (Ed.) Advanced applications of curriculum-based measurement. New York: Guilford Press.

Assessing Basic Academic Skills: Curriculum-Based Measurement

Writing: CBM/Word Sequence is a ‘valid indicator of general writing proficiency’. It evaluates units of writing and their relation to one another. Successive pairs of ‘writing units’ make up each word sequence. The mechanics and conventions of each word sequence must be correct for the student to receive credit for that sequence. CBM/Word Sequence is the most comprehensive CBM writing measure.

Source: Espin, C. A., & Tindal, G. (1998). Curriculum-based measurement for secondary students. In M. R. Shinn (Ed.) Advanced applications of curriculum-based measurement. New York: Guilford Press.

Curriculum-Based Evaluation: Math Vocabulary

Format Option 1:

  • 20 vocabulary terms appear alphabetically in the right column. Items are drawn randomly from a ‘vocabulary pool’.
  • Randomly arranged definitions appear in the left column.
  • The student writes the letter of the correct term next to each matching definition.
  • The student receives 1 point for each correct response.
  • Each probe lasts 5 minutes.
  • 2-3 probes are given in a session.

Source: Howell, K. W. (2008). Best practices in curriculum-based evaluation and advanced reading. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 397-418).

Curriculum-Based Evaluation: Math Vocabulary

Format Option 2:

  • 20 randomly arranged vocabulary definitions appear in the right column. Items are drawn randomly from a ‘vocabulary pool’.
  • The student writes the name of the correct term next to each matching definition.
  • The student is given 0.5 point for each correct term and another 0.5 point if the term is spelled correctly.
  • Each probe lasts 5 minutes.
  • 2-3 probes are given in a session.

Source: Howell, K. W. (2008). Best practices in curriculum-based evaluation and advanced reading. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 397-418).
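A Format Option 1 probe is straightforward to generate and score programmatically. The sketch below follows the stated rules (20 alphabetized terms, shuffled definitions, 1 point per correct match); the function names and the shape of the vocabulary pool are hypothetical:

```python
import random
import string

def make_matching_probe(vocab_pool: dict, n_items: int = 20, seed=None):
    """Build a Format Option 1 matching probe from a vocabulary pool.

    vocab_pool maps terms to definitions. Returns the alphabetized,
    lettered term column, the shuffled definition column, and an
    answer key mapping each definition to its term's letter.
    """
    rng = random.Random(seed)
    terms = sorted(rng.sample(sorted(vocab_pool), k=n_items))
    letters = dict(zip(terms, string.ascii_uppercase))
    definitions = [vocab_pool[t] for t in terms]
    rng.shuffle(definitions)                  # randomized left column
    answer_key = {vocab_pool[t]: letters[t] for t in terms}
    return terms, definitions, answer_key

def score_probe(answer_key: dict, responses: dict) -> int:
    """Award 1 point for each correctly matched definition."""
    return sum(responses.get(d) == letter
               for d, letter in answer_key.items())
```

Fixing the random seed lets a school regenerate an identical probe; Format Option 2 would differ only in the response and scoring steps (written terms, half-points for spelling).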

Daily Behavior Report Cards (DBRCs) Are…
  • brief forms containing student behavior-rating items. The teacher typically rates the student daily (or even more frequently) on the DBRC. The results can be graphed to document student response to an intervention.
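As a sketch of how a run of DBRC ratings might be summarized for graphing, assuming a 0-100 rating scale (the scale, dates, and function are illustrative, not a prescribed format):

```python
def summarize_dbrc(ratings: dict) -> dict:
    """Summarize Daily Behavior Report Card ratings for graphing.

    ratings maps dates to a teacher rating (assumed 0-100 here).
    Returns the date/rating points to plot plus a mean for quick
    reference; the graphed series is what documents the student's
    response to the intervention.
    """
    return {
        "points": [(day, score) for day, score in ratings.items()],
        "mean_rating": sum(ratings.values()) / len(ratings),
    }

# Illustrative week of daily ratings:
week = {"5/5": 40, "5/6": 0, "5/7": 60, "5/8": 60, "5/9": 50}
summary = summarize_dbrc(week)
```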
Daily Behavior Report Cards Can Monitor…
  • Hyperactivity
  • On-Task Behavior (Attention)
  • Work Completion
  • Organization Skills
  • Compliance With Adult Requests
  • Ability to Interact Appropriately With Peers
slide49

[Daily Behavior Report Card: Daily Version. Student: Jim Blalock; Date: May 5; Teacher: Mrs. Williams; Rm 108.]

slide50

[Daily Behavior Report Card: Weekly Version. Student: Jim Blalock; Teacher: Mrs. Williams; Rm 108. Dates 05/05/07 through 05/09/07, with ratings 40, 0, 60, 60, 50.]
Student Case Scenario: Jim

Jim is a 10th-grade student who is failing his math course and is in danger of failing his English and science courses. Jim has been identified with ADHD. His instructional team meets with the RTI Team and lists the following academic and behavioral concerns for Jim.

  • Does not bring work materials to class
  • Fails to write down homework assignments
  • Sometimes does not turn in homework, even when completed
  • Can be non-compliant with teacher requests at times.
Formative Assessment: Essential Questions…

4. What goal(s) are set for improvement?

Goals are defined at the system, group, or individual student level. Goal statements:

  • Are worded in measurable, observable terms.
  • Include a timeline for achieving those goals.
  • Are tied to the formative assessment methods used to monitor progress toward the goal(s).
Single-Subject (Applied) Research Designs

“Single-case designs evolved because of the need to understand patterns of individual behavior in response to independent variables, and more practically, to examine intervention effectiveness. Design use can be flexible, described as a process of response-guided experimentation…, providing a mechanism for documenting attempts to live up to legal mandates for students who are not responding to routine instructional methods.” p. 71

Source: Barnett, D. W., Daly, E. J., Jones, K. M., & Lentz, F.E. (2004). Response to intervention: Empirically based special service decisions from single-case designs of increasing and decreasing intensity. Journal of Special Education, 38, 66-79.

Single-Subject (Applied) Research Designs: Steps

“The basic methods [of single-case designs] are

  • selecting socially important variables as dependent measures or target behaviors
  • taking repeated measures until stable patterns emerge so that participants may serve as their own controls (i.e., baseline)
  • implementing a well-described intervention or discrete intervention trials
  • continuing measurement of both the dependent and independent variables within an acceptable pattern of intervention application and/or withdrawal to detect changes in behavior and make efficacy attributions
  • graphically analyzing the results to enable ongoing comparisons of the student’s performance under baseline and intervention conditions, and
  • replicating the results to reach the ultimate goal of the dissemination of effective practices.”

Source: Barnett, D. W., Daly, E. J., Jones, K. M., & Lentz, F.E. (2004). Response to intervention: Empirically based special service decisions from single-case designs of increasing and decreasing intensity. Journal of Special Education, 38, 66-79.
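The baseline-versus-intervention comparison in steps 2-5 can be sketched minimally. The data and the median-based summary below are illustrative assumptions only; published single-case analyses use richer visual and statistical methods:

```python
def phase_summary(baseline: list, intervention: list) -> dict:
    """Compare intervention-phase scores to the baseline median.

    Reports the percent of intervention points exceeding the
    baseline median, one simple aid to the graphical comparison
    described in step 5 (not a full effect-size analysis).
    """
    ordered = sorted(baseline)
    base_median = ordered[len(ordered) // 2]  # upper-middle for even n
    above = sum(score > base_median for score in intervention)
    return {
        "baseline_median": base_median,
        "pct_above_baseline_median": 100.0 * above / len(intervention),
    }

# Hypothetical weekly CBM scores (correctly read words per minute):
result = phase_summary(baseline=[40, 42, 41, 43],
                       intervention=[48, 52, 55, 60])
```

Here every intervention point exceeds the baseline median, the pattern one hopes to see when a reading intervention is working.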

slide61

[Progress-monitoring graph: Jared: Intervention Phase 1: Weeks 1-6. Plotted CBM scores: W 1/22 = 71 CRW; W 1/29 = 77 CRW; M 2/3 = 75 CRW; Th 2/13 = 75 CRW; Th 2/27 = 79 CRW; F 3/7 = 82 CRW.]

Writing CBM Goals in Student IEPs (Wright, 1992)

Source: Wright, J. (1992). Curriculum-based measurement: A manual for teachers. Retrieved on September 4, 2008, from http://www.jimwrightonline.com/pdfdocs/cbaManual.pdf

IEP Goals for CBA/CBM: Reading

In [number of weeks until Annual Review], when given a randomly selected passage from [level and name of reading series] for 1 minute, the student will read aloud at [number] correctly read words with no more than [number] decoding errors.
IEP Goals for CBA/CBM: Written Expression

In [number of weeks until Annual Review], when given a story starter or topic sentence and 3 minutes in which to write, the student will write a total of: [number] of words, or [number] of correctly spelled words, or [number] of correct word/writing sequences.

IEP Goals for CBA/CBM: Spelling

In [number of weeks until Annual Review], when dictated randomly selected words from [level and name of spelling series or description of spelling word list] for 2 minutes, the student will write [number of correct letter sequences].
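Goal templates like these lend themselves to simple string builders, which keep IEP wording consistent across students. A sketch for the reading goal; the function name and parameters are my own, not from the manual:

```python
def reading_goal(weeks: int, passage_source: str,
                 target_crw: int, max_errors: int) -> str:
    """Fill in the CBA/CBM reading goal template above."""
    return (f"In {weeks} weeks, when given a randomly selected "
            f"passage from {passage_source} for 1 minute, the student "
            f"will read aloud {target_crw} correctly read words with "
            f"no more than {max_errors} decoding errors.")

# The bracketed placeholders stay as-is until the team fills them in:
goal = reading_goal(36, "[level and name of reading series]", 90, 3)
```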
Formative Assessment: Essential Questions…

5. How does the school check up on progress toward the goal(s)?

The school periodically checks the formative assessment data to determine whether the goal is being attained. Examples of this progress evaluation process include the following:

  • System-Wide: A school-wide team meets on a monthly basis to review the frequency and type of office disciplinary referrals to judge whether those referrals have dropped below the acceptable threshold for student behavior.
  • Group Level: Teachers at a grade level assemble every six weeks to review CBM data on students receiving small-group supplemental instruction to determine whether students are ready to exit (Burns & Gibbons, 2008).
  • Individual Level: A building problem-solving team gathers every eight weeks to review CBM data to evaluate a student’s response to an intensive reading fluency plan.

Sources: Burns, M. K., & Gibbons, K. A. (2008). Implementing response-to-intervention in elementary and secondary schools: Procedures to assure scientific-based practices. New York: Routledge.

Shinn, M. R. (1989). Curriculum-based measurement: Assessing special children. New York: Guilford.

Effective Formative Evaluation: The Underlying Logic…

  • What is the relevant academic or behavioral outcome measure to be tracked?
  • Is the focus the core curriculum or system, subgroups of underperforming learners, or individual struggling students?
  • What method(s) should be used to measure the target academic skill or behavior?
  • What goal(s) are set for improvement?
  • How does the school check up on progress toward the goal(s)?

Web Sites for Academic Progress Monitoring
  • National Center on Student Progress Monitoring (http://www.studentprogress.org/)
  • Curriculum-Based Measurement Warehouse (http://www.interventioncentral.org/htmdocs/interventions/cbmwarehouse.php)
  • DIBELS (https://dibels.uoregon.edu/)
  • AimsWeb (http://www.aimsweb.com/) [Pay Site]
  • EdCheckup (http://www.edcheckup.com/) [Pay Site]
slide75

National Center on Student Progress Monitoring http://www.studentprogress.org/

slide80

OKAPI CBM Reading Probe Generator: http://www.interventioncentral.org/htmdocs/tools/okapi/okapi.php

slide81

Team Activity: Formative Assessment and Your Schools

  • At your tables, discuss:
  • How the SETRC network can use the concepts and resources presented in this workshop in your daily practice.
  • What your first ‘action plan’ items might be to act on any of the workshop content presented.