
CURRENT ASSESSMENTS



Presentation Transcript


  1. CURRENT ASSESSMENTS • What data on student performance is available? • How is it used in referral/assessment process? • What data needs are being met? • What data needs are not being met? • How do you know whether or not a program is benefiting a student(s)? CHRISTO/CSUS/246/2003/CBM

  2. CURRENT TRENDS • Use of IQ/Achievement discrepancy • Response to intervention • Need for monitoring of progress on a short-term basis • Office of Special Education • National Joint Council on Learning Disabilities • President’s Commission on Special Education

  3. ROOTS OF CBM • Deno at University of Minnesota Institute for Research on Learning Disabilities (Deno, 1986) • Effort to develop and validate simple methods for use in IEPs • CASP Presentation (Herdman, Leaman, and Chartard, 1990)

  4. ASSESSMENT NEEDS • Use curriculum • Short in duration • Multiple forms • Inexpensive • User friendly • Show improvement over time • Research based

  5. MEASURES IDENTIFIED • Considered simple measures in reading, written language and math • Data-based assessment • CBA

  6. CHARACTERISTICS OF ALL CBA MODELS • Test stimuli drawn from the curriculum • Repeated testing occurs over time • Useful in instructional planning

  7. CBA MODELS DIFFER IN TERMS OF: • Long vs. short term goals • Short term objectives • Task analysis • Emphasis on fluency • Use in time series analysis

  8. ADVANTAGES OVER OTHER TYPES OF CBA • Focus on end of year goal, not sequential skill analysis • Can evaluate variety of instructional methods • Automatically assesses retention and generalization • Don’t need to alter testing strategies • Avoid issues with measurement shift • Can be normed locally

  9. DESCRIPTION OF CBM • Normed assessment from which you can develop local criteria • Dynamic (sensitive) Indicator (correlates) of Basic Skills (not content areas) • Uses local curriculum • Formative evaluation • Use in a problem-solving model • Used at individual, class, and school levels

  10. CBM AS DYNAMIC INDICATORS OF BASIC SKILLS • DYNAMIC = sensitive to short term effects in assessing growth • INDICATORS = correlates of key behaviors indicative of overall academic performance • BASIC SKILLS = assess basic skills not content areas • (Mark Shinn, 1998)

  11. CBM IS A MEASURE OF PROFICIENCY • FLUENCY = ACCURACY + SPEED • Learning progresses from acquisition (through teaching) to accuracy (mastery) to fluency (proficiency)

  12. READING FLUENCY • Indicator of automaticity of important skills • Strong predictor of reading comprehension • Critical element of competent reading • Oral reading a better indicator than silent

  13. DIFFERENCES FROM TRADITIONAL MEASURES • Does not try to determine why a child is having trouble • But how different the child is from the norm • And whether he or she is getting better

  14. SHIFT TO PROBLEM SOLVING FOCUS • Disability vs. handicap • Educational problems as handicaps • Difference between performance and expectation

  15. CBM DOES NOT • Give national normative data • Provide broad-band information • Serve as a diagnostic tool • Although error analysis (or qualitative evaluation of reading) can be used to provide further information

  16. VALIDITY AND RELIABILITY • Construct validity • Theory and research support • Concurrent, criterion-related validity • Highest for reading • Test/retest reliability • Predictive/concurrent validity (Christo and Southwell, 2001; Good, Simmons, and Kame’enui, 2001; Marston, 1989; Shinn, Good, Knutson, Tilly, and Collins, 1992)

  17. LEGAL DEFENSIBILITY • Drawn directly from curriculum, so social and cultural bias is reduced • Reliability and validity are high • Answers need for instructional utility

  18. ACCOUNTABILITY • CBM can document effectiveness by showing change over time • Provides a baseline of performance to determine if related services are leading to change over time • Achievement and accountability decisions are made on basis of classroom performance

  19. STAFF ACCEPTANCE • Eliminated jargon and ambiguity • Procedures allowed staff to follow the intent of the law • Testing more relevant • Confidence in test results • Can compare to peers • Improved communication with parents • Motivating for students to see growth

  20. RESEARCH BASE • Use in informing instructional decisions • Math (Fuchs and Fuchs, 1989) • Reading (Good and Kaminski, 1996) • Use in identifying students at risk of academic difficulties • Use in re-integration decisions (Shinn, 1998) • Language minority students

  21. DEVELOPING PROBES • Developed from student’s actual curriculum • Allow for quick administration and scoring • Reading probes • Math probes • Spelling • Written language

  22. READING PROBE • One on one administration • Three one-minute tests • Score is number of correct words read • Errors noted • Median score • Grade level reading rates
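The scoring rule on slide 22 (three timed samples, take the median) is simple enough to sketch in code. This is an illustration only; the function name and sample scores are invented, not part of any CBM package:

```python
def reading_probe_score(words_correct):
    """Score a CBM reading probe: the median of three one-minute
    oral-reading samples (words correct per minute)."""
    assert len(words_correct) == 3, "CBM reading uses three passages"
    return sorted(words_correct)[1]  # middle value = median of three

# Hypothetical student: 42, 55, and 48 words correct on the three passages
print(reading_probe_score([42, 55, 48]))  # 48
```

Using the median rather than the mean keeps one unusually easy or hard passage from distorting the score.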

  23. MATH PROBE • Variety of types of problems the student will encounter • Group administration • Three to five minute test • “Correct digits” is the number of digits in the correct place on each problem
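The "correct digits" rule can be sketched as a place-by-place comparison against the answer key. This is a simplified illustration (published scoring manuals also handle work shown, decimals, and partially written answers), and all names here are hypothetical:

```python
def correct_digits(answer, key):
    """Count digits in the student's answer that match the key,
    comparing place by place from the right (ones, tens, ...)."""
    a, k = str(answer)[::-1], str(key)[::-1]
    return sum(1 for da, dk in zip(a, k) if da == dk)

def probe_score(responses, keys):
    """Total correct digits across all problems on the probe."""
    return sum(correct_digits(r, k) for r, k in zip(responses, keys))

# 127 vs. key 137: the ones and hundreds digits match, tens does not
print(correct_digits(127, 137))  # 2
```

Counting digits instead of whole answers gives partial credit, which makes the measure more sensitive to small gains over time.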

  24. OTHER SUBJECT AREAS • Spelling – correct letter sequences • Writing – Total words written, words spelled correctly
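Correct-letter-sequence (CLS) scoring counts adjacent letter pairs, including the word boundaries, so a perfectly spelled n-letter word earns n + 1 points. A rough positional sketch (full CLS scoring aligns letters more carefully when letters are inserted or omitted):

```python
def correct_letter_sequences(attempt, target):
    """Approximate CLS score: count adjacent character pairs
    (with ^ and $ marking the word boundaries) that match the
    target word position by position."""
    a = f"^{attempt.lower()}$"
    t = f"^{target.lower()}$"
    return sum(
        1
        for i in range(min(len(a), len(t)) - 1)
        if a[i] == t[i] and a[i + 1] == t[i + 1]
    )

# "recieve" for "receive": 5 of the 8 possible sequences are correct
print(correct_letter_sequences("recieve", "receive"))  # 5
```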

  25. POCKET CBM DATA • Cover Sheet • Quartile Distribution (the main graph) • Frequency of Scores (Curriculum Planning) • Percentile Rank • Rank Order • Teacher List • School-wide Progress

  26. DETERMINING NORMS • By hand • By spreadsheet • With Pocket CBM • On line programs • Other software
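For the hand or spreadsheet route, a local norm table is essentially a set of quartile cut points computed from the collected probe scores. A minimal sketch using only Python's standard library; the sample scores are invented:

```python
import statistics

def local_norms(scores):
    """Summarize a set of classroom or school probe scores as
    quartile cut points, the way a local CBM norm table would."""
    q1, median, q3 = statistics.quantiles(scores, n=4)
    return {"Q1": q1, "median": median, "Q3": q3}

# Hypothetical class of ten words-correct-per-minute scores
scores = [35, 48, 52, 60, 64, 71, 75, 82, 90, 95]
print(local_norms(scores))
```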


  28. STEP 1: INITIAL REFERRAL • Difference between student performance and expectation • Peer or norm referenced • Look for discrepancy ratio or cutoff • Make decision regarding further assessment

  29. DETERMINING DISCREPANCY • Discrepancy ratio is greater than 2 • Peer median/Student median • 100/40 = 2.5 • Criterion scores • Well below instructional range • Will vary with grade • Percentile rank
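The arithmetic on slide 29 (peer median divided by student median, flag ratios above 2) can be captured directly. The cutoff default below is the one given on the slide; the function names are illustrative:

```python
def discrepancy_ratio(peer_median, student_median):
    """Discrepancy ratio: peer median divided by the student's median."""
    return peer_median / student_median

def needs_further_assessment(peer_median, student_median, cutoff=2.0):
    """Flag students whose ratio exceeds the cutoff (slide uses 2)."""
    return discrepancy_ratio(peer_median, student_median) > cutoff

# Slide's worked example: peer median 100, student median 40
print(discrepancy_ratio(100, 40))         # 2.5
print(needs_further_assessment(100, 40))  # True
```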


  31. STEP 2: INVESTIGATE PROBLEM • How severe is the problem? • What general education services can be used? • Survey level assessment


  33. STEP 3: SETTING EXPECTATIONS/GOALS • Response to intervention model • SST • IEP • Long-term vs. short-term measurement • Determining goal (instructional range of classroom) • Peer referenced • Minimum competence • Expert judgment • Reasonable growth

  34. STEP 4: MONITORING PROGRESS • Establish baseline • Plotting growth • Aimlines and trendlines • By hand • Using Excel • Using commercial software
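An aimline connects the baseline score to the goal with a straight line, and the trendline is a least-squares fit through the observed scores; comparing their slopes shows whether growth is on pace. A hand-rolled sketch of both (commercial CBM software and a spreadsheet automate this); the weekly scores are invented:

```python
def aimline(baseline, goal, weeks):
    """Aimline: expected score at each weekly point along the
    straight line from baseline to goal."""
    slope = (goal - baseline) / weeks
    return [baseline + slope * w for w in range(weeks + 1)]

def trend_slope(scores):
    """Least-squares slope through weekly observed scores
    (units gained per week)."""
    n = len(scores)
    xbar = (n - 1) / 2
    ybar = sum(scores) / n
    num = sum((x - xbar) * (y - ybar) for x, y in enumerate(scores))
    den = sum((x - xbar) ** 2 for x in range(n))
    return num / den

# Baseline 30 WCPM, goal 60 WCPM in 10 weeks -> aimline rises 3/week
print(aimline(30, 60, 10)[:3])  # [30.0, 33.0, 36.0]
# Observed growth of about 2.14/week falls short of the aimline's 3/week
print(round(trend_slope([30, 31, 35, 34, 38, 41]), 2))  # 2.14
```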

  35. STEP 5: DECISION POINT • Is student within instructional range of classroom? (LRE) • Response to intervention model: • More in-depth assessment • More intensive services • In special education process • Exit special education • Re-consider services being provided

  36. Referral • Is the student performing significantly differently from his/her peers? • 4th grader, Malcolm • Reading 30 cwpm • Class median is 90 • Places him at the 20th percentile
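The percentile rank used in Malcolm's referral can be computed as the percent of classmates scoring at or below his score. The class scores in this sketch are invented for illustration; they are not the data behind the slide's 20th-percentile figure:

```python
def percentile_rank(score, class_scores):
    """Percent of classmates scoring at or below the given score."""
    at_or_below = sum(1 for s in class_scores if s <= score)
    return 100 * at_or_below / len(class_scores)

# Hypothetical class of ten: only one student reads 30 WCPM or fewer
print(percentile_rank(30, [30, 45, 60, 75, 90, 92, 95, 100, 110, 120]))  # 10.0
```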

  37. Investigate Problem • How severe is the problem? • Survey level assessment • In 3rd grade text at 25th percentile • In 2nd grade text at 35th percentile

  38. Setting Expectations/Goals • Where does he need to be? • End of year • To show progress • Expected rate of progress for effective intervention • What do we know about response rates for effective interventions? • Set goal for review

  39. Monitoring Progress • Is Malcolm making acceptable progress? • Meeting trendline? • Change goal?

  40. CLASS LEVEL • Provide teachers with class profile • Parent reports • Program evaluations • STAR Alternate assessment

  41. SCHOOL LEVEL • Screening: acts as safety net • Establish school wide norms • Information to new parents • Retention/summer school decisions

  42. WAYS TO IMPLEMENT CBM • One class • One school • A few teachers • Variety of ways to develop norms

  43. TWO TYPES OF CBM FOR PRIMARY STUDENTS • DIBELS for prereaders or delayed readers • onset recognition • phonemic segmentation • Oral reading probes for beginning readers

  44. DIBELS • Dynamic Indicators of Basic Early Literacy Skills • Skills important to development of literacy • Marker skills that can identify students needing early intervention

  45. ARE STUDENTS ACHIEVING FOUNDATIONAL SKILLS? • Good, Simmons, Kame’enui (2001) • Establish benchmarks • Use benchmarks to determine students at risk of not achieving next benchmark • Importance of fluency as opposed to accuracy • Other studies

  46. CONTINUUM OF SKILLS (Good, Simmons, Kame’enui) • Kindergarten • Phonological awareness (onset rhyme fluency, phonemic segmentation fluency) • First Grade • Alphabet principle (nonsense word fluency) • Accuracy and fluency with connected text (oral reading fluency) • Second Grade • Accuracy and fluency with connected text (oral reading fluency)


  50. IMPLEMENTATION • Decide on Model (individual, class, school, district) • Support from Stakeholders • Develop Timeline • Identify Personnel Resources • Staff Training • Computer Needs • Assessment • Distribution of Results
