
Research and Evaluation Evidence

Research and Evaluation Evidence. Data from Baton Rouge and Around Louisiana.




Presentation Transcript


  1. Research and Evaluation Evidence Data from Baton Rouge and Around Louisiana Note: In this document, the model is called both STEEP and PAM. STEEP is the generic name for the process, and PAM is the name associated with the process in five Louisiana parishes. You may also hear the term CLASS, which is the project title in Baton Rouge. For purposes of understanding the chat session, they are the same and refer to the same basic procedures, with slight differences in implementation to fit the needs of each district.

  2. Data on Individual Classes • The next slides show data from actual classes • They illustrate typical types of concerns • Classwide problem • Won't-do problem

  3. [Chart: bars represent each child's reading score; a line marks the national norm for this grade; a light-colored line shows the Can't Do/Won't Do assessment; the referred child is marked.] The referred child is low, but so are many others!

  4. Example of a child where the Can't Do/Won't Do assessment suggested a won't-do problem

  5. Example of a class where all children were below the national standard. This is a classwide problem

  6. Normal Referral: One Child Stands Out, Lower Than the Others [Chart legend: Referred Child; National Norm]

  7. Classwide Problem: the Referred Child Does Not Stand Out [Chart legend: Referred Child; National Norm]

  8. Won't-Do Problem: Performance Improves if It Matters [Chart legend: Can't Do/Won't Do Score; Referred Child; National Norm]

  9. Typical Class with 2 Children Not Making Progress STEEP can be used 3-4 times per year to MONITOR PROGRESS of the whole class and identify children who need help EARLY, before they get referred. The two bottom children are not making good progress like the rest of the class; they need intervention.

  10. Evaluation of the Process—What are the Important Questions? • How sure can we be that a child should not be evaluated for special education? • Is this more efficient than the old way? • Remember this is a Screening Process to See who Gets a Full Evaluation

  11. How accurate is STEEP? Teachers accurate only 1 in 5 times! Is teacher referral based more on perception than fact? STEEP correctly identified kids who needed to be referred 53% of the time, compared to the teacher being correct only 19% of the time. Both the teacher and STEEP were very accurate when saying a child did not have a problem.
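The accuracy figures above are positive predictive values: of the children each referral source flagged, what fraction truly needed a full evaluation. A minimal sketch, using hypothetical counts chosen only to reproduce the reported 53% and 19% rates (the slide does not give the underlying numbers):

```python
# Sketch of referral accuracy (positive predictive value).
# The counts below are HYPOTHETICAL illustrations; the slide reports
# only the resulting percentages (STEEP 53%, teacher 19%).

def referral_accuracy(true_positives: int, false_positives: int) -> float:
    """Fraction of referred children who truly needed a full evaluation."""
    referred = true_positives + false_positives
    return true_positives / referred if referred else 0.0

# Hypothetical counts per 100 referrals:
steep_acc = referral_accuracy(true_positives=53, false_positives=47)
teacher_acc = referral_accuracy(true_positives=19, false_positives=81)
print(f"STEEP: {steep_acc:.0%}, teacher: {teacher_acc:.0%}")  # STEEP: 53%, teacher: 19%
```

Note that both sources can still be "very accurate" on non-referrals, since negative predictive value is computed from a different cell of the same table.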

  12. Comparison of errors by referral source

  13. How acceptable did teachers find this process at XXX school? Of 5 teachers, all but 1 found the process acceptable.

  14. ITBS—Useful for School-Wide Planning With Principals STEEP probes correlate highly with the Iowa (ITBS) and Brigance tests.

  15. In Next Slides you will see Baton Rouge Data • Previous data were collected as part of a large research project. • Next slides reflect efforts in Baton Rouge during 2000-2001 school year.

  16. Training • 20 pupil appraisal personnel trained to • Conduct classwide probes • Score classwide probes • Conduct can’t do/won’t do assessments • Implementation at • Target Schools • Home schools

  17. Activities • Three types of activities were conducted • Assessment • Of student academic performance • Evaluation • Of the data collected • Intervention • Based on needs identified through assessment

  18. Assessment • Assessments were conducted • School wide • Class wide • For individual referrals • At pilot elementary schools • Three school-wide assessments distributed across the school year and numerous class-wide assessments in reading, math, and writing. STEEP constantly monitored the progress of all children to make sure they were learning.

  19. Evaluation • Evaluation of assessment data • To identify individuals or groups in need of intervention. • As progress monitoring to determine intervention effectiveness • Data analysis of probe validity • Creation of school based benchmarks

  20. Intervention • Interventions were conducted for • School wide problems • Polk writing intervention • Class wide problems • Polk reading intervention • For individual student problems • Several interventions conducted by school based consultants along with collaboration with the LSU Behavior Intervention Team (N=40 approx)

  21. Principal used our data to start and monitor school wide writing program: notice improvement from first testing to second testing relative to national gold standard

  22. Reading Progress over the year for one class

  23. Validity of Probes for EBR • 4th grade • Correlations with the Iowa Test of Basic Skills (ITBS) • Math: .71, significant at the .01 level (2-tailed) • Reading: .75, significant at the .01 level (2-tailed)

  24. Individual Interventions • Approximately 40 interventions • Data on approximately 20 • Average intervention length: 11 days • All children were in the frustration range at the beginning of intervention • All children improved, with about 80% moving into the instructional range.

  25. Computer-Delivered Reading Intervention • Issues with teacher delivery • Sometimes not done • Sometimes done incorrectly • Computer delivery • Done accurately • Done often

  26. Number of New Words Learned Each Day Across 3 Stories Results of computer administered intervention.

  27. Avg Number of New Words Learned Each Day for All Children Across 3 Stories

  28. Summary of Computer • Effectiveness equivalent • Ease of use better • Integrity of delivery far better

  29. Benchmarking • A benchmark can be a goal or standard. • How is standard set? • By examining data. • Iowa or LEAP scores associated with an easily available measure such as probes • What probe score do you need to “pass” the LEAP?

  30. From Data This Year We Know • 100% of kids who read more than 78 words in one minute on our reading probe passed the LEAP (state test). • Math: 92% of children who scored more than 46 correct on our probe passed the LEAP.
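One simple way a cut score like "more than 78 words per minute" can be derived from such data is to find the lowest probe score above which every student in the sample passed the state test. A sketch with hypothetical records (the actual derivation method and data are not given in the slides):

```python
# Deriving a benchmark cut score from paired (probe score, passed LEAP)
# records: the smallest score s such that every student scoring above s
# passed. The sample below is HYPOTHETICAL.

def benchmark_cutoff(records):
    """records: list of (probe_score, passed_leap) pairs.
    Returns the highest probe score among students who failed, so that
    everyone scoring above it passed."""
    cutoff = 0
    for score, passed in records:
        if not passed and score > cutoff:
            cutoff = score  # cutoff must exceed every failing score
    return cutoff

sample = [(55, False), (62, False), (78, False), (80, True),
          (85, True), (90, True), (101, True)]
print(benchmark_cutoff(sample))  # 78: all students above 78 passed
```

This reproduces the "100% of kids above the cutoff passed" property; a criterion like the math probe's 92% pass rate would instead tolerate a small failure rate above the cutoff.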

  31. Benchmarking • Goal: Every child by February of 2002 will score above 78 on Reading Probes • Goal: Every child by February of 2002 will score above 46 on Math Probes

  32. Contact for Additional Information Baton Rouge • Aeneid Mason, Coordinator, Pupil Appraisal • 225-388-8784 • amason@isis.ebrps.subr.edu • Mailing Address • EBR Schools, 6550 Sevenoaks Ave., Goodwood Center • Baton Rouge, LA 70806

  33. Contact for Additional Information LSU • Joe Witt, LSU • 225-388-8784 • Web Site • http://bitwww1.psyc.lsu.edu/ • STEEP Manual • Functional Assessment Book • SoprisWest.com • Mailing Address • Dept. of Psychology, LSU, Baton Rouge, LA 70803
