
Research-Based Math Instruction and Intervention


Presentation Transcript


  1. Research-Based Math Instruction and Intervention Scott Methe, Ph.D. University of Massachusetts at Boston Assistant Professor of School Psychology scott.methe@umb.edu 617-287-3167

  2. Overview of Today’s Talk: Five Key Points • Research-based math instruction and intervention is a hot topic: schools struggle to teach it and children struggle to learn it. • To be “research-based,” a program must have evidence supporting the idea that use of the program improves student achievement. • Therefore, the terms evidence-based and scientifically-based are clearer and more understandable. • We’ll spend time today disentangling what “evidence-based” means, what it implies for you and for struggling kids, and how to locate, evaluate, and implement programs that work. • Peppered into the conversation is a focus on the habits of mind necessary for implementing effective interventions in any content area.

  3. What is “Research-Based?” • This term has been used interchangeably with “scientifically-based” and “evidence-based.” • “Evidence-based” or “scientifically-based research” are better terms. • Why? Because research means many things and incorporates many methods. • Scientifically-based programs have internal validity (i.e., this causes that), which means that alternative explanations for student growth have been ruled out and the cause of growth has been isolated to the program. • “Evidence-based” can mean three things (Stoiber & Desmet, 2010): • An “evidence-based intervention (EBI)” has been directly tested in rigorous studies and demonstrated to work with a population / sample similar to the one where it will be used. • An “evidence-based practice (EBP)” is not necessarily a program but a set of practices applied with and to a program; these have not been directly tested, though many of their components have a strong basis in research. • “Evidence base applied to practice (EBAP)” describes a program that has not been demonstrated to work and may have questionable origins, but the community supports it and is therefore ethically responsible for evaluating it.

  4. The “Evidence-Based” Mindset • School administrators and key decision makers must engender a culture of scientific thinking and experimentation, using tools and techniques such as curriculum-based measures, to evaluate key outcomes and link these outcomes with teaching approaches. • For most students, it is very difficult, or simply unnecessary, to link input to output. • Thankfully, students who are progressing at average to above-average rates don’t necessarily need this standard of evidence. • Scientific thinking is therefore necessary for problem-solving, and thinking this way means thinking about specific students and specific problems rather than globally. • Scientific thinking is very similar to TEACHER EFFICACY: the belief that what you do as a teacher can cause a change in an individual student, rather than the faulty belief that for a student to change, they need to “want to change.” • A substantial body of research by Anita Woolfolk demonstrates that teacher efficacy is the root cause of successful schools. • An excellent resource for school teams is “Evaluating Educational Interventions” by T. Chris Riley-Tillman and Matthew Burns, available through Guilford Press. • It discusses techniques for evaluating school interventions and programs for individual students and for linking student progress with the intervention / instruction.

  5. See the resources on the wiki: “science_research” and “science-pseudo”

  6. Instructional Programs for Mathematics that Work and How to Find Them • What Works Clearinghouse (http://ies.ed.gov/ncee/wwc/) • Part of the Institute of Education Sciences • At this site, find “Intervention Reports.” In its glossary, the WWC defines an intervention as “an educational program, product, practice, or policy aimed at improving student outcomes.” • This definition is VERY different from my definition: “something you do that causes a change to a fixed trajectory of student or teacher behavior.” • We can AIM at improving student outcomes until the cows come home, but actually improving outcomes is a much different story and requires a level of evidence and rigor that many people shy away from. • This talk, and the act of picking programs, is therefore only one small piece of the science of implementation.

  7. Intervention Reports on Math Outcomes • The key to understanding what works is understanding the Improvement Index: • Defined as: “The expected change in percentile rank for an average comparison group student if the student had received the intervention.” • I.e., how much Johnny would have changed, using a standard ruler, if he had been part of the group that received this intervention / program. • It is hypothetical because adopting a program is a hypothesis: you buy in to (or actually buy) programs because you are GUESSING that they will work in your school. • Therefore, implementation integrity is key.

  8. Use “critical_components_checklist” from the wiki

  9. Improvement Index

  10. Improvement Index • In the example on the previous slide, the estimated average impact of the intervention is a 0.4 standard deviation improvement in reading scores. • An average comparison group student (at the 50th percentile) would be expected to have scored 0.4 standard deviations above the mean if he or she had received the intervention, or at the 66th percentile of students. • The resulting improvement index is +16, corresponding to moving performance for the average student from the 50th to the 66th percentile of the comparison group distribution.
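
That arithmetic is easy to verify. Below is a minimal sketch (Python, for illustration only; `improvement_index` is a name coined here, not a WWC tool) that treats the comparison-group distribution as standard normal, per the example above, and converts an effect size into the index:

```python
# Minimal sketch of the Improvement Index arithmetic (illustrative only).
# Assumes the comparison-group distribution is standard normal, as in the
# slide's example; this is not an official WWC calculation tool.
from scipy.stats import norm

def improvement_index(effect_size_sd: float) -> float:
    """Expected percentile-rank change for an average (50th-percentile)
    comparison-group student had they received the intervention."""
    new_percentile = norm.cdf(effect_size_sd) * 100  # 0.4 SD -> ~65.5th
    return new_percentile - 50.0

print(round(improvement_index(0.4)))  # -> 16, matching the +16 above
```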

  11. What Works in Mathematics • The next three slides summarize interventions that work, don’t work, and could work if there were more evidence. • The programs listed as working or not working have medium-to-large or large bodies of evidence supporting the conclusions. • What works? Look for percentile gains larger than 5-10 points and a determination of potentially positive (+) or positive (++); see the sketch below. • What doesn’t work? Anything below those cutoffs. • What might work? Anything lacking evidence (which is a lot!).
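
For illustration, that sorting rule can be written out explicitly. This is a hypothetical sketch of the talk's rules of thumb, not an official WWC algorithm; the function name, threshold, and rating labels other than (+) and (++) are invented here:

```python
# Hypothetical sketch of the bucketing rule described above (not WWC code).

def classify_program(improvement_index: float, rating: str,
                     extent_of_evidence: str) -> str:
    """Bucket a WWC intervention report using the talk's rules of thumb."""
    if extent_of_evidence == "small":
        return "might work (needs more / higher-quality studies)"
    if improvement_index > 5 and rating in ("+", "++"):
        return "works"
    return "doesn't work"

print(classify_program(16, "++", "medium to large"))  # -> works
print(classify_program(2, "0", "large"))              # -> doesn't work
print(classify_program(23, "+", "small"))             # -> might work ...
```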

  12. Math Programs That Work • Elementary • None notable • Middle • I CAN Learn Pre-Algebra (core; 5th and 6th grades) • High School • None notable • Major obvious conclusion: many more quality research-to-practice partnerships need to happen. • Also: THERE IS SO MUCH WE DON’T KNOW!

  13. Math Programs That Don’t Work • Elementary • Saxon Elementary School Mathematics (core) • Scott Foresman / Addison Wesley (core) • Accelerated Math (Supplementary) • Middle • University of Chicago Math Project (Algebra; Core) • Could also go under “might work,” as the body of evidence is small. • High School • I CAN Learn Algebra (Core) • Cognitive Tutor (Supplementary)

  14. Math Programs That Might Work (if there were more / higher-quality studies) • Elementary • Bridges in Mathematics (Core) • Kumon Math (Core) • TERC Investigations (Core) • Middle • Singapore Mathematics • Odyssey Math • Destination Math • MathThematics • Accelerated Math • High School • University of Chicago School Mathematics Project (6 – 12 Integrated Program) • Caution: the extent of evidence is small but the results are very promising (an improvement index of +23). • Accelerated Math

  15. Ethical Violations? Lies? False Claims? • When looking at WWC Intervention Reports, programs like TERC Investigations and Mathematics in Context, among others, had many studies cited on their program websites (over 20, and in some cases up to 60), yet none of these studies met the criteria for inclusion. • Some aspect of each study made it unable to be replicated or validated (e.g., it wasn’t actually a study, lacked peer review, was not a treatment study, or was carried out by the company that sells the product). • It seems one could conclude that the program creators are trying to fool the public. • Watch out for programs that cite a lot of studies but none that meet evidence standards; these appear to be especially suspect and almost intentionally misleading.

  16. What Scientific Research Base??

  17. Wait a Minute, I Know the “Bad” Program Works!! • High-quality studies are often carried out by program publishers… • …but the extent to which researchers employed by the publishers can be compensated for the program’s success creates a conflict of interest that the WWC will not (and should not) entertain. • Therefore, independent studies, often carried out at universities, are critical for the success of educational programs and the children who receive these “treatments.”

  18. What to do if we are using a weak program? • Remember, there are TWO TYPES of weak programs: • those that do not work after sufficient testing and • those that could work but lack an evidence base. • If it’s a program listed under “doesn’t work,” then work toward scrapping it ASAP. View the WWC evidence for yourself. • Do NOT implement a program that doesn’t work with more integrity!! • If you believe in a program but it lacks evidence, partner with a local university that will engage in a rigorous study intended to do what the WWC is looking for: garner evidence that the program is responsible for student growth. • Ensure you have an RTI system in place, because RTI evaluates weak programs in practice.

  19. Evaluating Instructional Programs • Start with screening in general education. • Listen to Tom Jenkins! • If you’re not screening, you’re not intervening! • Using the National Center on Response to Intervention (rti4success.org), choose the right tools for your purposes and screen, instruct, and monitor progress. • Start by screening the entire school and establishing two key pieces of an RTI model: • Instructional intensity hypotheses • Progress monitoring schedules • USE the right CBM tools to REPLACE other assessment instruments; these tools have substantial predictive validity, and the NCRTI tells you which ones do.

  20. Evaluating Instructional Programs Using a Three-Tiered Model • The next two slides show the distribution of students falling at low (green), medium (yellow), and high (red) risk for not meeting the end of the year goals. • The percentages you see are based on cutoff scores on a valid and reliable screening measure (see rti4success.org). • Example is early numeracy in Kindergarten, but logic holds for other grades and levels. • Immediately after screening, for students in these groups, you must think about (a) allocating high-quality instruction and the intensity of instruction and (b) scheduling progress checks. • For a more in-depth reading, see the handout on the wiki “shapiro_roi” (ROI means rates of improvement).
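
To make the tiering logic concrete before the example slides, here is a minimal sketch that maps a screening score to the green / yellow / red groups described above. The cutoff values are placeholders invented for illustration; real cutoffs come from the validated screening measure you choose at rti4success.org.

```python
# Illustrative three-tier screening logic; cutoff scores are placeholders,
# not values from any real screening measure.
LOW_RISK_CUTOFF = 40   # at/above this score: green (low risk)
HIGH_RISK_CUTOFF = 20  # below this score: red (high risk)

def risk_tier(screening_score: float) -> str:
    """Map a screening score to a risk tier and instructional response."""
    if screening_score >= LOW_RISK_CUTOFF:
        return "green: low risk - core instruction only"
    if screening_score >= HIGH_RISK_CUTOFF:
        return "yellow: medium risk - core + supplementary instruction"
    return "red: high risk - immediate, high-quality, high-intensity instruction"

for score in (45, 28, 12):
    print(score, "->", risk_tier(score))
```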

  21. Instructional Intensity Following Screening • Red (high risk): immediate, high-quality, and high-intensity instruction • Yellow (medium risk): core + supplementary • Green (low risk): core only

  22. Progress Monitoring: Schedules & Materials • Weekly – Highest Instructional Level • Biweekly – Grade Level • 3 times / year
