
The Research Design


Presentation Transcript


  1. The Research Design Research for Better Schools Philadelphia, PA Jill Feldman, Ph.D., Director of Evaluation

  2. What research questions will we ask about MSRP impact? • Does MCLA affect core subject teachers’ knowledge and use of research-based literacy strategies? • What are the separate and combined effects of MCLA and Read 180 on students’ reading achievement levels, especially students identified as struggling readers? • What are the separate and combined effects of MCLA and Read 180 on students’ achievement in core subjects, especially students identified as struggling readers?

  3. What outcome measures will we use? • Iowa Test of Basic Skills (ITBS) • Vocabulary, Fluency, Comprehension • TCAP • Reading, Social Studies, Science, Mathematics • Gateway and End of Course Assessments • ELA, Mathematics, and Science

  4. What research questions will we ask about MSRP implementation? 1. To what degree do the implemented MCLA & R180 treatments match the intended program standards and features? 2. What contextual district and school level factors may be influencing the implementation of MCLA & R180? 3. How do the professional development events, materials, or structures present in the control schools compare to what is present in the treatment schools?

  5. Research Design for MCLA • 4 matched pairs of schools (N=8) randomly assigned to treatment (MCLA) or control (no MCLA) condition • Content area teachers in cohort 1 to participate in MCLA for Years 1 and 2 • Control group teachers (cohort 2) to participate in MCLA in Years 3 and 4
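The matched-pair randomization described above can be sketched as a simple coin flip within each pair. The school names, seed, and cohort labels below are illustrative placeholders, not identifiers from the study:

```python
import random

def assign_matched_pairs(pairs, seed=None):
    """For each matched pair of schools, randomly pick one for the
    treatment (MCLA); its partner becomes the control."""
    rng = random.Random(seed)
    assignment = {}
    for school_a, school_b in pairs:
        treated = rng.choice([school_a, school_b])
        control = school_b if treated == school_a else school_a
        assignment[treated] = "MCLA (cohort 1, Years 1-2)"
        assignment[control] = "control (cohort 2, MCLA in Years 3-4)"
    return assignment

# Hypothetical pairs -- the study used 4 matched pairs (N = 8 schools).
pairs = [("School A1", "School A2"), ("School B1", "School B2"),
         ("School C1", "School C2"), ("School D1", "School D2")]
assignment = assign_matched_pairs(pairs, seed=42)
```

Randomizing within matched pairs (rather than across all eight schools at once) guarantees a 4/4 treatment-control split and balances whatever characteristics the pairs were matched on.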

  6. MCLA: Random Assignment of Schools

  7. MCLA: Exploring Efficacy Attempts to address questions about whether or not MCLA can work • Depends upon rapid turnaround of data collected • Relies upon formative feedback to guide program revisions • Requires close collaboration among project stakeholders • To develop measures • To share information and data • To communicate regularly about changes and challenges • To troubleshoot and cooperatively address challenges

  8. Research Design for Read 180™ • Random assignment of “eligible” students enrolled at 8 SR schools, where eligibility means: • No prior participation in READ 180™ • Two or more grade levels behind in reading • Scores in bottom quartile on state assessment (TCAP) • READ 180™ is the treatment • Counterfactual (business as usual*) is the control
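The eligibility screen above can be sketched as a filter over student records, treating the three criteria as jointly required. Field names and the sample records are invented for illustration; they are not the study's actual data layout:

```python
def is_eligible(student):
    """A student at one of the 8 SR schools enters the Read 180 lottery
    only if they have never taken Read 180, read two or more grade
    levels behind, and scored in the bottom TCAP quartile."""
    return (not student["prior_read180"]
            and student["grade_level"] - student["reading_level"] >= 2
            and student["tcap_quartile"] == 1)

# Hypothetical records: only student 1 meets all three criteria.
students = [
    {"id": 1, "prior_read180": False, "grade_level": 8, "reading_level": 5, "tcap_quartile": 1},
    {"id": 2, "prior_read180": True,  "grade_level": 8, "reading_level": 5, "tcap_quartile": 1},
    {"id": 3, "prior_read180": False, "grade_level": 8, "reading_level": 7, "tcap_quartile": 1},
]
eligible_pool = [s["id"] for s in students if is_eligible(s)]
```

Students in the eligible pool are then randomly assigned to READ 180™ or the business-as-usual control.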

  9. Read 180: Random Assignment of Students

  10. Read 180: Exploring Effectiveness Attempts to address questions about whether or not Read 180 will work… • Provides evidence about what happens when R180 is implemented “off the shelf” (without formative evaluation support) • Requires MCS to set aside local need for feedback to address questions of importance to the field • Establishes a one-way firewall between MCS and RBS

  11. Please review the safety card in the seat pocket… • Balance local knowledge of students’ needs within the identified “eligible pool” without creating selection bias • Address high rates of student mobility • Accurately describe the counterfactual • Obtain parental consent (and students’ assent) to administer the ITBS • Design procedures to prevent crossover • Deal with (inevitable) startup delays

  12. Air Traffic Control: Did Random Assignment Work?

  13. Are the student groups comparable? • Students eligible for READ 180™: N = 2,277 • Total students in 8 SR schools: N = 6,170 • Students eligible as % of total: 36.9% • No differences in race, gender, ethnicity, or poverty level between conditions • Higher % of ELLs in control group (87 of 1,337 students, or 6.5%) than in R180™ (35 of 940 students, or 3.7%) • Higher % of Sp Ed 8th graders in R180™ group (28.2%) vs. control (20.9%)
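The ELL difference reported above can be checked by recomputing a two-proportion z-test from the slide's counts (87 of 1,337 control vs. 35 of 940 Read 180). This test is an illustration of how such a comparison works, not necessarily the analysis the evaluators ran:

```python
from math import sqrt

def two_proportion_z(x1, n1, x2, n2):
    """z statistic for the difference between two independent
    proportions, using the pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# ELLs: control 87 of 1,337 (6.5%) vs. Read 180 35 of 940 (3.7%)
z_ell = two_proportion_z(87, 1337, 35, 940)

# Share of the 8-school enrollment meeting eligibility criteria
eligible_pct = 100 * 2277 / 6170  # 36.9%
```

The z statistic comes out near 2.9, consistent with the slide's conclusion that the ELL imbalance is a real difference between conditions rather than noise.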

  14. What, how, and from whom should data be collected? • Use multiple measures and methods • Interview developers, instructors, coaches, & principals • Surveys of teacher knowledge and attitudes • Focus group discussions with teachers • Evaluator observations of PD sessions • Evaluator observations of classroom implementation • Use data to challenge/confirm findings from single sources • Share findings with key stakeholders to determine whether: • data collected are appropriate to support decision making • evaluation findings reflect actual experiences • revisions to the logic model, IC map, and/or instruments are needed

  15. Helen/Bob’s piece here…

  16. The Flight Plan: The MCLA Program Logic Model

  17. Memphis Content Literacy Academy Evaluation Logic Model

  Inputs: Funding, staff, curriculum resource center, facilities, incentives, research materials

  Activities
  • Principals: Attend four three-hour principal fellowship sessions each year for two (or four?) years; participate in motivational, recruitment, and celebratory events; discuss MCLA at faculty meetings; conduct walkthrough observations; provide opportunities for teacher collaboration; allocate space for CRC materials
  • Teachers: Attend # weekly MCLA training sessions; develop and implement 8 CAPs per year?; meet with coaches for feedback to improve implementation of MCLA strategies; integrate use of leveled texts to support development of content literacy among struggling readers
  • Students: Use MCLA strategies to read/react to content-related text (independently? in collaborative groups? neither? both?)

  Outputs
  • Principals: # hours of Principal Fellowship participation; # of MCLA events attended
  • Teachers: # of hours of MCLA training attended; # hours of coaching (contacts); # of CAPs implemented? observed? videotaped?; # of new lesson plans integrating literacy in content area lessons; # and type of materials checked out of CRC
  • Students: # classes taught by teachers participating in MCLA; # MCLA strategies students learn; # (freq?) of MCLA strategy use

  Short-term Outcomes
  • Principals: Awareness of and interest in staff implementation of MCLA concepts and strategies
  • Teachers: Increased knowledge of MCLA strategies; improved preparedness to use research-based literacy strategies to teach core academic content; increased use of direct, explicit instruction to teach research-based comprehension, fluency, and vocabulary strategies in content area classes; integrated use of MCLA strategies to support development of content literacy
  • Students: Increased familiarity with and use of MCLA strategies when engaging with text; increased internalization of literacy strategies; increased interest in school/learning

  Long-term Outcomes
  • Principals: Improved school climate; school-wide plans include focus on content literacy; improved instructional leadership
  • Teachers: Increased effectiveness supporting students’ content literacy development; continued collaboration among the community of teachers to develop and implement CAPs
  • Students: Improved reading achievement and content literacy: 10% increase in students scoring proficient in Reading/LA and other subject areas of TCAP; mean increase of five NCEs on ITBS (comprehension? vocab?)

  Overall goal: Higher quality teaching & student achievement

  18. Defining what will be evaluated Developing the MCLA Innovation Configuration (IC) Map • Involve diverse groups of stakeholders • The development team • The implementation team (MCS administrators & coaches) • Experienced users • Evaluators • Identify major components of MCLA • Provide observable descriptions of each component • Describe a range of implementation levels
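An IC map component can be represented as an ordered rubric: one named component with observable level descriptions running from ideal implementation down to non-use. The component name and level descriptions below are invented for illustration and are not taken from the actual MCLA IC map:

```python
# One hypothetical IC map component: level 1 is the ideal variation,
# the last level is non-use.
ic_component = {
    "component": "Use of direct, explicit strategy instruction",
    "levels": [
        "Models the strategy, explains when/why to use it, and guides practice",
        "Models the strategy but provides little guided practice",
        "Mentions the strategy without modeling it",
        "Does not reference literacy strategies",
    ],
}

def rate_observation(component, observed_level):
    """Return the rubric description matching an observed
    implementation level (1 = ideal, len(levels) = non-use)."""
    return component["levels"][observed_level - 1]
```

Writing each level as an observable description is what lets classroom observers and survey items (slides 20-27) be aligned to the same definitions of implementation.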

  19. MCLA: The Conceptual Framework

  20. Wheels Up: Resisting Premature Use of “Auto Pilot” With the IC map guiding development, the following measures were designed to collect data about MCLA implementation: • Surveys • Teacher knowledge about & preparedness to use MCLA strategies • Teacher demographic characteristics • Teachers’ MCLA Feedback • Interviews • Principals, coaches, development team, and MCS administrators • Teacher focus group discussions

  21. Operationally defining components: “Job Definition”

  22. Aligning the IC Map and Instrument Development: “Job Definition” – Teacher Survey

  23. “Job Definition” - Principal Interviews

  24. Where the rubber hits the runway… Classroom Implementation

  25. Operationally defining components: Implementation of Lesson Plans

  26. Implementation of lesson plans: Collecting classroom observation data

  27. Implementation of lesson plans: Collecting classroom observation data

  28. Please remain seated with your seatbelts fastened… • Timely turnaround of data summaries • Team meetings to debrief/interpret findings • Testing what you think you “know”: • Productive (& challenging) conversations • Data-driven decision making • Taking action • Following up (ongoing formative evaluation feedback)

  29. Elizabeth’s piece here

  30. Complimentary Refreshments: CRC Materials

  31. Complimentary Refreshments: CRC Materials

  32. Percentage Distribution of Planned Coaching Activities Logged in Year 1 (N = 4,233 entries logged)

  Activity                              Frequency    Percentage
  Coach’s administrative tasks               1358          32.2
  Conferencing with teachers                  716          17.0
  Observation                                 698          16.5
  School administrative tasks                 339           8.0
  Collaborative teacher support               330           7.8
  Coach’s professional development            303           7.2
  Assisting teachers in class                 138           3.3
  Striving Readers evaluation tasks           138           3.3
  Helping teachers prepare                     71           1.7
  Modeling                                     59           1.4
  Videotaping/other                            73           1.7
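The percentage column of the coaching-activity table can be recomputed directly from the frequencies. Note that the frequency column sums to 4,223 (versus the stated N = 4,233), and the reported percentages reproduce exactly against that sum:

```python
# Frequencies from the Year 1 coaching log table.
frequencies = {
    "Coach's administrative tasks": 1358,
    "Conferencing with teachers": 716,
    "Observation": 698,
    "School administrative tasks": 339,
    "Collaborative teacher support": 330,
    "Coach's professional development": 303,
    "Assisting teachers in class": 138,
    "Striving Readers evaluation tasks": 138,
    "Helping teachers prepare": 71,
    "Modeling": 59,
    "Videotaping/other": 73,
}
total = sum(frequencies.values())  # 4,223
percentages = {k: round(100 * v / total, 1) for k, v in frequencies.items()}
```

Recomputing a published distribution this way is a cheap consistency check: here it confirms every percentage while flagging the small discrepancy in the stated total.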

  33. Ground Transportation: The Coaching Role Trust between coach and teacher(s) is critical: • To provision of CAP implementation support • Pre-conference meeting • CAP observation • Co-teaching; modeling • Videotapes for use to train teachers, coaches, evaluators • Post-observation conference • To effective and strategic selection of CRC & supplemental resources

  34. Avoiding Wind Shear… The team’s unwavering commitment to helping teachers support the success of struggling adolescent readers: the sum > the individual parts

  35. …and we have the data to prove it!

  36. Across grade levels, the picture is the same…

  37. 8th Graders’ Reading Levels

  38. School-wide comparisons with schools nation-wide
