
Learning From the Past: Using the 2009 Learn and Serve cluster evaluation to develop the field’s understanding of service-learning implementation and outcomes. Paul Baumann, Ph.D., Director, National Center for Learning and Citizenship; Shelley Billig, Ph.D., Vice President, RMC Research Corporation.


Presentation Transcript


  1. Learning From the Past: Using the 2009 Learn and Serve cluster evaluation to develop the field’s understanding of service-learning implementation and outcomes Paul Baumann, Ph.D., Director, National Center for Learning and Citizenship; Shelley Billig, Ph.D., Vice President, RMC Research Corporation, Denver; Susan Abravanel, Vice President of Education, Youth Service America (YSA); Teri Dary, Co-Chair, National Coalition for Academic Service-Learning. 2012 IARSLCE Annual Conference, Baltimore, MD, September 25, 2012

  2. Introduction Session Goals • Provide an overview of three years of findings for three participants in the 2009 LSA cohort (NCLC, YSA, Wisconsin DPI) • Begin to examine how implementation strategy might play a role in program outcomes • Begin to develop cross-cutting findings.

  3. LSA 2009 Cohort The LSA 2009 cohort initially included a number of states and national programs: • States: AZ, HI, IA, IL, LA, MI, MN, OH, and WI • National Programs: NCLC, YSA (focus on STEM, middle schools, high-poverty schools) Purpose: Implement and evaluate service-learning in Pre-K–12 schools. Cohort members agreed to use a common quasi-experimental evaluation design and the same survey scales to measure a set of core common outcomes: • Academic engagement • Academic performance • Dropout prevention • Acquisition of 21st century skills. A coordinated evaluation would help to address several shortcomings of service-learning research (Billig, 2001; 2008; 2011; Bradley, 2005): • Lack of methodological rigor: Quasi-experimental or experimental design is necessary to establish causality and determine effect sizes • Lack of adequate sample: A large, diverse sample is necessary to establish more generalizable findings across states and various student populations.
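
Because the cohort agreed on common survey scales, a brief illustration of how such scales are typically scored may help readers outside the measurement field: each student's item responses are averaged (with negatively worded items reverse-coded) into a single scale score. The sketch below is hypothetical; the items, the reverse-coded position, and the 4-point maximum are assumptions, not the cohort's actual instruments.

# Hypothetical scoring of a Likert-type survey scale; items and values are invented.
def scale_score(responses, reverse_items=frozenset(), scale_max=4):
    """Mean of item responses, reverse-coding negatively worded items."""
    adjusted = [
        (scale_max + 1 - r) if i in reverse_items else r
        for i, r in enumerate(responses)
    ]
    return sum(adjusted) / len(adjusted)

# e.g., a five-item academic engagement scale with the third item negatively worded
print(scale_score([4, 3, 1, 4, 3], reverse_items={2}))  # -> 3.6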

  4. Research Questions • What was the impact of service-learning participation on students? • Academic engagement; educational aspirations; acquisition of 21st century skills; civic dispositions; interest and abilities in STEM; and interest in careers that require STEM skills • Academic achievement, dropout rates, attendance, and discipline referral rates • What program design characteristics influenced outcomes? • What is the relationship between each program’s theory of action and implementation strategy, and the student outcomes for that program? • What conclusions, if any, can we draw by looking across the evaluation findings of the three programs?

  5. Methodology • LSA sub-grantees from each state were required to participate if they received a certain level of funding. • Each recruited one matched-comparison site for every two program sites (e.g., one comparison middle school for every two middle schools served). • Matches were based on demographics (roughly the same SES and racial/ethnic percentages) and achievement scores (scores on the state assessments in ELA and math) coming into the school year.
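
As an illustration of the matching logic described above, the sketch below picks the candidate comparison school closest to a program school on SES, racial/ethnic composition, and prior achievement. The field names, the equal weighting, and the Euclidean distance are assumptions made for illustration; the evaluators' actual matching procedure may have differed.

# Hypothetical sketch of matched-comparison site selection; fields and weights are illustrative.
from dataclasses import dataclass

@dataclass
class School:
    name: str
    pct_frl: float        # percent free/reduced-price lunch (SES proxy)
    pct_minority: float   # percent racial/ethnic minority enrollment
    ela_score: float      # prior-year state ELA assessment mean
    math_score: float     # prior-year state math assessment mean

def distance(a, b):
    """Unweighted Euclidean distance across the matching variables.
    In practice the variables would typically be standardized first."""
    return ((a.pct_frl - b.pct_frl) ** 2 +
            (a.pct_minority - b.pct_minority) ** 2 +
            (a.ela_score - b.ela_score) ** 2 +
            (a.math_score - b.math_score) ** 2) ** 0.5

def best_match(program_site, candidates):
    """Pick the candidate comparison school closest to the program site."""
    return min(candidates, key=lambda c: distance(program_site, c))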

  6. Methodology • In year one, a retrospective pre/post student survey was used; in years two and three, a traditional pre/post student survey was used. • Teachers and community members answered a post-only survey. • Achievement data were collected from the sites or the state, including test scores in language arts, math, and science; attendance; disciplinary data; and graduation/dropout rates.
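
Given this pre/post design with matched comparison sites, the basic analytic contrast is a difference in gains: the program group's mean pre-to-post change minus the comparison group's. A minimal sketch, with invented scores and variable names:

# Illustrative difference-in-gains computation; the scores below are invented.
from statistics import mean

def mean_gain(pre, post):
    """Average pre-to-post change on a survey scale for one group of students."""
    return mean(po - pr for pr, po in zip(pre, post))

sl_pre,  sl_post  = [2.8, 3.1, 2.5], [3.2, 3.4, 2.9]   # service-learning students
cmp_pre, cmp_post = [2.9, 3.0, 2.6], [3.0, 3.0, 2.7]   # matched comparison students

effect = mean_gain(sl_pre, sl_post) - mean_gain(cmp_pre, cmp_post)
print(f"Difference in mean gains: {effect:.2f}")   # program gain minus comparison gain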

  7. Schools of Success: National Center for Learning and Citizenship Theory of Action: • Guide schools from policy to action, and action to policy • Encourage high-quality practice and sustainability through policy and district/building leadership • Encourage policy and district/building leadership through models of high-quality practice. Expectation for Change: • Changes to district/school policy (broadly conceived) • Districts/schools will begin new or grow existing service-learning programs • Increase middle school students’ engagement in STEM-related classes.

  8. Schools of Success: National Center for Learning and Citizenship Context: • Nine middle schools • CA, KY (3), MI, MS, NY (2), PA • Two charter schools (MI, PA) • Three KY schools, all in one district • Mixture of rural and urban • All high poverty (>50% FARMS) • Some high performing, some low performing • Several experienced with service-learning (CA, PA); the remainder beginning with service-learning.

  9. Schools of Success: National Center for Learning and Citizenship Implementation Strategy: • Role of leadership • Recognition of leadership at multiple levels of the school • Worked primarily with leaders from the district/building level, as well as key teacher leaders. • Professional development • Three days of training annually • Focused both on implementation (how to do it) and on policy (how to encourage, lead, fund, etc.). • Sustainability • Focus on five elements that support sustainability: • Vision and leadership • Curriculum and assessment • Community and school partnerships • Professional development • Continuous improvement.

  10. Schools of Success Findings Teacher Perceptions of Service-Learning Quality in Their Classrooms (N = 6)

  11. Student Ratings of Quality

  12. Teacher Perceptions of Impacts

  13. Student Perceptions of Skills Acquired Through Participation in Service-Learning (N = 319)

  14. Moderators of Outcomes *p < .05, **p < .01.
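
For readers unfamiliar with moderator analyses like the one this slide reports, one common approach (not necessarily the one used here) is to group service-learning students by their ratings of program quality and compare their gains with each other and with the comparison group. The sketch below uses invented group labels and gain scores purely for illustration.

# Hypothetical quality-moderator contrast; group labels and gain scores are invented.
from statistics import mean

gains = {
    "comparison":          [0.05, 0.00, 0.10, -0.05],
    "lower-quality S-L":   [0.10, 0.05, 0.15, 0.10],
    "higher-quality S-L":  [0.35, 0.40, 0.30, 0.45],
}

for group, g in gains.items():
    print(f"{group:>20}: mean gain = {mean(g):.2f}")

# In an actual analysis, group differences would be tested statistically
# (e.g., regression with interaction terms or ANOVA), not just compared descriptively.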

  15. YSA STEMester of Service Target: • Teachers and classes in middle schools • The 19 states most seriously challenged by high school graduation rates • ≥ 50% free/reduced-price lunch enrollment

  16. What is a STEMester of Service? YSA Support: • Training and resources • Unique partnership opportunities • Ongoing personalized consultation • Networking • Rigorous evaluation component

  17. What is a STEMester of Service? Content: • Focus on incorporating science, technology, engineering, or mathematics content learning • Service project targeting environmental or disaster preparedness issues • Career and workforce readiness component • A continuous Semester of Service™ strategy

  18. Semester of Service™ • Integrated service and learning activities • 12- to 14-week “semester” • National/international service days • Design based on the “IPARD/C” model • www.YSA.org/resources

  19. YSA Findings 4-point scale where 1 = Strongly Disagree, 2 = Disagree, 3 = Agree, and 4 = Strongly Agree. *p < .05, ***p < .001.

  20. YSA Findings Student Perceptions of Skills Acquired Through Participation in Service-Learning (N = 1,055)

  21. YSA Findings Moderators of Outcomes: Student Ratings of Program Quality ***p < .001.

  22. YSA Findings • You can see from the previous slides that high-quality service-learning is very effective, but regular service-learning has a significant yet weak effect. • Much of our field shows effect sizes of around .03, which is much too low. • How can we improve?
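
Since this slide's argument turns on effect sizes, here is a minimal sketch of how a standardized effect size (Cohen's d, one common metric behind figures like the .03 cited above) is computed. The sample scores below are invented, not drawn from the evaluation.

# Minimal Cohen's d computation; the sample statistics below are invented.
from statistics import mean, stdev

def cohens_d(treatment, comparison):
    """Standardized mean difference using a pooled standard deviation."""
    n1, n2 = len(treatment), len(comparison)
    s1, s2 = stdev(treatment), stdev(comparison)
    pooled = (((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2)) ** 0.5
    return (mean(treatment) - mean(comparison)) / pooled

sl_scores  = [3.2, 3.4, 2.9, 3.6, 3.1]   # hypothetical post-survey scale means
cmp_scores = [3.1, 3.3, 2.9, 3.5, 3.0]
print(f"d = {cohens_d(sl_scores, cmp_scores):.2f}")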

  26. Wisconsin Department of Public Instruction Theory of Action: • Create an infrastructure of support • Build capacity for quality and sustainability through district infrastructure and leadership • Support quality practice through a cascade model of professional development. Expectation for Change: • Districts will infuse service-learning as a strategy to achieve district goals • Districts will build an infrastructure of support and leadership to guide practice in local schools.

  27. Wisconsin Department of Public Instruction Context: • Up to 10 districts funded per year for a maximum of 6 years • Grants of up to $20,000 per year • Fundable expenses must primarily support infrastructure, leadership, curriculum integration, and professional development. • Geographic and demographic diversity • Collaborative network of support.

  28. Wisconsin Department of Public Instruction Implementation Strategy • Leadership • Build capacity for leadership at teacher, school, and district levels • Increase capacity for teachers to serve as leaders within their school/district and at the state level • Professional development • Initial training by DPI during the first semester of funding • Ongoing professional development through in-person and online environments • Sustainability • All grants must demonstrate progress on five elements that support sustainability: • Vision and leadership • Curriculum and assessment • Community and school partnerships • Professional development • Continuous improvement

  29. Wisconsin Findings Will add this week.

  30. Changes Over Time in Service-Learning and Comparison Students’ Ratings of Community Attachment (N = 1,174)

  31. Percentage of Service-Learning Students Reporting Skill Gains, Grades 6-12 (N = 823)

  32. Questions For Discussion • What program design characteristics influenced outcomes? • What is the relationship between each program’s theory of action and implementation strategy, and the student outcomes for that program? • What conclusions, if any, can we draw by looking across the evaluation findings of the three programs? • What other questions do you have for us, or for discussion here or in the future?

  33. Thank You! • To continue this conversation, please use the conference proceedings. • Please contact us if you have additional comments or questions. • Paul Baumann: pbaumann@ecs.org • Shelley Billig: billig@rmcdenver.com • Susan Abravanel: sabravanel@ysa.org • Teri Dary: teri.dary@dpi.state.wi.us
