
Evaluation of Michigan Child Care Expulsion Prevention Program (CCEP), 2007-2010

Evaluation of Michigan Child Care Expulsion Prevention Program (CCEP), 2007-2010. Michigan State University, October 27, 2010. Rosalind H. Kirk, John S. Carlson, Laurie A. Van Egeren, Holly Brophy-Herb, Stacy L. Bender, Betty Tableman (Michigan State University); Mary A. Mackrain (Michigan Child Care Enhancement Program); Deb Marciniak (Michigan Public Health Institute)


Presentation Transcript


  1. Evaluation of Michigan Child Care Expulsion Prevention Program (CCEP), 2007-2010 Michigan State University October 27, 2010

  2. Rosalind H. Kirk a, John S. Carlson a, Laurie A. Van Egeren a, Holly Brophy-Herb a, Stacy L. Bender a, Betty Tableman a, Mary A. Mackrain b, Deb Marciniak c, Sheri Falvay d. Affiliations: a Michigan State University; b Michigan Child Care Enhancement Program; c Michigan Public Health Institute; d Michigan Department of Community Health

  3. Agenda • CCEP’s research questions (child, provider, program, family, CCEP process & fidelity) • Evaluation approach • Evaluation strategies • Strategies – strengths and challenges • Use of CCEP evaluation results

  4. Child Care Expulsion Prevention Program (CCEP), Michigan • Began in the late 1990s • Initiated by MDCH, supported with funding from MDHS • Plans for state-wide coverage • At the time of the evaluation, 16 programs covering 31 of 83 counties • Approx. 500-600 children per year • Programmatic consultation also provided • After T1 data collection ended in 2009, the focus of CCEP changed to ages 0-3 • Along with many other Michigan programs, funding ended on September 30, 2010

  5. Research questions Child Outcomes (John) • Does the severity of children’s challenging behavior decrease from the onset of CCEP services to the conclusion of services? • Does children’s social and emotional health increase from the onset of CCEP services to the conclusion of services? • Does the impact of services on children’s behavior last past services? • Do children receiving CCEP services successfully stay in child care vs. being expelled?

  6. Research questions Parent outcomes (Holly) 5. Do parents’ subjective feelings of competence in dealing with their child’s challenging behavior increase as a result of CCEP services? 6. Are families able to consistently attend work or school?

  7. Research questions Child Care Provider outcomes (Laurie) 7. Is the child care provider better able to recognize early warning signs of social and emotional challenges in infants, toddlers, and preschoolers? 8. Is the child care provider better able to manage challenging behavior in the child care setting, with all children?

  8. Research questions Child Care Program outcome (Ros) 9. Has the social and emotional quality of the child care setting receiving CCEP services improved?

  9. Research questions Program Fidelity (Laurie) 10. What is the fidelity of the child and family consultation process among CCEP programs? 11. What is the fidelity of the programmatic consultation process among CCEP programs?

  10. Evaluation approach • Collaborative and consultative • Built upon existing systems • Mixed method – mainly quantitative, some qualitative

  11. Four overall strategies • Cross-sectional (formative): Consultant survey • Longitudinal study (mainly summative): Pre-post data + 6-month follow-up from the intervention group using measures of child, parent, and provider outcomes • Quasi-experimental comparison study (summative): Comparison group with pre-post data matching the longitudinal intervention group • Case studies (formative): Perceptions of experiences with CCEP based on interviews

  12. 1. Cross-sectional strategy: strengths On-line survey of consultants on participation in CCEP and delivery of services, including compliance with the six CCEP cornerstones • ‘Snap-shot’ of program and processes based on perceptions of consultants and administrators • Electronic surveys are accessible, flexible, user-friendly, and can be quick to analyze • Very collaborative with CCEP in design, data collection, and interpretation • Provided a wealth of information for program improvement • Collaboration provided an opportunity to share expertise and help develop CCEP internal monitoring systems

  13. Cross-sectional strategy: potential challenges • Potential factors affecting response rate: organizational change, personal views about evaluation, stress levels, vacations, sickness, staff turnover, workload, length of survey, etc. • Anonymity can mean that survey data are more likely to be accurate, but non-respondents cannot be targeted to increase the response rate.

  14. Cross-sectional strategy: survey of consultants, 2008 (N = 29)

  15. Cross-sectional strategy: survey summaries/research briefs • 1. Informing Providers About CCEP Services • 2. Child and Family Consultation Processes • 3. Programmatic Consultation Processes • 4. Reflective Supervision • 5. Group Training and Individual Coaching of Providers and Parents • 6. Consultants: Experience, Job Satisfaction, and Organizational Support • 7. The Most Important Things Consultants Do • 8. Collaboration with Michigan Child Care Coordinating Council, MSU Extension, and the Great Start Collaborative • 9. State-Level Training and Technical Assistance. Available at http://outreach.msu.edu/cerc/research/ccep.aspx

  16. Cross-sectional strategy: other survey results • Preventing Children’s Expulsion from Childcare: Variations in Consultation Processes in a Statewide Program. Poster and survey summaries/research briefs presented at the SRCD conference (2009). View at: http://outreach.msu.edu/cerc/

  17. 2. Longitudinal strategy - strengths • Able to assess child, parent, provider, and program outcomes pre (T1) and post (T2), and whether these were sustained over 6 months (T3). • Collaborative at state and local levels, e.g., consultation on the selection, organization, and use of measures; attendance at monthly meetings; electronic Q & A; personal contacts between consultants and the MSU team, especially with new staff; collaborative troubleshooting at the state level. • Built on existing systems, so it incorporated measures already used by consultants, e.g., the DECA

  18. Longitudinal study sample size Sample sizes included in analyses varied depending on the quality of the data collected.

  19. Children & Families intervention sample (N=361)

  20. 3. Quasi-experimental strategy • Includes collection of matching data from a sample of children exhibiting challenging behaviors but resident in counties where CCEP was unavailable; a matched sample was created (N=86). • Enables comparison with the CCEP intervention group beyond maturation changes • Ongoing challenges (resources: time, staff, organization, incentives) in recruiting and retaining a comparison group resident in counties without CCEP • Limitations: missing data, multiple raters, reliance on self-report measures and interviews. How representative was the intervention group who participated in the evaluation? Were comparison families enough like the CCEP group even with matching? What other services, if any, were comparison families receiving in their own counties? Did counties with CCEP differ from counties without?

  21. Does consultation make a difference to parents? • Awaiting final results (on child, parent, provider, and program outcomes, and perceptions of effectiveness and relationships). With qualifications, trends prior to the final analyses have indicated that: • Parental competence increased and stress decreased more among parents who used consultation services. • There were high levels of satisfaction with the consultation process, its perceived effectiveness, and its acceptability among both parents and providers.

  22. Interim results (N=129) - Change in child outcomes after early childhood mental health consultation (see link to poster) Before taking dosage of CCEP into account, raw parent & provider data showed: • Both CCEP and comparison children showed significant improvements in behavior problems and positive behaviors over the study period. • For parent report in the CCEP group, attention problems and functional communication continued to improve 6 months after consultation; most others remained level. Are higher doses of consultation linked to greater improvement in children’s challenging and positive behaviors compared to lower doses? • After taking satisfaction with CCEP into account, more hours of consultation with providers (but not parents) predicted increases in provider reports of some positive behaviors. • At 6-month follow-up, more hours of provider consultation were linked to continued improvements in parent-reported attention problems. • Gains made in behavioral concerns and functional communication were not sustained. Do children with challenging behavior who receive consultation show more behavior improvement compared to children with challenging behavior who do not receive consultation? • While children in the intervention (N=129) and comparison (N=59) groups both improved over time, probably due to maturation, the CCEP group showed greater improvements in behavior than the comparison group in almost all areas.

  23. 4. Case studies • Sample: 9 children, 2 programs, 3 consultants • Method: In-person or phone interviews with parent, provider(s), and consultant • Analyses: Coded and content thematically organized around process and outcomes

  24. Case studies: strengths • Combines quantitative and qualitative methods • Illustrates the variation and unique relevance for individual children • Adds depth to the understanding of the processes that underpin consultation • Highlights the importance of context and relationships for intervention

  25. Case studies: challenges • Balancing case study importance with a primarily outcome-focused evaluation • Self-selection bias in the sample • Combining meaningfully with quantitative data: using quotes in the body of the report (outcomes), a thematic table about process, and ‘stories’ about children with standardized scores compared to the mean

  26. Program’s use of preliminary evaluation results • Accountability. Was the money being spent as agreed? Was it being spent wisely? • Planning (program and community). Where to focus limited resources? Was more needed? Helped others understand the consultants’ role and perspective and the contribution they can make to community planning. Grant preparations. • Quality improvement. How could CCEP build on its strengths? What could CCEP have done better? Ready access to evaluator expertise offered more support, e.g., internal monitoring systems. • Advocacy & dissemination. Telling others about CCEP successes and challenges: politicians, potential funders, and academics; contributing to the ECMH knowledge base.

  27. Closing comments from Daniel’s mom “I think it’s (CCEP) an awesome program, I really do. There are a lot of daycares out there that if they come across just the littlest behavior, and the child becomes difficult to take care of, they just give up and say ‘okay, well we can’t have him in the daycare’. So someone like Julie (consultant) that could come out and talk to the caregivers and explain different ways of doing things, I mean, I think that’s awesome because then you know, the kid can stay in the daycare and the mother can continue working. I mean, I think it’s a really good program.”

  28. Further information Principal Investigators: • John Carlson, PhD, NCSP, Associate Professor, College of Education; carlsoj@msu.edu • Holly E. Brophy-Herb, PhD, Associate Professor, Human Dev. & Family Studies; hbrophy@msu.edu • Laurie A. Van Egeren, PhD, Director, Community Evaluation and Research Center (CERC), University Outreach and Engagement; vanegere@msu.edu

  29. Useful links • MSU CCEP evaluation results (briefs and posters referred to here): http://outreach.msu.edu/cerc/research/ccep.aspx • Technical Assistance Center for Social Emotional Intervention: http://www.challengingbehavior.org/ • University of Wisconsin – Extension: http://www.uwex.edu/ces/pdande/evaluation/index.html • NSF Online Evaluation Resource Library: http://www.oerl.sri.com • Trochim, W.M. The Research Methods Knowledge Base, 2nd Edition: http://www.socialresearchmethods.net/kb/
