
A new scoring approach for ECERS-R


Presentation Transcript


  1. A new scoring approach for ECERS-R • Richard Clifford, PhD • John Sideris, PhD • Jennifer Neitzel, PhD • Beatriz Abuchaim, MSc • University of North Carolina • FPG Child Development Institute • Chapel Hill, April 2012

  2. Previous studies • Several studies have found that there is an underlying factor structure in the ECERS-R, beyond the subscale level (e.g., Clifford & Rossbach, 2005; Early et al., 2005; Sakai, Whitebook, Wishard, & Howes, 2003) • Most commonly, two factors have emerged: (1) Teaching and Interaction and (2) Provisions for Learning

  3. Previous studies (cont.) • Studies have shown that ECERS-R results are related to a variety of child outcomes, but the relationship is only modest (Aboud, 2006; Burchinal et al., 2000; McCartney, Scarr, Phillips, & Grajek, 1985; Phillips, McCartney, & Scarr, 1987; Burchinal et al., 2011) • Concern has been raised about factor scores and category disordering, which may help explain the very modest relations to child outcomes, particularly in higher quality classrooms (Gordon et al., in press) • Our hypothesis is that a new scoring system, using the indicator information, can improve the predictive power of the ECERS-R

  4. Goals • To develop a new scoring system, using the indicator level information • To test the predictive power of this new system

  5. Sample • 8,500 cases from 6 different studies in which all the indicators were scored • States: California, Iowa, Minnesota, Nebraska, North Carolina, Georgia, Illinois, Kentucky, New York, Ohio, Massachusetts, New Jersey, Texas, Washington and Wisconsin • Issue with a skewed distribution: few low-scoring programs

  6. Procedures • Step 1 – Hypothesize a new set of factors for the ECERS-R • Step 2 – Conduct factor analyses • Step 3 – Conduct confirmatory analyses • Step 4 – Test these new factors to check their predictive power
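
The slides do not show the implementation, but Steps 2 and 3 imply a split-half design: one random half of the cases is used for the exploratory factor analyses and the remaining half for confirmation (see slide 10). A minimal sketch of that split, assuming the indicator-level data sit in a pandas DataFrame with one 0/1 column per indicator (the names here are hypothetical):

```python
import numpy as np
import pandas as pd

def split_half(indicators: pd.DataFrame, seed: int = 0):
    """Randomly split cases into an exploratory half (for the factor
    analyses in Step 2) and a confirmatory half (for Step 3)."""
    rng = np.random.default_rng(seed)
    mask = rng.random(len(indicators)) < 0.5
    return indicators[mask], indicators[~mask]

# `indicators` would hold one row per observed classroom and one 0/1
# column per ECERS-R indicator (hypothetical column names).
# explore_half, confirm_half = split_half(indicators)
```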

  7. Hypothesized New Subscales or Factors • Use of time • Special Needs • Physical Environment • Individualization • Diversity • Access to Materials • Creativity • Grouping • Fine Motor • Gross Motor • Independence • Social/Emotional • Engagement • Routines • Teaching • Science/Math/Reasoning • Literacy/Language/Concepts • Health • Safety • Families • Staff • Supervision

  8. Characterizing each indicator
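
The slide itself presumably showed a table assigning each indicator to a hypothesized subscale; the mapping below is only a small illustrative stand-in. The Health and Supervision entries follow slide 16, while the entry for 6.1.2 is a placeholder guess:

```python
# Illustrative only: each indicator ID is tagged with the hypothesized
# factor it is expected to load on.
hypothesized_factor = {
    "6.1.2":  "Physical Environment",  # child-related display (placeholder)
    "10.1.3": "Health",                # sanitary conditions not maintained
    "11.1.2": "Health",                # nap/rest provisions unsanitary
    "14.3.2": "Supervision",           # supervision indoors and outdoors
    "31.1.2": "Supervision",           # lax discipline, little order/control
}

def indicators_for(factor: str) -> list:
    """Return every indicator ID hypothesized to belong to one factor."""
    return [ind for ind, fac in hypothesized_factor.items() if fac == factor]
```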

  9. Factor Analysis • All indicators for the Parents and Staff Subscale were dropped. • Multiple Factor Analyses were carried out to test the newly hypothesized factors. • Some models included multiple factors
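
A minimal sketch of one such exploratory factor analysis, assuming the factor_analyzer package and the exploratory half produced above; the authors' actual estimator and rotation are not named in the slides, so the promax rotation here is an assumption:

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer  # assumed dependency

def explore_factors(explore_half: pd.DataFrame, n_factors: int) -> pd.DataFrame:
    """Fit an exploratory factor model on the exploratory half and
    return the rotated loading matrix for inspection."""
    fa = FactorAnalyzer(n_factors=n_factors, rotation="promax")
    fa.fit(explore_half)
    return pd.DataFrame(
        fa.loadings_,
        index=explore_half.columns,
        columns=[f"factor_{k + 1}" for k in range(n_factors)],
    )

# Example: a two-factor solution for one hypothesized subscale's indicators.
# loading_matrix = explore_factors(explore_half, n_factors=2)
```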

  10. Confirmatory model • Models were confirmed on the remaining half of the sample • Example: Education factor, combining Teaching, Literacy, and Math/Science
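
One way to write down the Education example as a confirmatory model, shown here with the semopy package as an assumed tool (the slides do not name the software) and placeholder variable names standing in for the Teaching, Literacy, and Math/Science indicators:

```python
import pandas as pd
import semopy  # assumed dependency; the slides do not name the software

# Placeholder observed-variable names standing in for ECERS-R indicators.
EDUCATION_MODEL = (
    "Education =~ teaching_1 + teaching_2 + "
    "literacy_1 + literacy_2 + math_sci_1 + math_sci_2"
)

def confirm_education_factor(confirm_half: pd.DataFrame) -> pd.DataFrame:
    """Fit the one-factor Education CFA on the confirmatory half and
    return the parameter estimates (loadings and variances)."""
    model = semopy.Model(EDUCATION_MODEL)
    model.fit(confirm_half)
    return model.inspect()

# Fit statistics such as chi-square, RMSEA, and CFI could then be pulled
# from semopy.calc_stats(model), if that utility is available.
```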

  11. Model Fit • Across most models, fit was good • Chi-square tests were all significant, which is unsurprising given the sample size • RMSEA values were all .04 or less • CFI ranged between .80 and .97
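
For reference, the RMSEA and CFI figures quoted above follow from the model and baseline (independence-model) chi-square statistics under their standard definitions; a small helper:

```python
from math import sqrt

def rmsea(chi2: float, df: int, n_cases: int) -> float:
    """Root mean square error of approximation (Steiger's formula)."""
    return sqrt(max(chi2 - df, 0.0) / (df * (n_cases - 1)))

def cfi(chi2_model: float, df_model: int,
        chi2_baseline: float, df_baseline: int) -> float:
    """Comparative fit index: improvement over the independence model."""
    d_model = max(chi2_model - df_model, 0.0)
    d_base = max(chi2_baseline - df_baseline, d_model)
    return 1.0 - d_model / d_base if d_base > 0 else 1.0

# With roughly 8,500 cases, even trivial misfit produces a significant
# chi-square, which is why RMSEA and CFI carry more weight here.
```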

  12. Problematic Indicators • In all of these models, some indicators presented estimation problems and were eliminated: • Extremely low variance • Correlated at 1.0 with other indicators in the model • Empty cells in the 2 x 2 crosstabs of pairs of indicators
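
Each of the three problems listed above can be screened for mechanically. A sketch with pandas, again assuming 0/1-scored indicator columns (the variance threshold is an assumption, not the authors' criterion):

```python
import pandas as pd

def flag_problem_indicators(indicators: pd.DataFrame,
                            min_variance: float = 0.001) -> set:
    """Flag indicators with (near-)zero variance, correlations of 1.0
    with another indicator, or empty cells in pairwise 2 x 2 crosstabs."""
    flagged = set()
    # 1. Extremely low variance (e.g., 99.84% of cases pass the indicator).
    flagged |= set(indicators.columns[indicators.var() < min_variance])
    # A quadratic scan over pairs is acceptable for a few hundred indicators.
    corr = indicators.corr().abs()
    cols = list(indicators.columns)
    for i, a in enumerate(cols):
        for b in cols[i + 1:]:
            # 2. Pairs of indicators that correlate at 1.0.
            if corr.loc[a, b] > 0.999:
                flagged |= {a, b}
            # 3. Empty cells in the 2 x 2 crosstab of a 0/1 pair.
            table = pd.crosstab(indicators[a], indicators[b])
            if table.shape != (2, 2) or (table.values == 0).any():
                flagged |= {a, b}
    return flagged
```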

  13. Problematic Indicators, Example • Indicator 6.1.2, Child-related display: inappropriate materials for the predominant age group • 99.84% of our sample passed this indicator • The lack of variance may be due to the non-random sample

  14. Problematic Models • Special Needs – the one-factor solution required eliminating the majority of indicators • Use of Time and Routines – the two-factor solution was not replicated in the second half of the sample

  15. Exploratory Model for Health • A set of health and safety indicators was selected • It was less certain that these indicators would represent a single factor • The first analysis indicated three factors, but the third factor included only about 8 of the 40 indicators, all of which cross-loaded on the first two factors
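
Cross-loadings of the kind described above can be read off the rotated loading matrix returned by the earlier EFA sketch; the .30 cutoff below is a common convention used only for illustration, not the authors' criterion:

```python
import pandas as pd

def cross_loaded(loadings: pd.DataFrame, threshold: float = 0.30) -> list:
    """Return indicators whose rotated loadings exceed the threshold on
    more than one factor (candidates for dropping or reassignment)."""
    hits = (loadings.abs() >= threshold).sum(axis=1)
    return list(loadings.index[hits > 1])

# e.g. the ~8 indicators on the third health/safety factor that also
# loaded on the first two factors would show up here.
```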

  16. Exploratory Model for Health • Factor One – General Health & Safety • 10.1.3 Sanitary conditions not usually maintained • 11.1.2 Nap/Rest provisions unsanitary • Factor Two – Supervision to Promote Health and Safety • 31.1.2 Discipline is so lax that there is little order or control • 14.3.2 Adequate supervision to protect children’s safety indoors and outdoors

  17. Expected Ordering

  18. Actual Ordering

  19. Inter-Factor Correlations

  20. Inter-Factor Correlations

  21. Correlations with Traditional Scored ECERS-R

  22. Conflict of Interest Disclosure • Richard Clifford has a financial conflict of interest as a result of receiving royalty and consulting payments in connection with use of the ECERS-R. His work on this effort is conducted under IRB approval from the University of North Carolina at Chapel Hill which includes a management plan for dealing with the conflict of interest noted here.
