
Student Growth in the Non-Tested Subjects and Grades: Options for Teacher Evaluators


Presentation Transcript


  1. Student Growth in the Non-Tested Subjects and Grades: Options for Teacher Evaluators Elena Diaz-Bilello, Center for Assessment Michael Cohen, Denver Public Schools Ruth Chung Wei, Stanford University Scott Marion, Center for Assessment Stuart Kahl, Measured Progress NCSA New Orleans June 25, 2014

  2. Alternative Assessment Strategies for Evaluating Teaching Effectiveness (AASETE) Stuart Kahl Measured Progress, Inc. 2014 NCSA, New Orleans

  3. The AASETE objective was to design a research-based system for using performance assessments, along with other instruments, to measure student academic growth, which, in turn, could be used with other measures of teaching effectiveness for purposes of teacher evaluation in non-tested subjects and grades.

  4. The Research
  • >12,000 students in three states
  • Approx. 250 teachers
  • >600 classrooms
  • 18 subject/grade-level/state combinations
  • Pre- and post-testing
  • Prior achievement data (test scores and grades)
  • Comparison of growth models

  5. Major Findings
  • Moderate to high correlations among results for different growth models applied to data from the same (or equated) tests
  • Quite variable correlations among indicators based on the same model but different end-of-year tests (state vs. AASETE), or on the same model applied to different test components (multiple-choice vs. performance)

  6. Findings (continued)
  • Only slightly higher correlations among indicators with sophisticated scaling of student scores as opposed to raw (or linearly transformed) student scores
  • Moderate to high correlations among indicators based on simple growth or simple prediction models and those based on more sophisticated models
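
  The correlations reported in these findings are computed across teachers: each growth model yields one indicator per teacher, and the indicators produced by two models are then correlated. The sketch below is illustrative only; the teacher IDs, indicator values, and model labels are invented and do not come from the study.

    # Illustrative only: correlating teacher-level growth indicators from two models.
    # The teacher IDs and indicator values below are invented; in practice these
    # would come from, e.g., a simple gain model and a prediction/regression model
    # applied to the same assessments.
    import pandas as pd

    gain_model = pd.Series({"T1": 0.8, "T2": -0.3, "T3": 1.1, "T4": 0.2})
    prediction_model = pd.Series({"T1": 0.6, "T2": -0.1, "T3": 0.9, "T4": 0.4})

    # Pearson correlation across teachers; values near 1.0 mean the two models
    # order and score teachers similarly.
    print(gain_model.corr(prediction_model))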

  7. Features of AASETE-Recommended Approach
  • Common end-of-course (or interim) assessments across teachers, schools, and districts; multiple assessment components, including performance
  • Less sophisticated analyses easily run with commonly used software packages such as Excel

  8. Features (continued)
  • A simple prediction model: subtract students’ predicted scores on a common end-of-course measure from their actual scores on that measure, and aggregate (average) the differences at the teacher level
  • Human judgment in deciding whether the student growth for a particular teacher is adequate, given the unique characteristics of the teacher’s students, other contextual factors of the teacher’s situation, and previous growth indicators for the teacher
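
  Appendix B of the guide walks through this regression-based analysis in Excel; purely as an illustration of the same idea, here is a minimal sketch in Python. The column names, the single prior-score predictor, and the sample numbers are assumptions for this example, not the guide's specification.

    # A minimal sketch of the simple prediction/regression approach described above.
    # Column names (teacher_id, prior_score, eoc_score) are hypothetical; any
    # district extract with prior achievement and end-of-course scores would do.
    import numpy as np
    import pandas as pd

    def teacher_growth_indicators(df: pd.DataFrame) -> pd.Series:
        """Average actual-minus-predicted end-of-course score per teacher."""
        # Fit one ordinary least-squares line across all students:
        # predicted_eoc = a * prior_score + b
        a, b = np.polyfit(df["prior_score"], df["eoc_score"], deg=1)
        residual = df["eoc_score"] - (a * df["prior_score"] + b)
        # Aggregate (average) the prediction errors at the teacher level.
        return residual.groupby(df["teacher_id"]).mean()

    # Example with made-up numbers, for illustration only.
    data = pd.DataFrame({
        "teacher_id": ["T1", "T1", "T2", "T2", "T3", "T3"],
        "prior_score": [40, 55, 48, 62, 35, 70],
        "eoc_score":   [50, 61, 60, 74, 38, 72],
    })
    print(teacher_growth_indicators(data))

  With multiple predictors (e.g., prior test scores and grades), the same residual-averaging step applies after fitting a multiple regression instead of a single line.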

  9. A System for Using Student Academic Growth in the Evaluation of Teaching Effectiveness in the Non-Tested Subjects and Grades: A Guide for Education Policy Makers and Evaluators of Teachers. Measured Progress, Inc., May 2014. This document was prepared by Measured Progress, Inc. with funding from the Bill and Melinda Gates Foundation, Grant No. OPP1029363. The content of the publication does not necessarily reflect the views of the Foundation.

  10. Table of Contents
  • Preface … 3
  • Acknowledgments … 5

  11. Contents (continued)
  • Student Academic Growth and Teacher Evaluations … 6
  • The Problem in the Non-Tested Subjects and Grades … 6
  • What Are the Options for the Non-Tested Subjects and Grades? … 7
  • Test-Based Value-Added/Growth Indicators … 7
  • Student Learning Objectives (SLOs) … 8
  • Comparisons of Approaches … 9
  • Human Judgment and Multiple Measures … 12
  • Interpreting Normative Data … 12
  • Weighing Multiple Measures … 13

  12. Contents (continued)
  • The Recommended “Simple Regression” Approach … 15
  • Common Assessments … 15
  • Why Not Simple Pre-Post Growth? … 16
  • Simple Prediction/Regression … 16
  • Numbers of Students and Teachers … 18
  • Outcome or End-of-Course Measures … 19
  • More on Performance Components … 21

  13. Contents (continued)
  • Predictor Variables … 22
  • Predictors in General … 22
  • Predictors to Use … 23
  • Associated Analyses and Checks … 23
  • Some Final Words on the Generation, Interpretation, and Use of Value-Added/Growth Statistics … 25
  • More on the Proposed Method … 25
  • How Much Work Is It? … 26

  14. Contents (continued)
  • Appendix A: Overview and Recommendations of AASETE Study … 28
  • Appendix B: Instructions for Performing Regression-Based Growth Analysis at the Teacher Level Using Excel … 31

  15. How Much Work Is It?
  • End-of-course testing already happening
  • Data management systems in place with data
  • Capability to access and use data
  • One new Excel analysis to learn
  • The challenges:
    • Policy
    • Human judgment
    • District assessment program

  16. Thank you!
