
Presentation Transcript


Value-Added Systems Presentation to the ISBE Performance Evaluation Advisory Council

Dr. Robert H. Meyer

Research Professor and Director

Value-Added Research Center

University of Wisconsin-Madison

February 25, 2011


Attainment and Gain

  • Attainment – a “point in time” measure of student proficiency

    • compares the measured proficiency rate with a predefined proficiency goal.

  • Gain – measures the average gain in student scores from one year to the next (both measures are sketched below)
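A minimal sketch of the two measures, using a hypothetical cohort of five students tested in consecutive years and an assumed proficiency cutoff:

```python
# Hypothetical scale scores for the same five students in two consecutive years.
scores_last_year = [455, 470, 448, 462, 481]   # e.g., grade 4
scores_this_year = [468, 479, 455, 477, 490]   # e.g., grade 5
proficiency_cutoff = 463                       # assumed, predefined proficiency goal

# Attainment: a point-in-time measure, the share of students at or above the cutoff.
attainment_rate = sum(s >= proficiency_cutoff for s in scores_this_year) / len(scores_this_year)

# Gain: the average change in scale scores from one year to the next.
average_gain = sum(post - pre for pre, post in zip(scores_last_year, scores_this_year)) / len(scores_this_year)

print(f"Attainment rate: {attainment_rate:.0%}")   # 80%
print(f"Average gain: {average_gain:.1f} points")  # 10.6
```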


Attainment versus Gain

[Chart: attainment measured at grades 3 through 8, with the gain shown between each pair of consecutive grades]


Growth: Starting Point Matters

Reading results of a cohort of students at two schools

[Table: scale score averages for the cohort at each school; proficient cutoffs are 438 in grade 4 and 463 in grade 5. *Scale score average is below proficient. Example assumes beginning-of-year testing.]


Value-Added

  • A kind of growth model that measures the contribution of schooling to student performance on standardized tests

  • Uses statistical techniques to separate the impact of schooling from other factors that may influence growth

  • Focuses on how much students improve on the tests from one year to the next as measured in scale score points


Value-Added Model Definition

  • A value-added model (VAM) is a quasi-experimental statistical model that yields estimates of the contribution of schools, classrooms, teachers, or other educational units to student achievement, controlling for non-school sources of student achievement growth, including prior student achievement and student and family characteristics.

  • A VAM produces estimates of productivity under the counterfactual assumption that all schools serve the same group of students. This facilitates apples-to-apples school comparisons rather than apples-to-oranges comparisons.

  • The objective is to facilitate valid and fair comparisons of productivity with respect to student outcomes, given that schools may serve very different student populations.


A More Transparent (and Useful) Definition of VA

  • Value-added productivity is the difference between actual student achievement and predicted student achievement.

  • Or, value-added productivity is the difference between actual student achievement and the average achievement of a comparable group of students, where comparability is defined by characteristics such as prior achievement, poverty, and ELL status (sketched below).
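A minimal sketch of this second definition, assuming a hypothetical student-level file with columns school, prior_band, poverty, ell, and posttest (the file name and column names are illustrative, not VARC's):

```python
import pandas as pd

# Hypothetical student-level data; column names are illustrative.
df = pd.read_csv("student_scores.csv")

# Predicted achievement: the average posttest score of all students who share
# the same prior-achievement band, poverty status, and ELL status.
df["predicted"] = (
    df.groupby(["prior_band", "poverty", "ell"])["posttest"].transform("mean")
)

# Value-added for each student is actual minus predicted achievement;
# a school's value-added is the average of its students' differences.
df["value_added"] = df["posttest"] - df["predicted"]
school_value_added = df.groupby("school")["value_added"].mean()

print(school_value_added.sort_values(ascending=False))
```

A positive value means a school's students scored higher, on average, than comparable students elsewhere; a negative value means they scored lower.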


In English

Posttest = (Post-on-Pre Link × Pretest) + Student Characteristics + School Effects (Value-Added) + Unobserved Factors
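Read as a regression, the line above can be estimated directly. A minimal sketch using ordinary least squares (the file name and the columns pretest, posttest, poverty, ell, and school are hypothetical; VARC's production models add the refinements listed on the Model Features slide, such as measurement-error and dosage adjustments):

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical student-level data; poverty and ell are assumed 0/1 indicators.
df = pd.read_csv("student_scores.csv")

# Posttest = link * pretest + student characteristics + school effects + error.
# "C(school) - 1" fits one coefficient per school instead of a global intercept.
model = smf.ols("posttest ~ pretest + poverty + ell + C(school) - 1", data=df).fit()

# The school coefficients are the value-added estimates; centering them
# expresses each school relative to the average school.
school_effects = model.params.filter(like="C(school)")
value_added = school_effects - school_effects.mean()

print(value_added.sort_values(ascending=False))
```

Each centered coefficient is a school's estimated contribution in scale score points relative to the average school.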


VARC Philosophy

  • Development and implementation of a value-added system should be structured as a continuous improvement process that allows for full participation of stakeholders

  • Model Co-Build; Complete customization

    • Analysis

    • Reporting

  • Value-added is one tool in a toolbox with multiple indicators


VARC Value-Added Partners

  • Design of Wisconsin State Value-Added System (1989)

  • Minneapolis (1992)

  • Milwaukee (1996)

  • Madison (2008)

  • Wisconsin Value-Added System (2009)

  • Milwaukee Area Public and Private Schools (2009)

  • Racine (2009)

  • Chicago (2006)

  • Department of Education: Teacher Incentive Fund (TIF) (2006 and 2010)

  • New York City (2009)

  • Minnesota, North Dakota & South Dakota: Teacher Education Institutions and Districts (2009)

  • Illinois (2010)

  • Hillsborough County, FL (2010)

  • Broward County, FL (2010)

  • Atlanta (2010)

  • Los Angeles (2010)

  • Tulsa (2010)


Districts and States Working with VARC

[Map: Minneapolis, Milwaukee, Madison, Racine, Chicago, New York City, Los Angeles, Tulsa, Atlanta, Hillsborough County, Broward County]


Measuring knowledge

  • Many factors influence what a student learns and how their knowledge is measured

  • A variety of measures, including (but not limited to) assessments, tell us what a student knows at a point in time.

  • What are some ways we measure knowledge?


Measuring knowledge

Examples: end-of-course exams, diagnostic tests, MAP, WKCE, daily journals, unit projects, after-school activities, hands-on projects


The Simple Logic of Value-Added Analysis

  • School Value-Added Report

    • School specific data

    • Grade level value-added

  • Comparison Value-Added Reports

    • Compare a school to other schools in the district, CESA, or state

    • Also allows for grade level comparisons

  • Tabular Data available for School Report and Comparison Reports


Attainment and Value-Added


How complex should a value-added model be?

  • Rule: "Simpler is better, unless it is wrong."

  • Implies the need for "quality of indicator / quality of model" diagnostics.


Model Features

  • Demographics

  • Posttest on pretest link

  • Measurement error

  • Student mobility: dose model (see the sketch after this list)

  • Classroom vs. teacher: unit vs. agent

  • Differential effects

  • Selection bias mitigation: longitudinal data

  • Test property analysis
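As an illustration of one feature above, a hedged sketch of the dose idea for student mobility (the names and numbers are invented; this is not VARC's implementation): a student who attends two schools in the same year counts toward each school's value-added in proportion to the share of the year spent there.

```python
# Fraction of the year ("dose") each hypothetical student spent at each school.
enrollment = {
    "student_1": {"School A": 1.0},
    "student_2": {"School A": 0.6, "School B": 0.4},
    "student_3": {"School B": 1.0},
}
# Hypothetical per-student value-added (actual minus predicted, in scale score points).
student_va = {"student_1": 4.0, "student_2": 2.5, "student_3": -1.0}

# Each school's estimate is a dose-weighted average of its students' value-added.
weighted_sum = {}
total_dose = {}
for student, doses in enrollment.items():
    for school, dose in doses.items():
        weighted_sum[school] = weighted_sum.get(school, 0.0) + dose * student_va[student]
        total_dose[school] = total_dose.get(school, 0.0) + dose

for school in sorted(weighted_sum):
    print(school, round(weighted_sum[school] / total_dose[school], 2))
```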


MAP vs. ISAT

  • MAP dates: September, January, May

  • MAP: uses Rasch equating

    • ISAT: uses 3PL (three-parameter logistic) scaling

  • MAP: slightly higher reliability - ~0.96 in math, ~0.94 in reading

    • ISAT math ~0.93, reading ~0.9

  • Cut scores on MAP are determined by equipercentile equating to ISAT (sketched below)
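A minimal sketch of the equipercentile idea in the last bullet (the score distributions and cut score below are simulated placeholders, not actual ISAT or MAP values): the MAP cut is the MAP score that sits at the same percentile rank the ISAT cut occupies in the ISAT distribution.

```python
import numpy as np

# Simulated placeholder distributions; real equating would use actual ISAT and MAP scores.
rng = np.random.default_rng(0)
isat_scores = rng.normal(220, 15, 5000)  # pretend ISAT scale scores
map_scores = rng.normal(205, 12, 5000)   # pretend MAP RIT scores

isat_proficient_cut = 230  # placeholder ISAT proficiency cut score

# Percentile rank of the ISAT cut within the ISAT distribution...
percentile = (isat_scores < isat_proficient_cut).mean() * 100

# ...and the MAP score at that same percentile becomes the MAP cut.
map_proficient_cut = np.percentile(map_scores, percentile)

print(f"ISAT cut {isat_proficient_cut} sits at the {percentile:.1f}th percentile; "
      f"equivalent MAP cut is about {map_proficient_cut:.0f}")
```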


Minimal correlation between initial status and value-added


Grade-Level Statewide Results


Grade-Level Statewide Results


Grade-Level Statewide Results


MPS and MMSD Value-Added compared to Wisconsin

6th to 7th Grade (Nov 2006 – Nov 2007) Mathematics – State VA Model School Effects

[Chart: MPS school effects and MMSD school effects; school/district VA productivity parameters in WKCE scale score units (relative to the state)]


Visit the VARC Website

http://varc.wceruw.org/

for more information about VARC and value-added

