
Value Added Measures: Implications for Policy and Practice


Presentation Transcript


  1. Value Added Measures: Implications for Policy and Practice Friday, May 23, 2008

  2. Building Toward a Science of Performance Improvement: A Framework for Systematic Naturalistic Inquiry Urban Institute Value Added Conference May 23, 2008 Anthony S. Bryk Stanford University

  3. I. A Methodological Perspective • The information needs for continuously improving practice

  4. Are RCTs really the “gold standard” for guiding continuous improvement? • What information does this RCT actually provide? • Two marginal distributions, Y_T and Y_C: the distributions of outcomes under the treatment and control conditions. • It provides answers to questions that can be addressed in terms of observed differences in these two marginal distributions.
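
Not part of the original slide: a compact restatement in standard potential-outcomes notation. Randomization identifies each marginal distribution, and therefore contrasts of their functionals, but not the joint distribution of the two potential outcomes.

```latex
% Randomization identifies the two marginal distributions
F_{Y_T}(y) = \Pr(Y_T \le y), \qquad F_{Y_C}(y) = \Pr(Y_C \le y),
% and hence any contrast of their functionals, for example
\mathbb{E}[Y_T] - \mathbb{E}[Y_C] \quad \text{or} \quad F_{Y_T}^{-1}(0.5) - F_{Y_C}^{-1}(0.5),
% but it does not identify the joint distribution of (Y_T, Y_C).
```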

  5. Evidentiary Limits of the Treatment-Control Group Paradigm • Suppose we define a treatment effect for individual i as α_i. • We can estimate the mean treatment effect, μ_α. • But, interestingly, we cannot estimate the median effect or any percentile points in the α_i distribution.
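
A quick illustrative simulation of this limit (not from the talk; the distributions and sample size are invented). Two data-generating worlds share identical treatment and control marginals, so an RCT returns exactly the same evidence about each, yet the median of the individual effects α_i = Y_Ti − Y_Ci differs between them.

```python
# Two worlds with identical marginals (Y_C ~ Exp(mean 1), Y_T ~ Exp(mean 2))
# but different couplings of the potential outcomes.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
y_c = -np.log(1 - rng.uniform(size=n))          # control potential outcomes

# World A: potential outcomes perfectly rank-correlated (comonotone coupling).
y_t_A = 2.0 * y_c
alpha_A = y_t_A - y_c

# World B: potential outcomes independent, same Y_T marginal.
y_t_B = -2.0 * np.log(1 - rng.uniform(size=n))
alpha_B = y_t_B - y_c

# The mean effect is the same (and estimable from the marginals alone) ...
print("mean effect, world A:", round(float(alpha_A.mean()), 3))
print("mean effect, world B:", round(float(alpha_B.mean()), 3))
# ... but the median of the individual effects differs, and no RCT can tell the worlds apart.
print("median effect, world A:", round(float(np.median(alpha_A)), 3))   # ~0.69
print("median effect, world B:", round(float(np.median(alpha_B)), 3))   # ~0.58
```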

  6. Evidentiary Limits (continued) • Nor can we assess how these effects might be changing over time, or how they depend on individual and context characteristics. • To accomplish the latter, we need to know the treatment effect distribution jointly with multivariate data on individual and program/context characteristics.

  7. Evidentiary Limits (continued) • Of course, we can add a limited number of factors into the design and estimate these interaction effects. • So we can do something on a limited scale within the T/C paradigm. • But we need to know the factors in advance (if it is to be RCT evidence). • And they have to be few in number.
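
Purely illustrative (not from the talk): the kind of pre-specified treatment-by-covariate interaction that can be estimated within the T/C paradigm, here with an invented pretest covariate and made-up effect sizes.

```python
# Estimating one pre-specified treatment-by-covariate interaction in a simulated RCT.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2000
pretest = rng.normal(0.0, 1.0, n)
treat = rng.integers(0, 2, n)                          # randomized assignment
# True effect varies with the covariate: 0.3 + 0.2 * pretest.
y = 0.5 * pretest + treat * (0.3 + 0.2 * pretest) + rng.normal(0.0, 1.0, n)

df = pd.DataFrame({"y": y, "treat": treat, "pretest": pretest})
fit = smf.ols("y ~ treat * pretest", data=df).fit()
print(fit.params[["treat", "treat:pretest"]])          # main effect and interaction
```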

  8. Other Concerns • Generalizability of results from volunteer samples to compulsory applications of findings (voluntary association as a potential contributor to α_i). • Schools and districts typically make a single decision (the “what is right for us?” question), which again drives us back to a desire to learn about this multivariate effects distribution.

  9. Conclusions • We need a different methodology for learning about programs and the multiple factors that may affect their outcomes. • It needs to be dynamic in design. • An accumulating-evidence strategy built from multiple efforts at systematic inquiry over time. • A basic system orientation: “a set of elements standing in strong interaction” (an organized complexity). • Gathering empirical evidence about such phenomena should be the organizing goal for inquiry.

  10. Conclusions • Randomized studies are ingenious but also limited in terms of the types of information they return to us. • This is especially significant in the context of informing continuous improvement and moving toward a “science of performance improvement.” • See Atul Gawande, Better. • Value-added strategies have much to commend them in this context of use.

  11. II. New Directions: Conceptualizing and Estimating a Multivariate Distribution of Effects among Teachers and Schools: Developed through an example from a current study of the efficacy of Literacy Collaborative Professional Development on teacher practice and student learning

  12. Example: Conceptualizing Teachers’ Take-Up and Use of Literacy Professional Development [conceptual model diagram] • Individual teacher background: willingness to engage innovation; experimentation with new practices in the classroom; expertise; prior experiences in comprehensive literacy teaching (ZPD). • LC intervention: amount, quality, and content of PD. • Pathway: classroom literacy practice → impact on student learning. • School-wide support for teacher learning (formal school structure and informal organization): work relations among teachers; influence of informal leaders; professional norms; principal leadership; coach quality/role relationship; resource allocations (time); school size. • Key implication: we should expect highly variable effects!

  13. A General Methodological Approach: A Value-Added Model to Examine these Effects Within the Context of an Accelerated Multi-Cohort Design

  14. The Logic of a Value-Added Model for Assessing Impact on Student Learning [figure: observed growth data Y_tijk plotted against time t = 0, 1, 2, 3, 4, with latent individual initial status π_{0i}, latent individual growth rate π_{1i}, and value added v_{tjk} at each time t] • Basic value-added model: ŷ_{0ijk} = π_{0i}; ŷ_{1ijk} = π_{0i} + π_{1i} + v_{1jk}; ŷ_{2ijk} = π_{0i} + 2π_{1i} + v_{1jk} + v_{2jk}; ŷ_{3ijk} = π_{0i} + 3π_{1i} + v_{1jk} + v_{2jk} + v_{3jk}; ŷ_{4ijk} = π_{0i} + 4π_{1i} + v_{1jk} + v_{2jk} + v_{3jk} + v_{4jk}. • Gain from year t − 1 to t = π_{1i} + v_{tjk}. • Note: v_{jk} may vary over time as well.
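
Not from the study: a minimal numpy sketch of this growth logic under invented parameters (schools omitted for brevity, teachers randomly reassigned each year). It generates scores following ŷ_{tijk} = π_{0i} + t·π_{1i} + Σ_{s≤t} v_{sjk} plus noise, then recovers each teacher-by-year value added as the mean observed gain minus the overall average growth rate, the “observed minus expected” idea formalized on the next slide.

```python
# Simulate student growth with teacher-by-year value added, then recover it crudely.
import numpy as np

rng = np.random.default_rng(0)
n_students, n_years, n_teachers = 500, 4, 20

pi0 = rng.normal(0.0, 1.0, n_students)                 # latent initial status, pi_0i
pi1 = rng.normal(1.0, 0.25, n_students)                # latent growth rate, pi_1i
v = rng.normal(0.0, 0.3, (n_teachers, n_years + 1))    # teacher-by-year value added
teacher = rng.integers(0, n_teachers, (n_students, n_years + 1))  # year-t teacher per student

# Observed scores: initial status + cumulative growth + value added experienced so far + noise.
y = np.zeros((n_students, n_years + 1))
for t in range(n_years + 1):
    y[:, t] = pi0 + t * pi1 + rng.normal(0.0, 0.2, n_students)
    for s in range(1, t + 1):
        y[:, t] += v[teacher[:, s], s]

gains = np.diff(y, axis=1)        # year-to-year gains: pi_1i + v_t + noise
avg_growth = gains.mean()         # crude estimate of the average growth rate

# "Observed minus expected": mean gain of each teacher's students minus the average growth.
va_hat = np.zeros((n_teachers, n_years))
for t in range(1, n_years + 1):
    for j in range(n_teachers):
        va_hat[j, t - 1] = gains[teacher[:, t] == j, t - 1].mean() - avg_growth

print("estimated average growth per year:", round(float(avg_growth), 2))
print("corr(true, estimated value added):",
      round(float(np.corrcoef(v[:, 1:].ravel(), va_hat.ravel())[0, 1]), 2))
```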

  15. Definition of a Value-Added Effect • The difference between two “possible outcomes”: • the observed outcome given the teacher (and school) actually experienced in year t, • and the expected outcome given an “average” teacher experience, i.e., given the student’s latent academic growth rate. • We define: value added = the observed outcome given the actual teacher experienced in year t − the expected outcome given an “average” teacher experience.
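
One way to write this definition in the notation of slide 14 (my alignment of the terms, not a formula quoted from the deck; measurement error is ignored): since the gain from year t − 1 to t is π_{1i} + v_{tjk}, the value added of the teacher/school actually experienced in year t is

```latex
% observed year-t gain minus the gain expected from the student's latent growth rate alone
v_{tjk} = \big(Y_{tijk} - Y_{t-1,ijk}\big) - \pi_{1i}
```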

  16.–19. An Accelerated Multi-Cohort Design, Current Year (SY 2007-08) [figure repeated with progressive build-up across four slides: a grade-by-school-year chart of the design; no further text is recoverable]

  20. Estimating Overall Effects of Literacy Collaborative Implementation [figure: growth trajectories with legend black = baseline, orange = 1st year implementation, green = 2nd year implementation; effect labels v_1i and v_2i]

  21. Value-added effects • Average student learning growth is 1.06 per academic year. • 95% plausible value range: (0.57, 1.55). • Variability among teacher effects within schools.
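
A back-of-the-envelope reading of these numbers, under my assumption (not stated on the slide) that the 95% plausible value range is mean ± 1.96 standard deviations of the teacher-effect distribution:

```latex
\hat\tau \approx \frac{1.55 - 0.57}{2 \times 1.96} \approx 0.25
```

On that reading, a teacher one standard deviation above average corresponds to roughly 1.06 + 0.25 ≈ 1.31 years of learning growth per academic year, and one standard deviation below to roughly 0.81.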

  22. Variability in school value-added, years 1 & 2 [figure: average student gain per academic year plotted by School ID, with high value-added schools, a no-effect band, and low value-added schools distinguished]

  23. Variability in teacher value-added within schools, year 1 [figure: average student gain per academic year plotted by School ID, showing teacher-level value added within each school relative to high value-added, no-effect, and low value-added bands]

  24. Variability in teacher value-added within schools, year 2 [figure: same layout as the previous slide, for year 2]

  25. III. To Sum Up • The accelerated multi-cohort design is relatively easy to implement in school settings (a naturalistic data design). • This design, coupled with a value-added analysis paradigm, affords treatment-effect results not easily obtainable through the “gold standard”: • a multivariate distribution of effects, linked with potential sources of their variation, and dynamic over time.

  26. To Sum Up • More generally, an argument for an evolutionary, exploratory approach to accumulating evidence. • Data designs are now practical and analytic tools exist. • Imagine if we had such information now on the 750+ schools that have been involved with LC over the past 15 years. • A stronger empirical base for a design-engineering-development orientation to the improvement of schooling.

  27. To Sum Up • Main internal validity weakness: concerns about historicity, i.e., other things co-occurring as plausible causal agents. • Main strength (external validity): a focus on replication over time and place. • Main gain: a capacity to learn from the natural variation that occurs in practice.
