Value-Added: Some Clarification

Presentation Transcript


  1. Value-Added: Some Clarification Presented by: Keston H. Fulcher, Ph.D. Christopher Newport University Virginia Assessment Group 3/2/2007 Note: Thanks to Dr. John T. Willse (UNC-G) for help with this presentation.

  2. Overview • What is Value-Added? • Context: Historical and Current Affairs • Approaches to Assessing Value-Added • Misconceptions of Value-Added • Reservations

  3. Value-Added: What Is It? • Value-Added is an analytical strategy to determine the degree to which students change from the beginning to the end of a program. Astin (1985) referred to this type of change as talent development.

  4. Historical Context • 80s – Greater call for accountability in higher education regarding student learning; value-added was a considerable part of the discussion. • A few states embraced value-added (e.g., Missouri and Tennessee). • 90s–early 00s – De-emphasis on value-added; more emphasis on minimum competency. • Mid 00s – Value-added once again at the center of general education discussions.

  5. Current Context (National) • Catalyst? • Spellings Commission • Recommendations (U.S. Department of Education, 2006): • “Higher education institutions should measure student learning using quality assessment data.” • “The results of student learning assessments, including value-added measurements that indicate how much students’ skills have improved over time, should be made available to students and reported in the aggregate publicly.”

  6. Current Context (National) • Spellings Commission is considering logistical issues. • NASULGC and AASCU are developing recommendations for a “Voluntary Accountability System.” • The Collegiate Learning Assessment (CLA) is recommended by many groups to assess student gains in higher-order learning.

  7. Current Context (Virginia) • Core Competency Assessment – State Council advocating value-added assessment of student learning. • Assessment of Student Learning Task Force charged by the State Council to work out logistics. Initially, all members were assessment experts. • The task force is now composed of presidents, provosts, and three assessment experts.

  8. Whoa! • Several issues here; value-added is just one of those issues. • Let’s examine just value-added.

  9. Approaches to Value-Added • Cross-Sectional • Pre-Post • Residual Analysis

  10. Cross-Sectional • Cross-Sectional: Compare scores from (say) a sample of seniors against a sample of freshmen on the same test. • Note: Weakest of the three designs; it does not control for pre-existing differences between the samples.
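
To make the comparison concrete, here is a minimal Python sketch of the cross-sectional approach (the scores below are hypothetical, and NumPy/SciPy are assumed available):

```python
import numpy as np
from scipy import stats

# Hypothetical scores for independent samples of freshmen and seniors
freshmen = np.array([52, 61, 48, 57, 66, 59, 50, 63])
seniors = np.array([64, 70, 58, 72, 69, 75, 61, 68])

# The cross-sectional value-added estimate is the gap between group means;
# an independent-samples t-test gauges whether it exceeds sampling error.
gain = seniors.mean() - freshmen.mean()
t, p = stats.ttest_ind(seniors, freshmen)
print(f"Estimated gain: {gain:.1f} points (t = {t:.2f}, p = {p:.3f})")
# Caveat from the slide: nothing here controls for pre-existing
# differences between the two cohorts (selection, attrition, etc.).
```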

  11. Pre-Post • Pre-Post: The same set of students takes the same test (or equivalent tests) at two points in time (i.e., repeated measures). • Change scores represent value-added. • Note: Conceptually the most straightforward approach to value-added, but it takes considerable time to collect difference scores.
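
The same idea as a minimal Python sketch (hypothetical paired scores; both columns come from the same ten students):

```python
import numpy as np
from scipy import stats

# Hypothetical pre/post scores for the SAME ten students (repeated measures)
pre = np.array([55, 61, 49, 58, 64, 52, 60, 57, 63, 51])
post = np.array([63, 66, 55, 61, 72, 58, 65, 60, 70, 54])

# Each student's change score; the mean change is the value-added estimate
change = post - pre
t, p = stats.ttest_rel(post, pre)  # paired t-test for repeated measures
print(f"Mean change: {change.mean():.1f} points (t = {t:.2f}, p = {p:.3f})")
```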

  12. Residual Analysis • Residual Analysis: Determined by comparing the difference between actual scores and the scores predicted by some variable (or set of variables), usually SAT or ACT scores (the approach used by the CLA). • Note: Logistically easier to implement but conceptually a bit off-target; value-added from this approach is normative.
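
A minimal sketch of the residual approach (hypothetical data; this illustrates the general technique, not the CLA's actual model):

```python
import numpy as np

# Hypothetical data: entering SAT scores and senior-year test scores
sat = np.array([1050, 1190, 980, 1310, 1120, 1240, 1010, 1150])
actual = np.array([62, 71, 55, 80, 69, 70, 60, 66])

# Regress actual scores on SAT to get predicted scores, then take residuals
slope, intercept = np.polyfit(sat, actual, deg=1)
residuals = actual - (intercept + slope * sat)

# Positive residuals mean students scored above what SAT alone predicts.
# The estimate is normative: residuals are deviations from the fitted line
# (they sum to ~0), not an absolute measure of growth.
print(np.round(residuals, 1))
```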

  13. Misconceptions of Value-Added • Misconception: Measuring change is always unreliable • In the context of pre-post design, five conditions must hold for change scores to be unreliable (Zumbo, 1999): • the correlation between testing occasion one and testing occasion two is a large positive value • the observed variance of the two testing occasions is equal • the true score variance at both occasions is equal • the reliability at both occasions is equal, and • the correlation between true scores and true change scores is negative.
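
For reference (this classical formula is standard in the psychometric literature and is an addition here, not part of the slide), the reliability of a simple difference score D = X2 − X1 makes the role of these conditions visible:

```latex
\rho_{DD'} =
  \frac{\sigma_{1}^{2}\rho_{11} + \sigma_{2}^{2}\rho_{22}
        - 2\sigma_{1}\sigma_{2}\rho_{12}}
       {\sigma_{1}^{2} + \sigma_{2}^{2} - 2\sigma_{1}\sigma_{2}\rho_{12}}
% \sigma_i^2  : observed variance at occasion i
% \rho_{ii}   : reliability at occasion i
% \rho_{12}   : correlation between the two testing occasions
```

With equal variances and equal reliabilities (σ1 = σ2, ρ11 = ρ22 = ρ), this reduces to ρDD' = (ρ − ρ12)/(1 − ρ12): reliability collapses only as the between-occasion correlation ρ12 approaches the reliability ρ itself. When the five conditions above do not all hold, change scores can be respectably reliable.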

  14. Other Misconceptions • Value-Added means standardized assessment across institutions. • Value-Added is the CLA instrument. • In fact, any instrument could theoretically be used for value-added assessment.

  15. Reservations of Value-Added • Affected by attrition • Expensive (especially for pre-post) • Item memorization • Doesn’t answer all analytical questions

  16. Value-Added, Summary • Some limitations • Answers a great question • In terms of the national debate about standardized testing… • We should be clear: are we actually criticizing value-added, a particular instrument, or standardization across schools?

  17. References • Astin, A. W. (1985). Achieving educational excellence: A critical assessment of priorities and practices in higher education. San Francisco: Jossey-Bass. • U.S. Department of Education. (2006). A test of leadership: Charting the future of U.S. higher education. Washington, DC. Also available at http://www.ed.gov/about/bdscomm/list/hiedfuture/reports/pre-pub-report.pdf • Zumbo, B. D. (1999). The simple difference score as an inherently poor measure of change: Some reality, much mythology. In B. Thompson (Ed.), Advances in social science methodology, Vol. 5 (pp. 269–304). Greenwich, CT: JAI Press.

  18. Thank You!
