
Measuring Learning:


Presentation Transcript


  1. Measuring Learning: Where does Value Added Data fit in this process?

  2–6. How do we measure learning in Cleveland?
  • Formative data sources:
  • Student work.
  • Short-cycle assessments, which focus on a narrow range of skills.
  • Benchmark tests, which are aligned to the State's high-stakes tests and measure a larger skill set.
  • These are all used to actively shape the instructional process: a feedback loop.

  7–11. Using formative assessment to shape instruction:
  [Diagram slides: instruction progresses toward the Standard; testing checks whether we are "On Course!" and is used to "target" instructional practices to facilitate reaching learning goals.]

  12–16. 3rd Grade Benchmark Reading Test
  • A bridge between formative and summative assessment.
  • Developed by CMSD staff!
  • Given three times per year – provides feedback for instruction.
  • Tied directly to targeted, standards-based instructional materials.
  • Accurately identifies 92% of students who performed below proficient on the OAT (r = .82); a sketch of this computation follows.
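The 92% figure and the correlation are straightforward computations once benchmark and OAT scores are paired. Below is a minimal Python sketch on invented data; the cut scores, scales, and score generation are all assumptions, so the printed numbers will not reproduce the district's figures. The point is only the computation: a Pearson r, and sensitivity within the below-proficient group.

```python
import numpy as np

# Invented paired scores: district benchmark vs. state OAT.
# Scales and cut scores are assumptions, not CMSD's actual data.
rng = np.random.default_rng(1)
oat = rng.normal(400, 30, 500)                       # OAT scale scores
benchmark = 50 + 0.41 * (oat - 400) + rng.normal(0, 8.6, 500)

r = np.corrcoef(benchmark, oat)[0, 1]                # Pearson correlation

oat_cut, bench_cut = 400, 50                         # assumed thresholds
below = oat < oat_cut                                # below proficient on OAT
flagged = benchmark < bench_cut                      # flagged by the benchmark
sensitivity = (below & flagged).sum() / below.sum()
print(f"r = {r:.2f}, below-proficient students identified = {sensitivity:.0%}")
```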

  17–22. How do we measure learning in Cleveland?
  • Summative measures (accountability):
  • Proficiency Testing (being phased out).
  • Achievement Testing (fully operational in the 2007–2008 school year).
  • Graduation Testing.
  • One can argue that these, too, can be used to actively shape instructional processes. However…

  23–24. What will "Value Added" analyses add to these?
  • A measure of change relative to students' prior levels of achievement.
  • Traditional criterion-referenced summative assessment measures whether one has reached a threshold level of learning, irrespective of one's starting point.

  25–26. What will "Value Added" analyses add to these?
  [Diagram slides: "Finish Line = Criterion!" Who has more learning "ground" to cover in order to cross the criterion finish line?]

  27–29. What will "Value Added" analyses add to these?
  • Value Added analyses take into account where different learners "start" by looking at past performance on accountability testing.
  • Past testing data are used in a complex statistical modeling method (mixed models, a.k.a. multilevel modeling or hierarchical linear modeling) to develop predicted scores for each student based on like-scoring students; a sketch follows this slide.
  • Demographic variables are not used in the model.
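A minimal sketch of this kind of prediction, using statsmodels on invented data. This is a generic random-intercept mixed model, not the proprietary Sanders/EVAAS implementation; the variable names, scales, and effect sizes are assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Invented data: two prior scores per student, students nested in schools.
rng = np.random.default_rng(0)
n, n_schools = 600, 20
schools = [f"s{i}" for i in range(n_schools)]
school_eff = dict(zip(schools, rng.normal(0, 6, n_schools)))

df = pd.DataFrame({
    "school": rng.choice(schools, n),
    "prior1": rng.normal(400, 25, n),
    "prior2": rng.normal(395, 25, n),
})
df["current"] = (40 + 0.55 * df["prior1"] + 0.35 * df["prior2"]
                 + df["school"].map(school_eff) + rng.normal(0, 10, n))

# Mixed (multilevel) model: prior scores as fixed effects,
# a random intercept for each school. No demographic variables.
fit = smf.mixedlm("current ~ prior1 + prior2", df, groups=df["school"]).fit()

# predict() uses only the fixed effects: the score expected for a
# student given past performance alone, i.e. "like-scoring students".
df["predicted"] = fit.predict(df)
```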

  30–32. What will "Value Added" analyses add to these?
  [Diagram slides: "= Value Added Predicted Scores!" alongside the criterion finish line.]
  • This student is still below proficient on the accountability measure, but he has advanced beyond his predicted level of performance. This gain beyond his predicted score would be attributed to his teacher and/or school: Value Added (illustrated below).
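The attribution step reduces to a residual: observed score minus predicted score, averaged over a teacher's students. A small hypothetical example (teachers, cut score, and all numbers invented):

```python
import pandas as pd

# Observed vs. model-predicted scores for six hypothetical students.
df = pd.DataFrame({
    "teacher":   ["A", "A", "A", "B", "B", "B"],
    "predicted": [380, 395, 410, 385, 400, 415],
    "observed":  [392, 401, 418, 383, 398, 414],
})
# Gain beyond prediction. With an assumed proficiency cut of 400, the
# first student is still below proficient (392) yet shows +12 value added.
df["residual_gain"] = df["observed"] - df["predicted"]
print(df.groupby("teacher")["residual_gain"].mean())
# Teacher A's students beat their predictions (+8.7 on average);
# Teacher B's land at about their predictions (-1.7).
```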

  33–34. What will "Value Added" analyses add to these?
  • As a measure of relative change, Value Added promises a means to credit teachers whose students show gains beyond what would be expected given their past performance; however…
  • Sanders's value-added method was developed in Tennessee with a significantly different testing system than Ohio's.

  35–41. Questions to be addressed when placing Value Added analyses in context:
  • Predicted scores are only as good as the information used to create them.
  • Value Added was developed in a system that used norm-referenced tests that were vertically scaled. Is the same true of Ohio's tests?
  • Will OPT data be used in the early years of Ohio's Value Added accountability system, given the repetition of OPT items?
  • Will the new achievement tests have a sufficient track record for this purpose in 2007–2008?
  • Is it reasonable to assume that individual differences can be accounted for by past test scores?
  • Do the tests have sufficient "stretch" at the ends to support inferences at extreme scores?
  • How does the model account for student mobility?

  42–44. More questions to be addressed when placing Value Added analyses in context:
  • All predicted scores have error. Value Added analyses take this into account when interpreting gains for students, teachers, and schools.
  • This is a two-edged sword. On one hand, it minimizes the effect of extreme scores. On the other, two teachers can move students, on average, the same "distance" and yet receive different Value Added "scores" because of the standard errors associated with their classrooms' predicted scores.
  • This area will likely require close contextualization of the data in terms of the absolute gains in the class (see the numerical sketch below).
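A hedged numerical sketch of that two-edged sword, with all gains invented: two classrooms are forced to the same mean gain beyond prediction, but only the one with the smaller standard error is credited as significantly above prediction.

```python
import numpy as np
from scipy import stats

# Two hypothetical classrooms, forced to the same mean gain beyond
# prediction (+5), but with very different spreads.
rng = np.random.default_rng(2)
gains_a = rng.normal(0, 4, 25);  gains_a += 5 - gains_a.mean()   # tight
gains_b = rng.normal(0, 18, 25); gains_b += 5 - gains_b.mean()   # noisy

for name, g in (("A", gains_a), ("B", gains_b)):
    se = g.std(ddof=1) / np.sqrt(len(g))           # standard error of the mean
    t = g.mean() / se                              # test mean gain against zero
    p = 2 * stats.t.sf(abs(t), len(g) - 1)
    verdict = ("credited: above prediction" if p < 0.05
               else "not distinguishable from prediction")
    print(f"Teacher {name}: mean gain +{g.mean():.1f}, SE {se:.1f} -> {verdict}")
```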

  45–48. Conclusions
  • As a measure of relative change, Value Added has been promoted as a means to measure the effect of individual teachers and schools on students' rate of learning.
  • It would be of particular value as a means of formative assessment; however, it has been legislated as part of Ohio's accountability system and will take effect as such in 2007–2008.
  • CMSD will therefore incorporate Value Added information into its assessment of school and District function, but these data will be interpreted in the context of other measures of learning, and
  • CMSD will actively work to replicate and validate the results of the Value Added analysis.
