Measuring Learning:

Where does Value Added Data fit in this Process?
How do we measure learning in Cleveland?
  • Formative data sources:
    • Student work.
    • Short-cycle assessments, which focus on a narrow range of skills.
    • Benchmark tests, which have been aligned to the state high-stakes tests and measure a larger skill set.
  • These are all used to actively shape the instructional process: a feedback loop.
Using formative assessment to shape instruction:

[Diagram: “Standard”; test to see if we are “On Course!”]
Using formative assessment to shape instruction:

[Diagram: “Standard”; use testing to “target” instructional practices to facilitate reaching learning goals.]
3rd Grade Benchmark Reading Test
  • A bridge between formative and summative assessment.
    • Developed by CMSD staff!
    • Given 3 times per year; provides feedback for instruction.
    • Tied directly to targeted, standards-based instructional materials.
    • Accurately identifies 92% of students who performed below proficient on the OAT (r = .82).
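The two statistics on this slide, the correlation with the OAT and the percentage of below-proficient students identified, can be computed from paired scores. The sketch below uses entirely hypothetical numbers (simulated students, an invented cut score of 400), not CMSD data, just to show how the two figures relate:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical paired scores for 200 students: a mid-year benchmark
# and the spring OAT, both noisy measures of the same underlying skill.
true_ability = rng.normal(400, 25, size=200)
benchmark = true_ability + rng.normal(0, 12, size=200)
oat = true_ability + rng.normal(0, 12, size=200)

# Pearson correlation between the two measures.
r = np.corrcoef(benchmark, oat)[0, 1]

# Sensitivity: of students who scored below proficient on the OAT,
# what fraction had already been flagged by the benchmark?
PROFICIENT = 400  # hypothetical cut score
below_oat = oat < PROFICIENT
flagged = benchmark < PROFICIENT
sensitivity = (below_oat & flagged).sum() / below_oat.sum()

print(f"r = {r:.2f}, sensitivity = {sensitivity:.0%}")
```

Note that the two numbers are linked: the higher the correlation between benchmark and OAT, the larger the share of eventually-below-proficient students the benchmark can catch ahead of time.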
How do we measure learning in Cleveland?
  • Summative Measures (Accountability):
    • Proficiency Testing (being phased out).
    • Achievement Testing (fully operational in the 2007-2008 school year).
    • Graduation Testing.
  • One can argue that these can be used to actively shape instructional processes.
  • However…
What will “Value Added” analyses add to these?
  • A measure of change that is relative to the students’ prior levels of achievement.
  • Traditional criterion-referenced summative assessment measures whether or not a student has reached a threshold level of learning, irrespective of that student’s starting point.
What will “Value Added” analyses add to these?

Who has more learning “ground” to cover in order to cross the criterion finish line?

What will “Value Added” analyses add to these?
  • Value Added analyses take into account where different learners “start” by looking at past performance on accountability testing.
  • Past testing data are used in a complex statistical modeling method (mixed-model, a.k.a. multilevel modeling or hierarchical linear modeling) to develop predicted scores for a student based on like-scoring students.
  • Demographic variables are not used in the model.
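The core logic of the prediction step can be sketched in a few lines. This is a deliberate simplification: it uses ordinary least squares on prior scores where the operational value-added system uses a multivariate mixed model, and all students, teachers, and scores are simulated. But the mechanics are the same: predict each student's current score from prior scores alone (no demographics), then aggregate gain-beyond-prediction by teacher:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical cohort: 3 prior-year test scores per student plus a
# current-year score; students are split between two classrooms.
n = 120
prior = rng.normal(400, 25, size=(n, 3))
teacher = np.repeat([0, 1], n // 2)
teacher_effect = np.array([0.0, 5.0])  # classroom 1 adds 5 points (simulated)
current = prior.mean(axis=1) + teacher_effect[teacher] + rng.normal(0, 10, n)

# Predicted score for each student from prior scores alone.
X = np.column_stack([np.ones(n), prior])
beta, *_ = np.linalg.lstsq(X, current, rcond=None)
predicted = X @ beta

# "Value added" = average gain beyond prediction, aggregated by teacher.
residual = current - predicted
for t in (0, 1):
    print(f"teacher {t}: mean gain beyond prediction = "
          f"{residual[teacher == t].mean():+.1f}")
```

Because the simulated classroom 1 effect is not in the prediction model, it shows up in the residuals: classroom 1's students land, on average, above their predicted scores, and that surplus is what gets attributed to the teacher or school.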
What will “Value Added” analyses add to these?

[Diagram labels: “= Value Added Predicted Scores!”; “Finish Line = Criterion!”]
What will “Value Added” analyses add to these?

This student is still below proficient on the accountability measure…

But he has advanced beyond his predicted level of performance; this gain beyond his predicted score would be attributed to his teacher and/or school = Value Added.
What will “Value Added” analyses add to these?
  • As a measure of relative change, Value Added promises a means to credit teachers whose students show gains beyond what would be expected given their past performance; however…
  • Sanders’ value added method was developed in Tennessee under a significantly different testing system than Ohio’s.
Questions to be addressed when placing Value Added analyses in context:
  • As we all know, predicted scores are only as good as the information used to create them…
    • Value Added was developed in a system that used norm-referenced tests that were vertically scaled. Is the same true of Ohio’s tests?
    • Will OPT data be used in the early years of the implementation of the Value Added accountability system in Ohio? Given the repetition of OPT items…
    • Will the new achievement tests have a sufficient track record for this purpose in 2007-2008?
    • Is it reasonable to assume that individual differences can be accounted for by past test scores?
    • Do the tests have sufficient “stretch” at the ends to support inferences at extreme scores?
    • How (if at all) does the model account for student mobility?
More questions to be addressed when placing Value Added analyses in context:
  • All predicted scores have error…
    • Value Added analyses take this into account when interpreting gains for students, teachers, and schools.
    • This is a double-edged sword. On one hand, it minimizes the effect of any extreme scores. On the other, it means that two teachers can move students, on average, the same “distance” and receive different Value Added “scores” due to the standard errors associated with the predicted scores of their classrooms.
    • This area will likely require close contextualization of the data in terms of the absolute gains in the class.
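The standard-error point on this slide can be made concrete with two made-up classrooms whose students gained, on average, exactly the same amount beyond prediction, but with very different spreads. The "2 standard errors" reporting rule below is an illustrative convention, not necessarily the rule Ohio's system uses:

```python
import numpy as np

# Two hypothetical classrooms with identical mean gains beyond
# prediction (4.0 points each) but very different spreads.
gains_a = np.array([4.0, 4.0, 5.0, 3.0, 4.0, 4.0, 5.0, 3.0])            # tight
gains_b = np.array([20.0, -12.0, 18.0, -10.0, 16.0, -8.0, 14.0, -6.0])  # noisy

for name, g in (("A", gains_a), ("B", gains_b)):
    mean = g.mean()
    se = g.std(ddof=1) / np.sqrt(len(g))
    # Illustrative rule: a gain counts as "above expected" only when it
    # exceeds twice its standard error.
    verdict = ("above expected" if mean > 2 * se
               else "not distinguishable from expected")
    print(f"classroom {name}: mean gain {mean:+.1f}, SE {se:.1f} -> {verdict}")
```

Both classrooms moved students the same average "distance" (+4.0), yet only classroom A clears the error bar; classroom B's noisy gains leave its result statistically indistinguishable from expected, which is exactly the two-teachers-same-distance scenario described above.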
Conclusions
  • As a measure of relative change, Value Added has been promoted as a means to measure the effect of individual teachers and schools on students’ rate of learning.
  • It would be of particular value as a means of formative assessment; however, it has been legislated as part of Ohio’s accountability system and will be in effect as such in 2007-2008.
  • Therefore, CMSD will incorporate Value Added information into our assessment of school and District function, but… these data will be interpreted in the context of other measures of learning, and
  • CMSD will actively work to replicate/validate the results of the Value Added analysis.