Measuring learning

Measuring Learning:

Where does Value Added Data fit in this Process?


How do we measure learning in Cleveland?

  • Formative data sources:

    • Student work.

    • Short-cycle assessments, which focus on a narrow range of skills.

    • Benchmark tests, which have been aligned to the state's high-stakes tests and measure a larger skill set.

  • These are all used to actively shape the instructional process: a feedback loop.




Using formative assessment to shape instruction:

[Diagram: progress toward the Standard.] Test to see if we are “On Course!”


Using formative assessment to shape instruction:

[Diagram: progress toward the Standard.] Use testing to “target” instructional practices to facilitate reaching learning goals.



3rd Grade Benchmark Reading Test

  • A bridge between formative and summative assessment.

    • Developed by CMSD staff!

    • Given 3 times per year – provides feedback for instruction.

    • Tied directly to targeted, standards-based instructional materials.

    • Accurately identifies 92% of students who performed below proficient on the OAT (r=.82).
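Figures like these come from comparing benchmark scores against later OAT results. The sketch below shows roughly how such a validation might be computed; all scores, sample sizes, and cut points are invented for illustration and are not CMSD's actual data or cut scores.

```python
# Hypothetical check of a benchmark test against a state test.
# All data and cut scores below are simulated, not real CMSD/OAT values.
import numpy as np

rng = np.random.default_rng(0)

# Simulated scaled scores for 500 students on the two tests.
benchmark = rng.normal(400, 25, 500)
oat = 0.8 * (benchmark - 400) + 400 + rng.normal(0, 15, 500)

OAT_PROFICIENT = 400   # assumed proficiency cut on the state test
BENCHMARK_FLAG = 400   # assumed "at risk" cut on the benchmark

# Correlation between the two tests.
r = np.corrcoef(benchmark, oat)[0, 1]

# Share of below-proficient students the benchmark correctly flags.
below = oat < OAT_PROFICIENT
flagged = benchmark < BENCHMARK_FLAG
sensitivity = (below & flagged).sum() / below.sum()

print(f"r = {r:.2f}; identified {sensitivity:.0%} of below-proficient students")
```

A real validation would of course use actual matched student records rather than simulated scores.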


How do we measure learning in Cleveland?

  • Summative Measures (Accountability):

    • Proficiency Testing (being phased out).

    • Achievement Testing (fully operational in the 2007-2008 school year).

    • Graduation Testing

  • One can argue that these can be used to actively shape instructional processes.

  • However…


What will “Value Added” analyses add to these?

  • A measure of change that is relative to the students’ prior levels of achievement.

  • Traditional criterion-referenced summative assessment measures whether one has reached a threshold level of learning, irrespective of one’s starting point.



What will “Value Added” analyses add to these?

Who has more learning “ground” to cover in order to cross the criterion finish line?


What will “Value Added” analyses add to these?

  • Value Added analyses take into account where different learners “start” by looking at past performance on accountability testing.

  • Past testing data are used in a complex statistical modeling method (a mixed model, also known as multilevel or hierarchical linear modeling) to develop predicted scores for a student based on similarly scoring students.

  • Demographic variables are not used in the model.
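The prediction step can be caricatured with a far simpler model than the actual mixed/multilevel approach. The sketch below substitutes ordinary least squares on a single prior score, with invented data, purely to illustrate the core idea: “value added” is the residual between a student’s actual score and the score predicted from past performance.

```python
# Minimal sketch of the value-added idea. NOT the actual mixed-model
# method: a single-predictor least-squares fit stands in for multilevel
# modeling, and all scores are simulated.
import numpy as np

rng = np.random.default_rng(1)

prior = rng.normal(420, 30, 200)                    # last year's scores
actual = 0.9 * prior + 45 + rng.normal(0, 12, 200)  # this year's scores

# Fit actual ~ prior by least squares to get predicted scores.
slope, intercept = np.polyfit(prior, actual, 1)
predicted = slope * prior + intercept

# Gain beyond prediction, per student: the "value added" residual.
value_added = actual - predicted

# A student can be below a proficiency cut yet still show positive value
# added: behind the "finish line" but ahead of where past scores predict.
cut = 440   # hypothetical proficiency cut
below_but_gaining = (actual < cut) & (value_added > 0)
print(f"{below_but_gaining.sum()} students below proficient yet above prediction")
```

The real analysis pools information across many prior tests and students, but the interpretation of the residual is the same.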


What will “Value Added” analyses add to these?

[Diagram: students’ Value Added predicted scores plotted against the criterion “finish line.”]


What will “Value Added” analyses add to these?

This student is still below proficient on the accountability measure. . .

But, he has advanced beyond his predicted level of performance – this gain beyond his predicted score would be attributed to his teacher and/or school = Value Added.


What will “Value Added” analyses add to these?

  • As a measure of relative change, Value Added promises a means to credit teachers whose students show gains beyond what would be expected given their past performance; however. . .

  • Sanders’ value-added method was developed in Tennessee, with a significantly different testing system than Ohio’s.


Questions to be addressed when placing Value Added analyses in context:

  • As we all know, predicted scores are only as good as the information used to create them. . .

    • Value Added was developed in a system that used norm-referenced tests that were vertically scaled. Is Ohio’s system comparable?

    • Will OPT data be used in the early years of the implementation of the Value Added accountability system in Ohio? Given the repetition of OPT items. . .

    • Will the new achievement tests have a sufficient track record for this purpose in 2007-2008?

    • Is it reasonable to assume that individual differences can be accounted for by past test scores?

    • Do the tests have sufficient “stretch” at the ends to support inferences at extreme scores?

    • How does the model account for student mobility?


More Questions to be addressed when placing Value Added analyses in context:

  • All predicted scores have error. . .

    • Value added analyses take this into account when interpreting gains for students, teachers, and schools.

    • This is a double-edged sword. On one hand, it minimizes the effect of any extreme scores; on the other, it means that two teachers can move students, on average, the same “distance” and receive different Value Added “scores” due to the standard errors associated with the predicted scores of their classrooms.

    • This area will likely require close contextualization of the data in terms of the absolute gains in the class.
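The two-teachers scenario can be made concrete with a toy calculation. The numbers and the simple z-ratio below are illustrative only; the actual analysis derives its standard errors from the statistical model rather than this shortcut.

```python
# Illustrative only (invented numbers): two classrooms with the SAME mean
# gain over prediction can land on opposite sides of a significance cutoff
# because the standard errors of their predicted scores differ.

def gain_z(mean_gain, se):
    """z-statistic for a classroom's mean gain given its standard error."""
    return mean_gain / se

mean_gain = 5.0   # both classes moved students 5 points past prediction
z_a = gain_z(mean_gain, se=2.0)   # class A: tight predictions, small SE
z_b = gain_z(mean_gain, se=4.0)   # class B: noisier predictions, larger SE

# With a conventional cutoff of z > 1.96, only class A is credited.
print(z_a > 1.96, z_b > 1.96)   # True False
```

Same absolute movement, different verdicts — which is why the slide argues the data need close contextualization against absolute gains.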


Conclusions:

  • As a measure of relative change, Value Added has been promoted as a means to measure the effect of individual teachers and schools on students’ rate of learning.

  • It would be of particular value as a means of formative assessment; however, it has been legislated as part of Ohio’s accountability system and will be in effect as such in 2007-2008.

  • Therefore, CMSD will be incorporating Value Added information in our assessment of school and District function, but. . . these data will be interpreted in the context of other measures of learning, and

  • CMSD will actively work to replicate/validate the results of the Value Added Analysis.

