Rob Smith: CEM INSET Provider. Working with Colleagues, Parents and Students.
Course: Using CEM Data in Practice, Day 2 Session 3, Thursday 28th February 2013. firstname.lastname@example.org

Interpreting IPRs Exercise
Have a look at the three IPRs on the following pages.
What do the scores suggest about the students and how would you use this information to aid the teaching and learning process for each of them?
You are given data relating to an institution where students completed the ALIS computer adaptive test. These students were chosen because they show significant differences between the various parts of the test. Remember that scores are standardised around a mean of 100 (see the sketch after these questions).
a) Are there any apparent mismatches between the subjects being followed and this data?
b) What support can be given to those students who have weaknesses in Vocabulary or Mathematics?
c) How might predictions made for these students be tempered in the light of the inconsistencies in the test components and missing average GCSE points scores?
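As a reminder of what "standardised around 100" means, here is a minimal sketch in Python. It assumes the conventional scaling of mean 100 and SD 15 (CEM's actual norming uses its own national reference cohort), and the raw scores are invented:

```python
# Minimal sketch: how a standardised score around 100 is produced.
# Assumes the conventional scaling (mean 100, SD 15); the real norming
# is done against CEM's national reference cohort.

def standardise(raw: float, cohort_mean: float, cohort_sd: float) -> float:
    """Convert a raw test score to a standardised score (mean 100, SD 15)."""
    return 100 + 15 * (raw - cohort_mean) / cohort_sd

# Example: a raw score one SD above the cohort mean standardises to 115.
print(standardise(raw=62, cohort_mean=50, cohort_sd=12))  # 115.0
```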
a) What are the strengths and weaknesses of this A/AS level student?
To use the IPR (Individual Pupil Record), familiarise yourself with the terms standard score, band, stanine, percentile and confidence band (see the sketch after this exercise).
b) This student chose English, Film Studies, Music Technology and Psychology.
Is this a good choice? Do you foresee any problems?
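For those terms, here is a hedged sketch of how a single standard score maps onto the other quantities, assuming scores are normally distributed with mean 100 and SD 15. The band cut-offs (A = top quartile down to D = bottom quartile) and the SEM used for the confidence band are illustrative assumptions, not CEM's published values:

```python
# Sketch of the vocabulary on an IPR, assuming normal scores with
# mean 100 and SD 15. Band letters and the SEM are assumed values.
import math

MEAN, SD = 100.0, 15.0

def percentile(score: float) -> float:
    z = (score - MEAN) / SD
    return 50.0 * (1.0 + math.erf(z / math.sqrt(2.0)))

def stanine(score: float) -> int:
    z = (score - MEAN) / SD
    return min(9, max(1, round(z * 2 + 5)))   # nine bands, each 0.5 SD wide

def band(score: float) -> str:
    p = percentile(score)
    return "A" if p >= 75 else "B" if p >= 50 else "C" if p >= 25 else "D"

def confidence_band(score: float, sem: float = 5.0) -> tuple[float, float]:
    # ~90% band: score +/- 1.645 standard errors of measurement (SEM assumed)
    return (score - 1.645 * sem, score + 1.645 * sem)

for s in (85, 100, 123):
    print(s, band(s), stanine(s), f"{percentile(s):.0f}th", confidence_band(s))
```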
Here is the Individual Pupil Record from the ALIS computer adaptive test done in Year 12 for a current Year 13 student.
This student had a high positive value added in every GCSE subject as measured using MidYIS as a baseline.
(Average GCSE score 7.44)
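For context, the average GCSE score here is on the old letter-grade points scale (A* = 8, A = 7, B = 6, and so on), so 7.44 sits between straight As and straight A*s. A quick check with an invented grade profile, not her actual results:

```python
# Average GCSE score on the old points scale: A* = 8, A = 7, B = 6, ...
POINTS = {"A*": 8, "A": 7, "B": 6, "C": 5, "D": 4, "E": 3, "F": 2, "G": 1}

grades = ["A*"] * 5 + ["A"] * 3 + ["B"]   # hypothetical profile
avg = sum(POINTS[g] for g in grades) / len(grades)
print(round(avg, 2))   # 7.44
```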
On the next page are her A level predictions and chances graphs.
Why are the predictions different?
Are the chances graphs useful here?
Using the PARIS software and tweaking the predictions for these subjects' prior value added, A*s are predicted in three of the four from a GCSE baseline. If we did the same from the adaptive test baseline, solid Bs might be predicted in all three.
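The mechanics behind the two different predictions can be sketched as two separate regression lines, one per baseline. The coefficients and the adaptive test score below are hypothetical, purely for illustration; PARIS uses CEM's actual subject-level regressions:

```python
# Each baseline feeds its own regression line, so the same student can
# get very different predictions. Points use the old UCAS A level
# tariff (A* = 140, A = 120, B = 100). All coefficients are invented.

def predict_points(baseline: float, slope: float, intercept: float) -> float:
    """A level points predicted from a baseline via a linear regression."""
    return slope * baseline + intercept

gcse_pred = predict_points(baseline=7.44, slope=23.0, intercept=-35.0)  # ~136, near A*
test_pred = predict_points(baseline=108.0, slope=1.1, intercept=-18.0)  # ~101, a solid B
print(gcse_pred, test_pred)
```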
It is also worth looking at the value added at GCSE; see the commentary.
The differences between predictions from the GCSE baseline and from the computer adaptive test are interesting for some students, and they can run in either direction. Here there has been very large value added at GCSE, which may or may not be sustainable at A level. This student's history is shown below.
The value added here at GCSE is between 1 and 2 grades (measured against all-institution data from Year 7) and significantly positive across subjects (measured against the independent school data from Year 9).
If we measure this student's value added next year from an average GCSE score of 7.44, it will not tell the whole story; we also need to look at the value added from the adaptive test.
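A hedged illustration of why the baseline matters: value added is simply achieved points minus predicted points, and the same outcome can point in opposite directions depending on the baseline. All the numbers here are hypothetical:

```python
# The same A level outcome can look like negative value added against
# the GCSE baseline (which already "banks" the large value added made
# at GCSE) yet strongly positive against the adaptive test baseline.

def value_added(actual_points: float, predicted_points: float) -> float:
    return actual_points - predicted_points

actual = 120.0                     # suppose she achieves an A (old tariff: 120 points)
print(value_added(actual, 136.1))  # vs GCSE baseline prediction: -16.1
print(value_added(actual, 100.8))  # vs adaptive test baseline prediction: +19.2
```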
The chances graphs should be used with extreme caution here, and a growth mindset is vital if they are used with students.
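To see why the chances graphs need caution, here is a minimal sketch of what one is: the predicted points distribution cut into grade bands. It assumes outcomes are normally distributed around the prediction with an invented spread (SD 25 points) and cut-offs midway between old UCAS tariff grades; CEM's actual chances come from historical cohort data:

```python
import math

# Grade bands: lower cut-off in points, halfway between adjacent grades
# on the old UCAS tariff (A* = 140, A = 120, B = 100, ...).
GRADE_BANDS = [("A*", 130.0), ("A", 110.0), ("B", 90.0),
               ("C", 70.0), ("D", 50.0), ("E", 30.0)]

def normal_cdf(x: float, mu: float, sd: float) -> float:
    return 0.5 * (1.0 + math.erf((x - mu) / (sd * math.sqrt(2.0))))

def chances(predicted: float, sd: float = 25.0) -> dict[str, float]:
    """Probability of each grade, assuming a normal spread around the prediction."""
    probs, upper = {}, 1.0
    for grade, lower in GRADE_BANDS:
        p_below = normal_cdf(lower, predicted, sd)
        probs[grade] = upper - p_below   # mass between this cut-off and the one above
        upper = p_below
    probs["U"] = upper                   # whatever is left below the E cut-off
    return probs

for grade, p in chances(predicted=118.0).items():
    print(f"{grade:>2}: {p:5.1%}")
```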
Heads of Department may object:
"This will be interpreted as a personalised prediction."
"The data doesn't work for this particular student."
"You're raising false expectations – he'll never get that result."
"You're making us accountable for guaranteeing particular grades – when the pupils don't get them we'll get sacked and the school will get sued."
Remind them that:
Baseline data can give useful information about a pupil's strengths and weaknesses, which can assist teaching and learning.
“Predictions” are not a substitute for their professional judgement
Reassure them that:
It is not a “witch hunt”
Value added data is used to assess pupil performance, not teacher performance!