Assessment and Monitoring Christine.Merrell@cem.dur.ac.uk www.cemcentre.org
Centre for Evaluation and Monitoring (CEM) • Part of Durham University • Monitoring Systems • 1.1 million assessments delivered each year • Pupils aged 3 – 18 years • CEM systems used in 44 countries
Why assess? • Newton (2007) identified 18 purposes for which educational judgements may be used • The list is not exhaustive and is expanding all the time • It offers an interesting perspective on the historical debate about formative and summative assessment
Why assess? • From birth ~ weight, hearing etc. • Qualifications ~ driving test, GCSEs, degree • Profile of strengths and weaknesses for planning appropriate learning experiences • Indicator of special educational needs • Monitor progress and attitudes of pupils and cohorts over time
Comparisons • Children within a class • Groups such as boys/girls • Classes within a year-group • Current cohorts with previous ones • Other schools within a consortium and nationally • Progress over time • Research • Within school • Nationally and internationally
Layers of information (different levels of detail): • Diagnostic at pupil-level • Group and class trends • School-level information (including trends over time) • Consortium/Authority-level
Ways to Assess and Cautions • Judgements • Observations (e.g. behaviour) • Objective tests (e.g. attainment) • Practical • Pencil and paper • Computer-delivered • Do you know the psychometric properties of the assessments that you use? Are these methods always reliable and valid? Consider the Early Years Foundation Stage Profile, the age 2 – 3 progress check and the Year 1 phonics check.
Standardised Assessment • Administration procedure • Comparison against representative norms
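As an illustration of how comparison against representative norms works in practice, here is a minimal sketch that converts a raw score to a z-score, a standardised score (mean 100, SD 15) and a percentile. The norm values, function name and example figures are assumptions for illustration, not CEM's actual scaling.

```python
from statistics import NormalDist

def standardise(raw_score, norm_mean, norm_sd):
    """Convert a raw score to a z-score, a standardised score
    (mean 100, SD 15) and a percentile, given norms estimated
    from a representative sample. Illustrative values only."""
    z = (raw_score - norm_mean) / norm_sd
    standardised = 100 + 15 * z
    percentile = NormalDist().cdf(z) * 100
    return z, standardised, percentile

# Hypothetical example: raw score 42 against assumed norms (mean 35, SD 8)
z, ss, pct = standardise(42, norm_mean=35, norm_sd=8)
print(f"z = {z:.2f}, standardised score = {ss:.0f}, percentile = {pct:.0f}")
```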
Example of a high-level use of standardised assessment • Standards over time • Consistent content • Consistent sample of schools • Merrell, C. and Tymms, P. (2011) 'Changes in Children's Cognitive Development at the Start of School in England 2001–2008', Oxford Review of Education, 37(3), pp. 333–345.
Initiatives in the Early Years • Foundation Stage • Age 3 – 5 years • Increased nursery provision • Curriculum • Assessment • Sure Start • Education Action Zones • and other programmes • Have these initiatives changed children's cognitive development at the start of school?
PIPS On-entry Baseline Assessment • Computer-delivered assessment of: • Vocabulary • Phonological awareness & early reading • Early mathematics • Additional data: • Date of birth • Sex • First language
Test/Re-test Reliability = 0.98 • Internal Reliability (Cronbach’s alpha) = 0.94 • Correlation with attainment at age 11 = 0.68
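For readers who want to check similar psychometric properties of their own assessments, the sketch below computes Cronbach's alpha from an item-response matrix and test/re-test reliability as the correlation between two sittings. The data are randomly generated placeholders, not the PIPS figures quoted above.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_pupils, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

def test_retest(first, second):
    """Test/re-test reliability: Pearson correlation between two sittings."""
    return np.corrcoef(first, second)[0, 1]

# Illustrative random data only (not the PIPS dataset)
rng = np.random.default_rng(0)
ability = rng.normal(size=200)
items = ability[:, None] + rng.normal(scale=0.8, size=(200, 20))
first_sitting = items.sum(axis=1)
second_sitting = first_sitting + rng.normal(scale=1.0, size=200)
print(f"alpha = {cronbach_alpha(items):.2f}")
print(f"test/re-test r = {test_retest(first_sitting, second_sitting):.2f}")
```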
Sample • 472 state schools in England
Mean PIPS Raw Scores [chart] • Effect sizes: 0.11 and -0.07
Main Effects & Interactions • GLMs to analyse changes in BLA scores in relation to: • Year • EAL • Sex • Age at test was entered as a covariate
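A minimal sketch of the kind of model this slide describes, using statsmodels' formula interface: the baseline score is modelled on year, EAL and sex (with their interactions) plus age at test as a covariate. The variable names and synthetic data are assumptions for illustration, not the study's actual dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic illustrative data: one row per pupil with a baseline score (bla),
# cohort year, EAL status, sex and age at test in months.
rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "year": rng.choice(np.arange(2001, 2009), n),
    "eal": rng.choice(["yes", "no"], n),
    "sex": rng.choice(["M", "F"], n),
    "age_months": rng.normal(54, 4, n),
})
df["bla"] = 20 + 0.3 * df["age_months"] + rng.normal(0, 5, n)

# Main effects and interactions of year, EAL and sex, with age as a covariate.
model = smf.ols("bla ~ C(year) * C(eal) * C(sex) + age_months", data=df).fit()
print(model.summary())
```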
Conclusions • Statistically significant decrease from 2001 to 2008 for early reading and picture vocabulary. However, the effect sizes of the differences were small. • Significant increase in early maths scores although again the effect size was small.
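The "small effect sizes" reported here are standardised mean differences; a common way to compute one is Cohen's d with a pooled standard deviation, sketched below. The two cohorts are randomly generated placeholders, not the 2001 and 2008 data.

```python
import numpy as np

def cohens_d(group_a, group_b):
    """Standardised mean difference using the pooled standard deviation."""
    a, b = np.asarray(group_a, float), np.asarray(group_b, float)
    pooled_sd = np.sqrt(((len(a) - 1) * a.var(ddof=1) + (len(b) - 1) * b.var(ddof=1))
                        / (len(a) + len(b) - 2))
    return (b.mean() - a.mean()) / pooled_sd

# Illustrative data only
rng = np.random.default_rng(2)
cohort_2001 = rng.normal(50, 10, 500)
cohort_2008 = rng.normal(51, 10, 500)
print(f"d = {cohens_d(cohort_2001, cohort_2008):.2f}")
```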
Why Should the BLA Scores Remain So Stable Over Time? • Success from early interventions aimed at influencing the development of young children is difficult to achieve • Small-scale programmes often lose efficacy when rolled out at scale • The data analysed have limitations, but this is nevertheless a large dataset that adds to current studies of trends over time
Thank you Christine.Merrell@cem.dur.ac.uk