
PERFORMANCE MONITORING USING VALUE ADDED DATA (Post – 16)



Presentation Transcript


  1. PERFORMANCE MONITORING USING VALUE ADDED DATA (Post – 16) Keith Murdoch

  2. Overview of Presentation • Context: The College, the locality and some political drivers • An overview of Woodhouse College’s Performance Monitoring approach • Performance Monitoring • College • Departments • Students • Concluding musings

  3. Woodhouse College: • 1,145 16-19 full time students • 99% A – level provision • 58% Female • 54% BME • Enrolments from 140+ schools

  4. Value Added & the Common Inspection Framework Outcomes for Learners • Learners’ attainment and progress Quality of Provision • Effectiveness of teaching, training and assessment in supporting learning and development • Effectiveness of the care, guidance and support learners receive Leadership and Management • Raising expectations and promoting ambition • Actively promoting equality and diversity to narrow the achievement gap • Effectiveness of self assessment

  5. PERFORMANCE MONITORING – Key Elements (using value added data): • Monitoring Student Progress / Student Reviews • Lesson Observation • Internal Inspections • Every Child Matters • Diversity and Equality • Self Assessment: Department / College

  6. SELF ASSESSMENT • How well do you know your Department / College? • How well do you know your data? • Can you accurately identify your weaknesses? • Do you have the capacity to make improvements? • Can you provide evidence to demonstrate improvements?

  7. Performance Monitoring Using ALIS to monitor achievement & attitudes, with feedback at 3 levels. How are we doing and how do we know? ALIS helps us measure: • Students – Progress on Course • Departments – Achievement • Whole College – Achievement & Attitudes What are we doing about the poor bits? e.g. Target Setting, Action Plans, Operational Plans, Strategic Plans

  8. Strategic & Operational Planning (cycle diagram): Data Capture (GCSE, A/AS, Attitudes) → Achievement Feedback → Dept. SAR + Action Plans → Whole College SAR → Curriculum Quality Monitoring & Internal Inspections, supported by Yr 12 & Yr 13 Reviews, MAGs, and on-course student assessment and monitoring, subject by subject

  9. Monitoring the overall performance of the COLLEGE Summative monitoring by Senior Leaders and Governors

  10. Report to Governors

  11. Monitoring the Performance of DEPARTMENTS • SELF ASSESSMENT: • summative monitoring by departments of their own performance using fair comparisons of achievement levels. • Analysing student achievement by subject

  12. Key Performance Indicators

  13. SELF-ASSESSMENT: DEPARTMENTS • ALIS DATA ANALYSIS • Summary of Raw Results • Comparison of raw subject A-level results with the national percentage for the subject • Analysis by GCSE Score • In terms of residuals, how have students in different bands been performing? • Does a preponderance of students in any one band help explain the overall residual? • Analysis by Ethnic Minority and Gender • In terms of actual scores and standardised residuals, how have students from different ethnic minority/gender groups performed? • Are the differences significant? • Analysis of Extreme Cases • Can we identify common features within the high and low achieving groups? • Do the extremes distort the overall picture of the subject performance? • Analysis of Variance • An explanation of data points which lie outside the control lines • A comment on the moving average • Analysis by Teaching Group • Is there any apparent correlation between set and residual or, over the 3 year period, between member(s) of staff teaching a set and the residuals achieved, after sets have been analysed for the range of ability? • Students’ Attitudes and Learning and Teaching Processes • Analysis of trends in students’ attitudes to the subject • Issues raised by analysis of perceived learning activities.

  14. Analysis by GCSE Score In terms of residuals, how have students in different bands been performing? Does a preponderance of students in any one band explain the overall residual? How does your analysis impact on strategies for teaching and learning?
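The banded-residual analysis described above can be sketched as follows; the band boundaries used here are illustrative assumptions, not ALIS’s actual bands:

```python
from collections import defaultdict

# Illustrative sketch of 'Analysis by GCSE Score': group student residuals
# by AVGCSE band and compute the mean residual per band.
# NOTE: the band boundaries below are assumptions for illustration only.
def mean_residual_by_band(students, bands=(5.0, 6.0, 7.0)):
    """students: iterable of (avg_gcse, residual) pairs.
    Returns {band_index: mean residual}, band 0 being the lowest AVGCSE band."""
    groups = defaultdict(list)
    for avg, res in students:
        band = sum(avg >= b for b in bands)  # number of boundaries passed
        groups[band].append(res)
    return {band: sum(rs) / len(rs) for band, rs in groups.items()}
```

A preponderance of students in a band with a strongly negative mean residual would then be visible at a glance, addressing the slide’s second question.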

  15. Analysis by Gender & Ethnic Minority In terms of standardised residuals, how have students from different ethnic minority/gender groups performed? Are the differences significant? What impact will these differences have on your teaching and learning strategies?

  16. Analysis of Extreme Cases An extreme is a student with a raw residual of + or – 30 (15 for an AS) Can we identify common features within the low and high achieving groups? Do the extremes distort the overall picture of the subject performance?
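A minimal sketch of the extreme-case filter, using the slide’s thresholds of ±30 raw residual (±15 for an AS):

```python
# Identify 'extreme cases' as defined on the slide: students whose raw
# residual is +/-30 (A level) or +/-15 (AS).
def extremes(residuals, threshold=30):
    """residuals: mapping of student name -> raw residual.
    Returns (low, high) dicts of students at or beyond the threshold."""
    low = {s: r for s, r in residuals.items() if r <= -threshold}
    high = {s: r for s, r in residuals.items() if r >= threshold}
    return low, high
```

Comparing the subject residual with and without these students shows whether the extremes distort the overall picture.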

  17. Student Level Residuals

  18. Analysis of Variance An explanation of data points which lie between the control lines A detailed explanation of data points which lie outside the 3 SD control lines A comment on the moving average
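Flagging the points that fall outside the control lines can be sketched as follows, assuming residuals are reported per year and the subject’s standard deviation is supplied:

```python
# Flag yearly residuals lying outside the +/- n_sd * sd control lines,
# as used in the 'Analysis of Variance' step above.
def outside_control_lines(yearly_residuals, sd, n_sd=3):
    """yearly_residuals: iterable of (year, residual) pairs.
    Returns the points beyond the n_sd-standard-deviation control lines."""
    limit = n_sd * sd
    return [(year, r) for year, r in yearly_residuals if abs(r) > limit]
```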

  19. KEY ELEMENT of SELF-ASSESSMENT ANALYSIS Identifying strengths and weaknesses; understanding your weaknesses and identifying actions for improvement; impact on strategies for teaching and learning and supporting students

  20. Using ALIS data to identify and ‘unpack’ weaknesses e.g. AS Physics: -0.28 Standardised Residual (-0.7 Female, -0.1 Male; -0.64 without Maths, -0.19 with Maths) • Work with Maths Department on creating resources to support students not taking AS Maths • Introduce problem-solving consolidation sessions • Further investigation of girls’ underachievement – Institute of Physics, Standards Unit, focus groups etc.

  21. e.g. … contd. -0.23 Standardised Residual • 3 distinct groups • Additionality: 0.1 • Full programme : -0.8 • Applied AS: -0.5 AS Critical Thinking • Restructure external assessment of course – January module • Increase hours for Applied AS students, formalise requirements, ‘integrate’ into main programme • Review appropriateness

  22. Monitoring the Performance of STUDENTS to raise aspirations and achievement

  23. You know you’ve gone to Woodhouse when..... MAG/CAG. They do mean something.

  24. PROBLEM Motivating and monitoring student progress is a FORMATIVE process, while ALIS is RETROSPECTIVE, SUMMATIVE and STATISTICAL. HOW CAN WE ‘SQUARE THE CIRCLE?’ Possible because correlations are high and the variation in the association between AVGCSE and the statistically ‘PREDICTED’ A-Level grade, from one year to the next, in any given subject, tends to be very SMALL

  25. KEY PROCESSES • Departmental Monitoring and Assessment Practices which feed into • Student Reviews • Scheduled monitoring of a student’s progress across their programme • Mutually supporting academic and pastoral functions using COMMON DATA MINIMUM ACCEPTABLE GRADES

  26. BENCHMARKING and TARGET SETTING: Defining Grades • MAG: Minimum Acceptable Grade The statistically predicted grade for each subject (Scale A/B – E [E] based on the ALIS trend line) which will not change during the year. Provides an initial benchmark (with associated health warnings) against which a student’s progress can be judged. • CAG: Current Achievement Grade The grade a student is currently working at.

  27. Defining Grades (contd.) • TAG: Target Achievement Grade The grade (above the MAG) a student is considered capable of working at and should aim to achieve. Student Review discussions (where appropriate) would be focused on negotiating strategies to enable students to attain this grade. • PG: Predicted Grade The grade that is written on a student’s UCAS form and subsequently changed or confirmed as part of the return to Examination Boards

  28. CALCULATING THE MINIMUM ACCEPTABLE GRADE The student MAG for each subject is based on the previous year’s ALIS ‘trend-line’ for that subject Principle = ‘good enough for purpose, not statistically flawless’ • Example: • Lisa Fry has GCSE grades 2A*, 3A, 4B and 1C • Total Points = 66 (i.e. 2x8 + 3x7 + 4x6 + 1x5) • AVGCSE = 66 / 10 = 6.6 • MAG points for the subject = (15.78 x 6.6) – 62.65 = 41.5 • MAG = C (at AS)
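The worked example above can be sketched in Python. The GCSE points scale (A* = 8 down to C = 5) and the trend-line slope and intercept are taken from the slide; extending the points scale below C is an assumption, and each subject would use its own trend line:

```python
# GCSE points scale as used in the slide's example (A* = 8 ... C = 5);
# the values below C are an illustrative extension of that scale.
GCSE_POINTS = {"A*": 8, "A": 7, "B": 6, "C": 5, "D": 4, "E": 3, "F": 2, "G": 1}

def avg_gcse(grades):
    """Average GCSE points score across all subjects taken."""
    return sum(GCSE_POINTS[g] for g in grades) / len(grades)

def mag_points(avg, slope=15.78, intercept=-62.65):
    """Points on the subject's ALIS trend line for a given AVGCSE.
    The slope and intercept are the example's values; in practice each
    subject has its own trend line from the previous year's ALIS data."""
    return slope * avg + intercept

# Worked example: Lisa Fry, 2 A*, 3 A, 4 B, 1 C across 10 GCSEs
lisa = ["A*"] * 2 + ["A"] * 3 + ["B"] * 4 + ["C"]
points = mag_points(avg_gcse(lisa))  # about 41.5, giving MAG = C at AS
```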

  29. Technical problem! The calculation produces a decimal number, which needs to be translated into UCAS grades via the College Conversion Table • No student can be targeted to attain an A* or A grade • No student can be targeted to ‘fail’, i.e. below grade E
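The conversion step might look like the sketch below. The points boundaries are purely illustrative assumptions (the College’s actual Conversion Table is not shown in the transcript), but the two policy constraints from the slide are built in: the scale tops out at B, and no target falls below E:

```python
# Hypothetical conversion from MAG points to a UCAS-style AS grade.
# The cutoff values are illustrative assumptions, NOT the College's
# actual Conversion Table. The slide's two policy clamps are applied:
# no A*/A target (scale starts at B) and no target below grade E.
def points_to_mag(points, boundaries=((50, "B"), (40, "C"), (30, "D"))):
    for cutoff, grade in boundaries:
        if points >= cutoff:
            return grade
    return "E"  # floor: no student is targeted to 'fail'
```

Whatever the real boundaries are, clamping the output range is the design point: the table enforces policy as well as arithmetic.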

  30. Progress in Individual Subjects • College policy on the monitoring of student progress and achievement Framework for • Department Policies • setting and marking students’ work • monitoring students’ progress (target setting) • use of subject tutorial period • Supporting Teaching and Learning

  31. Student Name MAG: B

  32. Student Review Cycle

  33. Start of the Student Review Process September: AVGCSE score used to calculate MAG (AS) subject by subject; included on set lists and programme record card October: Autumn Review – with Personal Tutors Scale: 1 = performing outstandingly 2 = satisfactory 3a/b = study skills &/or conceptual problems 4 = ‘alarm’ signal 5 = very recently joined subject + Effort Grade 1 - 4 AS MAG is provided as benchmark Case Conference

  34. Student Review Data

  35. Tutor Programme Review Data

  36. Some musings…… • measurement gives messages • evolve your own • invest in the time • transparency as a tool not a threat • know the health warnings • embed • never take understanding for granted • trust the tribe
