
Presentation Transcript


  1. Data recording for success rate – College MI Discussion Event. Nick Linford. 16th November 2009, Aston University, Birmingham

  2. Plan for the day
  10.00 Welcome and introductions
  10.10 The background
  10.30 Data recording examples – can we categorise?
  11.15 Break for drinks and biscuits
  11.30 Data recording and audit experiences
  12.30 Lunch
  13.30 Warm welcome to Pete Ashton from The IA
  14.30 Launch of ADaM, then break for cake
  15.00 More time with Pete (incl. 2010/11 changes)
  16.00 End

  3. Welcome and introductions
  Nick Linford and you: 110 delegates, representing 78 colleges
  [Picture taken at the event]
  What happens after today?
  > Nick will circulate the slides and a written report
  > Nick will give findings and queries to the LSC project group

  4. The background

  5. Background: success rates
  [Chart: success rates (FE headline overall) by academic year, 97/98 to 07/08, rising from 53% to 80%]
  An average increase of about three percentage points per year ((80 − 53) / 10 ≈ 2.7)

  6. Background: pressures
  2004/05: Plan-led funding introduced
  > No funding for over-achievement
  > Reduction in frequency and depth of ILR audit
  2007/08: Minimum Levels of Performance introduced
  2008/09: Success Factor introduced using 06/07 data
  2008/09: Framework for Excellence introduced
  Dec 2008: Baby P case exposes Ofsted reliance on data – “A largely data-based [Ofsted] review of the entire council judged it good”
  Mar 2009: Ofsted ask LSC to investigate college data

  7. Background: the letter
  May and June ’09: KPMG visit seven colleges for report
  Sept 23rd: LSC write to all colleges regarding the report and ‘inconsistent and sometimes inappropriate reporting’
  October 16th: story runs in the TES newspaper
  Mid October: David Willetts requests the report in a Parliamentary question, after receiving a letter from a whistleblower claiming the practice is “endemic”
  End Oct: Summary of report placed on the web by Parliament
  November 6th: follow-up story in the TES

  8. Background: the report
  Particular attention was paid to variances between the F04 and F05 returns
  Of 369 colleges, the average difference in starts between F04 and F05 was 5.8%; 42 colleges had a difference of more than 10%
  Six colleges with big inconsistencies were visited, plus one without
  52 (74%) of reviewed learning aims were not fully verified
  Up to 40% success rate differences for selected aims at one college
  “a worryingly high % of data errors and inconsistencies”
  No comment or response in the report from the seven colleges
  But a very selective sample, with less than 0.1% of all 16-18 starts reviewed – so is it really possible to say “endemic”?
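To make the variance test concrete, here is a minimal sketch of the kind of check described above: count learning-aim starts per college in each return and flag any college whose F04 and F05 counts differ by more than 10%. The CSV layout, the "college_id" column and the file names are illustrative assumptions, not the actual ILR format or the KPMG methodology.

```python
# Minimal sketch, not the LSC/KPMG methodology: flag colleges whose
# learning-aim start counts differ by more than 10% between the F04 and
# F05 returns. The CSV layout and column name are assumed for illustration.
import csv
from collections import Counter

def count_starts(path: str) -> Counter:
    """Count one start per row, grouped by college, in a CSV extract."""
    starts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            starts[row["college_id"]] += 1
    return starts

def flag_variances(f04_path: str, f05_path: str, threshold: float = 0.10):
    f04, f05 = count_starts(f04_path), count_starts(f05_path)
    for college in sorted(f04):
        a, b = f04[college], f05.get(college, 0)
        if a and abs(a - b) / a > threshold:
            print(f"{college}: F04={a} F05={b} ({(b - a) / a:+.1%})")

# Example usage, assuming the two hypothetical extracts exist on disk:
# flag_variances("f04_starts.csv", "f05_starts.csv")
```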

  9. Data recording examples

  10. Data examples: in report
  Can we agree that these are always inappropriate?
  • Selective removal of starts when over funding target
  • Recording starts late, and only when likely to achieve
  • Additional qualifications removed when not achieved
  • Changing end dates retrospectively to delay an outcome, or to make a failed long qual short or an achieved short qual long
  • Coding October as the last date of attendance when not true
  • Non-LSC aims selectively excluded from the ILR
  • Coding as transfer at or near the planned end date (a simple automated check for this is sketched below)
  • Late learning aim changes, after the ‘funding start period’
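As flagged in the list above, here is a minimal sketch of one such automated check: spotting transfer codes recorded at or near the planned end date. The record layout, the "transfer" status value and the 14-day window are assumptions for illustration, not the ILR specification.

```python
# Minimal sketch of one pattern from the list above: transfers coded at or
# near the planned end date. Field names, the "transfer" status value and
# the 14-day window are assumptions for illustration only.
from datetime import date

AIMS = [
    {"learner": "L001", "status": "transfer",
     "actual_end": date(2009, 6, 28), "planned_end": date(2009, 6, 30)},
    {"learner": "L002", "status": "transfer",
     "actual_end": date(2009, 1, 15), "planned_end": date(2009, 6, 30)},
]

def suspicious_transfer(aim, window_days: int = 14) -> bool:
    """Flag a transfer recorded within `window_days` of the planned end."""
    gap = abs((aim["planned_end"] - aim["actual_end"]).days)
    return aim["status"] == "transfer" and gap <= window_days

for aim in AIMS:
    if suspicious_transfer(aim):  # only L001 (a 2-day gap) is flagged
        print(f"Review {aim['learner']}: transfer on {aim['actual_end']}, "
              f"planned end {aim['planned_end']}")
```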

  11. Data examples: in letter
  The letter emphasised four things – any grey areas?
  • All LSC and non-LSC funded enrolments in the ILR
  • Once submitted as funded, this must not be changed
  • Planned end date should not be altered
  • Transfers must be transferred to something
  Is there a tension between accuracy (the first principle in the LSC letter) and recording only planned outcomes?

  12. Data examples: Key Skills
  “In order to achieve the full Key Skills qualification a learner has to undertake and achieve an end-test and portfolio of evidence. However, learners who achieve the Key Skills end-test and thus partially achieve the Key Skills qualification are included in the count towards the target. This is because the Key Skills end tests at levels 1 and 2 draw on the same set of questions as the end test for the Certificate in Adult Literacy and Certificate in Adult Numeracy at levels 1 and 2 (also known as the national test).”
  LSC Delivering Skills for Life Fact sheet number 9 (May 2009)
  Are Key Skills learning aims being changed to basic skills?

  13. Data examples: KS and ESOL
  KS achievement of portfolio and test at different levels:
  “Candidates who achieve the two components at different levels can be awarded the qualification at the lower level achieved (for example a candidate who passes the test at level 2 but whose portfolio only meets level 1 requirements can be awarded the qualification at level 1).”
  http://www.qcda.gov.uk/6466.aspx
  “Providers may wish to consider recording learners’ achievement level the same as the level of the lowest achieved unit, as is the case for the Certificate in Adult Numeracy and Certificate in ESOL.”
  LSC Delivering Skills for Life Fact sheet number 9 (May 2009)
  Grey areas?
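The lower-of-the-two-components rule quoted above is mechanical enough to state directly. A minimal sketch; the function and argument names are illustrative, not from any real system:

```python
# Sketch of the QCDA rule quoted above: where the test and portfolio are
# achieved at different levels, the qualification is awarded at the lower
# of the two. Names here are illustrative.
def award_level(test_level: int, portfolio_level: int) -> int:
    return min(test_level, portfolio_level)

# The example from the QCDA text: test at level 2, portfolio at level 1.
assert award_level(test_level=2, portfolio_level=1) == 1
```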

  14. Data issues, Ofsted and Audit

  15. Background: Ofsted & audit
  Ofsted have been writing to colleges due an inspection regarding success rate data credibility (see text in your pack): “we will be putting in place the same tests on 07/08 data that informed the visit to the 7 colleges”
  The 75 LSC DLF LR funding audits are more rigorous because:
  > Reconciliation (data = funding) has been reintroduced
  > Questions were raised in the letter regarding data management
  > The LSC needs funding back, having over-allocated ALR
  > Budgets are getting tighter, so inflated claims can’t be afforded
  What are the likely ‘appropriate sanctions’?

  16. Welcome Pete Ashton & Ed Drake

  17. Introducing ADaM
  ADaM is new ILR software created in direct response to the LSC data letter. It compares ILR files and will help you identify any relevant changes – before audit and inspection.
  • Simplicity
  • Instant assessment
  • Detailed breakdown
  • Peace of mind
  • Compatibility
  • Multiple export formats
  Significant discount for colleges here today from www.drakelane.co.uk/adam
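ADaM itself is a commercial product; the slide describes only what it does, not how. Purely to illustrate the general idea of comparing two ILR snapshots and listing field-level changes per learning aim, here is a minimal sketch with invented keys and field names:

```python
# Not ADaM: an illustrative sketch of diffing two ILR-style snapshots,
# keyed by (learner, aim), and reporting any field whose value changed.
def diff_snapshots(before: dict, after: dict) -> None:
    """Each snapshot maps an aim key to a dict of field values."""
    for key in sorted(before.keys() & after.keys()):
        for field, old in before[key].items():
            new = after[key].get(field)
            if new != old:
                print(f"{key}: {field} changed {old!r} -> {new!r}")

before = {("L001", "AIM1"): {"planned_end": "2009-06-30",
                             "outcome": "continuing"}}
after = {("L001", "AIM1"): {"planned_end": "2009-03-31",
                            "outcome": "achieved"}}
diff_snapshots(before, after)  # reports both changed fields
```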

  18. Break – return at 15.00
  Note: Delegates returned questionnaires before lunch and the results were reviewed in the afternoon. The numbers here have been updated to include additional questionnaires returned at the end of the day.

  19. Results from questionnaire Q1. Do you think the guidance on LR ILR data collection for success rate purposes is clear? Answers: 63 Yes: 13% No: 87% Don’t know: 0%

  20. Results from questionnaire Q2. Were your college’s data recording practices contrary to guidance in Geoff Russell’s letter? Answers: 64 Yes: 63% No: 33% Don’t know: 5%

  21. Results from questionnaire Q3. If yes to Q2, were your 2008/09 success rates significantly (>2%) inflated, based on the letter? Answers: 50 Yes: 14% No: 70% Don’t know: 16%

  22. Results from questionnaire Q4. If yes to Q3, have you amended your college’s 2008/09 data as a consequence of the letter? Answers: 30 Yes: 13% No: 83% Don’t know: 3%

  23. Results from questionnaire Q5. If yes to Q4, will your 2008/09 success rates be lower than 2007/08 as a result of the changes? Answers: 21 Yes: 24% No: 62% Don’t know: 14%

  24. Results from questionnaire Q6. Will you be amending your college’s 2009/10 data collection practices as a consequence of the letter? Answers: 64 Yes: 56% No: 33% Don’t know: 11%

  25. Results from questionnaire Q7. In terms of the credibility of your success rate data, are you now concerned about an Ofsted inspection? Answers: 65 Yes: 31% No: 63% Don’t know: 6%

  26. Results from questionnaire Q8. Do you believe the Principal, Governors and senior team understand the importance of data credibility? Answers: 64 Yes: 50% No: 45% Don’t know: 5%

  27. Results from questionnaire Q9. On a different but data related topic, do you believe the A51a will become too complicated and unworkable? Answers: 62 Yes: 81% No: 6% Don’t know: 13%

  28. Results from questionnaire Q10. Once the guidance is clarified, would you trust the vast majority of other colleges to follow the rules? Answers: 63 Yes: 41% No: 38% Don’t know: 21%

  29. Data quality group
  Group to be established with LSC, IA, Data Service, Ofsted and college representation
  College representatives, after a call for volunteers by the AoC:
  • Paul Head, Principal, The College of North East London, Haringey and Enfield
  • Graham Taylor, Principal, New College Swindon
  • Graham Razey, Vice Principal, Sussex Coast College, Hastings
  • John Callaghan, Vice Principal, Derby College
  • Adrian Clarke, MI Manager, The Grimsby Institute of Further & Higher Education
  • Tracy Clarke, Data & Curriculum Planning Manager, Bolton Community College

  30. Any further questions? Stay in touch via www.twitter.com/nicklinford
