
Key Changes and Timelines for Revised Annual Monitoring Reporting (AMR) and Validation and Approval (V&A) Processes

Dominique Davison & Dr Doug Carr


Presentation Transcript


  1. Key Changes and Timelines for Revised Annual Monitoring Reporting (AMR) and Validation and Approval (V&A) Processes
  Dominique Davison & Dr Doug Carr

  2. Objectives for Session:
  • review drivers for change
  • review key aspects of revisions to V&A & AMR processes
  • review new timelines associated with V&A & AMR processes
  • contextualise the Faculty response to the initiatives

  3. Drivers for Change:
  • Changing (external) attitudes about the management of quality (enhancement & risk focused)
  • Internal refinements to the way we manage quality: Quality Strategy (AB/90/06), approved by AB in 2006
  • The recognised need to move away from systems of quality inherited from the CNAA and built up over years of being overly cautious and focused on finding faults

  4. New Approach to Validation & Approval [1]: key problems with the “old” approach
  • Too onerous for academic and administrative staff
  • Based on process rather than academic debate
  • Not responsive to external demands
  • Not fit for purpose (e.g. flexibility)
  • Did not reflect the Quality Strategy (with its risk focus and emphasis on enhancement)

  5. New Approach to Validation & Approval [2]:
  • Planning Approval: the Planning Approvals Panel is replaced by the Academic Development Committee (ADC)
  • Planning Approval for ‘internal’ programmes: conferred to FMB, with QED identifying approval processes and support needs
  • Planning Approval for ‘collab’ programmes: conferred to ADC, which identifies approval processes

  6. New Approach to Validation & Approval [3]:
  • Validation and Approval processes: 4 potential routes - University Panel, Faculty Panel, fast-track & paper-based
  • Validation Panels: drawn from the Validation and Review Standing Panel (VARSP) and set up by QED; variations from the current constitution (e.g. independent chair; 2 externals)

  7. New Approach to Validation & Approval [4]:
  • Streamlined validation documentation:
    - Programme Handbook (which incorporates the Programme Specification)
    - Matrix + Module Specs (including module-specific resources)
  • Distributed Learning: withdrawal of the External Peer Review process

  8. New Approach to Validation & Approval [5]:
  • Collaborative arrangements:
    - Revised Operational Manual
    - Use of the Accredited Lecturer form to approve partner staff teaching on programmes
  • Support for Development Teams:
    - by QED (e.g. newly created post: Sandy Cope)
    - by EHS (e.g. ‘EHS QA leads’)

  9. New Approach to Validation & Approval [6]: key timelines for new approach
  • EHS Executive approval of FMB submission and business plan before FMB
  • Validation date: agreed with QED (8 wks after planning approval)
  • QED Programme Structure (PS) Check: 3 wks before validation date
  • 3 development sessions with EHS QA lead (every 3 wks, starting from 14 wks before validation)
  • Final EHS review: 5 wks before validation
  • Revisions/response to panel: 4 wks after validation
  • Final sign-off by Chair: 6 wks after validation

  10. New Approach to Annual Monitoring [1]: key problems with the “old” approach
  • retrospective and overly descriptive in nature (which tended to discourage a forward-looking approach)
  • review of monitoring was time-consuming
  • “dilution” of issues
  • little formal incentive for subject areas to respond to programme-level issues
  • lack of a formal (and necessary) link to business planning

  11. New Approach to Annual Monitoring [2]: key changes for new approach
  • Remove the requirement to use module reports
  • Revise the programme report (+ collaborative report if applicable) so that it becomes more forward-looking & risk-aware
  • Introduce a School-level report
  • Introduce Facilities Reports to help Faculties in their planning
  • Introduce a Faculty Supplement which links to Faculty Development Plans

  12. New Approach to Annual Monitoring [3]: key dates for new approach
  • Monday 1st Oct: Collaborative partner reports (for UG programmes) submitted to Programme Leaders
  • Monday 22nd Oct: Submission of Programme Reports for ALL UG programmes to Heather Kemp
  • Monday 29th Oct: Reports from collaborative partners (for PG programmes) submitted to Programme Leaders
  • Monday 26th Nov: Submission of Programme Reports for ALL PG programmes to Heather Kemp
  • Friday 21st Dec: Submission of School Reports (authored by Heads of Schools) to Heather Kemp
  • Thursday 31st Jan: Submission of Faculty Supplement (authored by the Dean) to QED

  13. Closing thoughts on the new approach:
  • Aspects of the new processes for AMR & V&A will be refined in both the short and longer term
  • It will take some time to embed these new processes and to see the benefits
  • Change will require a cultural shift
  • The focus of the Faculty response to the changes will be supportive and facilitative

  14. Key Changes and Timelines for Revised Annual Monitoring Reporting (AMR) and Validation and Approval (V&A) Processes
  Dominique Davison & Dr Doug Carr
