
The Design of a Quality Assurance System in Higher Education – Selecting Key Performance Indicators


Presentation Transcript


  1. The Design of a Quality Assurance System in Higher Education – Selecting Key Performance Indicators Kerstin V. Siakas Alexander Technological Educational Institute of Thessaloniki, Department of Informatics, Greece siaka@it.teithe.gr

  2. Structure of presentation
  • Towards an Open European Higher Education Area
  • Meeting the challenges – a pilot study of the Department of Informatics, ATEI of Thessaloniki, Greece
  • Experiences from the design of
    1. the Self-Assessment System
       • design of the self-assessment process
       • Key Performance Indicators (KPIs)
    2. the Quality Assurance System
       • investigating ISO 9001:2000 and the Balanced ScoreCard (BSC)
  • Lessons learnt
  • Future work

  3. Towards an Open European Higher Education Area
  • The Bologna Process:
    • a common frame for diverse national systems
    • transparent and mutually recognised educational systems
  • National quality assurance systems required in the member states by 2005, including:
    • a definition of the responsibilities of the bodies and institutions involved
    • evaluation of programmes or institutions, including internal assessment, external review, participation of students and the publication of results
    • a system of accreditation, certification or comparable procedures, international participation, co-operation and networking

  4. Aims and objectives of a quality assurance system in HE
  • To create a ground for:
    • visibility into the processes that support the study programme
    • measurement of learning outcomes, capabilities and competences (what the graduate is able to do)
  • To support a system of continuous improvement
  Consistent with the principle of institutional autonomy, the primary responsibility for quality assurance in HE lies with each institution itself.

  5. Quality Assurance
  • Educational assessment is built around four questions:
    • What are the objectives of the educational institution?
    • How does it try to achieve them?
    • How does it know that it has succeeded in achieving the objectives?
    • What changes does it make in order to improve?

  6. The Department of Informatics of the ATEI of Thessaloniki, Greece – Meeting the Challenge of the BOLOGNA PROCESS
  • A pilot project sponsored by the European EPEAEK II programme (Operational Programme "Education and Initial Vocational Training") was carried out in 2005-2006
  • The working group included lecturers & students
  • The aims of the project were the creation of a system for:
    • Curriculum Assessment
    • Quality Assurance
  • Final report submitted July 2006

  7. Phase One: Preparation
  • Project planning: definitions of goals, milestones, roles and responsibilities
  • Literature review: the Bologna Process, assessment in HE, existing laws & regulations, educational quality systems
  • Key indicator establishment: the Goal-Question-Metric (GQM) methodology

  8. Activities
  • Decision about goals and objectives of the assessment (why to measure)
  • Team building, roles and responsibilities (who will do what - assignments)
  • Setting up a timetable and milestones (integration of tasks)
  • Ensuring that educational processes and procedures are understood (agreement on vocabulary and meaning of different educational issues)
  • Selection of the research instrument (how to collect data)
  • Selection of the research population (where to find information, whom to ask)
  • Design of Key Performance Indicators (KPIs) (what to measure)
  • Design of targets for comparison (what the measures mean)

  9. Design of Key Performance Indicators (KPIs): Using the Goal-Question-Metric (GQM) Methodology
  • Goal: continuous briefing on new products / technologies
    • Question: How many presentations / informative seminars are realised?
      • Metrics: seminars organised by companies, the department, academic staff, students
    • Question: What is the participation rate in research?
      • Metrics: participation of academic staff (and students) in research programmes; number of publications
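  As a minimal sketch, the goal-question-metric refinement on this slide can be encoded as a small tree: a goal is refined into questions, and each question into the metrics that answer it. The Python below is illustrative only; the class names are mine, not part of GQM or of the project.

```python
from dataclasses import dataclass, field


@dataclass
class Metric:
    """A quantifiable measure that (partly) answers a question."""
    name: str


@dataclass
class Question:
    """A question that characterises progress towards a goal."""
    text: str
    metrics: list[Metric] = field(default_factory=list)


@dataclass
class Goal:
    """A measurement goal refined into questions and their metrics."""
    statement: str
    questions: list[Question] = field(default_factory=list)


# Encoding of the example on this slide.
briefing = Goal(
    statement="Continuous briefing on new products / technologies",
    questions=[
        Question(
            "How many presentations / informative seminars are realised?",
            [Metric("Seminars organised by companies, the department, "
                    "academic staff and students")],
        ),
        Question(
            "What is the participation rate in research?",
            [Metric("Participation of academic staff (and students) "
                    "in research programmes"),
             Metric("Number of publications")],
        ),
    ],
)

for question in briefing.questions:
    print(question.text, "->", [m.name for m in question.metrics])
```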

  10. KPIs: Study Programme (Curriculum)
  • Theory – tutorials – labs:
    • knowledge coverage of the discipline of Informatics
    • workload
    • quality of study material
    • assessment methodology
  • Industrial placement & final-year project:
    • suitability to the study programme
    • workload
    • scientific level
    • assessment methodology
  • Accomplishment of market requirements
  • Accomplishment of requirements for postgraduate studies

  11. KPIs: Academic Staff
  • Availability of academic staff
  • Level of knowledge in the discipline of Informatics
  • Teaching ability
  • Motivation ability
  • Research potential
  • Use of technical support material

  KPIs: Infrastructure
  • Building
  • Technological

  12. KPIs: Student Support
  • Secretarial
  • Information
  • Organisation of seminars, workshops etc.
  • Socrates – Leonardo da Vinci
  • Careers office
  • Library
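  Slides 10-12 list the KPI catalogue, and slide 8 adds that targets must be designed for comparison. A hypothetical sketch of how indicators and their targets could be held together and checked against measured values; every scale and number below is invented for illustration, not taken from the project.

```python
# Hypothetical KPI records grouped by assessment area. The scales and
# target values are invented examples, not those of the pilot study.
kpis = {
    "Study programme": [
        {"kpi": "Quality of study material", "scale": "1-5 Likert", "target": 4.0},
        {"kpi": "Assessment methodology",    "scale": "1-5 Likert", "target": 3.5},
    ],
    "Academic staff": [
        {"kpi": "Teaching ability",          "scale": "1-5 Likert", "target": 4.0},
    ],
    "Student support": [
        {"kpi": "Library satisfaction",      "scale": "1-5 Likert", "target": 3.5},
    ],
}


def flag_gaps(measured: dict[str, float]) -> list[str]:
    """Return the KPIs whose measured value falls short of its target."""
    gaps = []
    for area, indicators in kpis.items():
        for ind in indicators:
            value = measured.get(ind["kpi"])
            if value is not None and value < ind["target"]:
                gaps.append(f"{area}: {ind['kpi']} ({value} < {ind['target']})")
    return gaps


# Example: two KPIs measured, one below its target.
print(flag_gaps({"Quality of study material": 3.6, "Teaching ability": 4.2}))
```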

  13. Phase Two: Data Collection & Analysis
  • Data collection: questionnaires, interviews, observation, databases
  • Data analysis: quantitative (SPSS), qualitative
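  The pilot study did its quantitative analysis in SPSS. Purely as an illustrative alternative, the same kind of per-question summary of Likert-scale questionnaire responses could be computed with pandas; the column names and data below are invented.

```python
import pandas as pd

# Hypothetical questionnaire responses on a 1-5 Likert scale; in the
# pilot study the quantitative analysis was carried out with SPSS.
responses = pd.DataFrame({
    "quality_of_study_material": [4, 3, 5, 4, 2],
    "teaching_ability":          [5, 4, 4, 3, 4],
    "library_satisfaction":      [3, 3, 4, 2, 3],
})

# Per-question mean, standard deviation and response count.
summary = responses.agg(["mean", "std", "count"]).round(2)
print(summary)
```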

  14. The Survey

  15. Phase Three: Continuous Improvement
  • Dissemination of results: reports, internal meetings, workshops, publications (conferences, journals)
  • Defining an action plan: cater for commitment, financial support and what to improve

  16. Experiences from the self-assessment
  • Advantages:
    • improved experience in assessment
    • improved understanding of indicators for the quality assurance system
    • easier appreciation of necessary changes
  • Disadvantages:
    • increased workload
    • lack of commitment from management and colleagues

  17. Design of the Quality Assurance System (QA)
  • The results from the self-assessment were the underlying base for the design of the QA system
  • SWOT analysis to find the Strengths, Weaknesses, Opportunities and Threats of existing processes and practices
  • The objectives and the requirements of the system have to be clear before the design of the Quality Assurance System
  • A broader societal view is required
  The question is not whether students are ready for the educational institutions and processes, but whether the institutions and the processes are ready for the students.

  18. ISO 9001:2000: Preparation and design
  • Plan: understanding of the requirements, assessment of the situation, proposals, assurance of commitment, action plan
  • Do – Check – Act: Deming's P-D-C-A cycle to follow up that the action plan is accomplished and that the educational institution has succeeded in achieving its objectives
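  As an illustrative sketch only, the P-D-C-A follow-up can be framed as a simple control loop. The four step functions are hypothetical placeholders supplied by the caller; nothing in this code is prescribed by ISO 9001:2000 itself.

```python
def pdca(plan, do, check, act, max_cycles=3):
    """Iterate Plan-Do-Check-Act until check() passes or cycles run out."""
    action_plan = plan()                   # Plan: proposals, commitment, action plan
    results = None
    for _ in range(max_cycles):
        results = do(action_plan)          # Do: carry out the action plan
        if check(results):                 # Check: objectives achieved?
            break
        action_plan = act(results)         # Act: revise the plan and iterate
    return results
```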

  19. ISO 9001:2000 Application
  • Policy and management assurance
  • Selection and training of the management representative and application leader
  • Internal review
  • Improvement of documentation
  • Selection of accreditation body
  • Preliminary visit
  • Preliminary assessment and preparation stage
  • Assessment
  • Accreditation
  • Maintenance

  20. The Balanced Scorecard (BSC) Methodology
  Four perspectives:
  • Financial
  • Customer (students)
  • Internal procedures (the internal procedures of the institution)
  • Learning and improvement (capacity for constant training, aiming at continuous improvement and competitiveness)
  Steps: the institution
  • sets goals & targets for each of the perspectives
  • collects the evidence to verify the performance and quality level (questionnaires, databases etc.)
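  The two steps (set goals & targets per perspective, collect evidence to verify them) suggest a scorecard keyed by perspective. A hypothetical sketch; the goals and evidence sources below are invented examples, not those of the pilot study.

```python
# Illustrative BSC skeleton: one entry per perspective, each with an
# invented goal and the evidence source that would verify it.
scorecard = {
    "Financial": {
        "goal": "Stay within the departmental budget",
        "evidence": "accounting database",
    },
    "Customer (students)": {
        "goal": "Raise student satisfaction above 4.0 / 5",
        "evidence": "student questionnaires",
    },
    "Internal procedures": {
        "goal": "Complete the self-assessment every academic year",
        "evidence": "QA group reports",
    },
    "Learning and improvement": {
        "goal": "Staff attend at least one training seminar per year",
        "evidence": "seminar attendance records",
    },
}

for perspective, entry in scorecard.items():
    print(f"{perspective}: {entry['goal']} (verified via {entry['evidence']})")
```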

  21. Lessons learnt from the pilot study
  The transformation from bad practices to survival and competitive success requires:
  • an institutional culture change
  • management commitment

  22. The situation today in the institution
  • Assessment is a fact
  • A QA committee has been created at institutional level
  • Nationwide questionnaires were distributed in Sept 2007 to all HE institutions in Greece, to be used for self-assessment
  • Every department is asked to create a QA group to take action (within their normal duties) on the new directives

  23. Further work in the Department of Informatics
  • Creation of the QA group
  • Discussion of the results from the previous self-assessment
  • Investigation of the coverage of key performance indicators by the new nationwide questionnaire
  • Decision on the questionnaire to use (in its current form or extended with additional questions)
  • Creation of an on-line version of the questionnaire
  • Carrying out the assessment within the academic year 2007-2008
  • Dissemination of results

  24. Thank You! Questions? Kerstin V. Siakas siaka@it.teithe.gr
