
Using qualitative data to prove and improve quality in Australian higher education
Geoff Scott, Leonid Grebennikov and Mahsood Shah
Office of Planning and Quality, University of Western Sydney

October 2008

Outline
  • Introduction
    • Limited use of qualitative data in institutional performance assessment
    • Advantages and benefits of using qualitative data
    • UWS experience in the systematic analysis of the qualitative data from student feedback surveys
  • Method
    • CEQuery qualitative analysis tool
    • Comparative analysis of qualitative data generated by three key UWS student surveys
  • Results and discussion
  • Implications
Qualitative data in institutional performance assessment
  • Receive limited attention
  • Cover aspects of the student experience untapped by existing evaluations
  • Identify reasons behind statistical results, which may differ from what researchers assume
  • Define, in students’ own words, what they find important
  • Should complement quantitative data
UWS experience in the systematic analysis of the qualitative data from student feedback surveys
  • Since 2006, all UWS student surveys (covering overall experience at the University level, a particular course or program, or specific subjects) have invited respondents to answer two questions in their own words:
    • What were the best aspects of their course/unit?
    • What aspects of their course/unit are most in need of improvement?
UWS experience in the systematic analysis of the qualitative data from student feedback surveys
  • Written comments are automatically classified by the CEQuery qualitative analysis tool into five domains and 26 subdomains using a custom-tailored dictionary.
  • CEQuery results are integrated into Annual Course and Unit Reports in order to better identify key ‘hot spots’ for improvement and actual solutions from the student perspective.
  • Actual comments can be viewed once sorted into specific CEQuery domains and subdomains.
  • High importance areas are used in course accreditation and review, and to validate rating items on surveys.
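The classification step described above can be illustrated with a much-simplified sketch. CEQuery is a purpose-built tool with its own custom dictionary; the Python below is a hypothetical dictionary-based classifier written only to show the mechanism, and the `SUBDOMAIN_KEYWORDS` entries are invented examples, not CEQuery's actual dictionary.

```python
# Hypothetical sketch of dictionary-based comment classification,
# in the spirit of CEQuery. The keyword lists below are illustrative
# only; CEQuery's real dictionary is far larger and custom-tailored.

SUBDOMAIN_KEYWORDS = {
    ("Assessment", "Feedback"): ["feedback", "returned", "marker comments"],
    ("Assessment", "Expectations"): ["criteria", "expectations", "guidelines"],
    ("Staff", "Accessibility"): ["available", "office hours", "contact"],
    ("Support", "Library"): ["library", "borrow"],
}

def classify(comment: str) -> list[tuple[str, str]]:
    """Return every (domain, subdomain) whose keywords appear in the comment."""
    text = comment.lower()
    return [
        key
        for key, words in SUBDOMAIN_KEYWORDS.items()
        if any(word in text for word in words)
    ]

# A single comment can 'hit' more than one subdomain:
hits = classify("Tutors should give more detailed feedback on marking criteria")
```

In this sketch a comment that mentions both feedback and marking criteria is counted under both subdomains, which matches the idea that comments are sorted into domains and then subdomains rather than forced into a single category.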
Comparative analysis of qualitative data from three key student surveys
  • Survey 1: Covers total university experience; sample – 3,492 current students; 9,410 written comments
  • Survey 2: The national CEQ covers graduate experience of the course just completed; sample – 2,734 respondents; 4,213 written comments
  • Survey 3: Evaluates individual subjects each time they are offered; sample – about 200,000 students each year; 94,803 written comments
About CEQuery
  • Best Aspect (BA) and Needs Improvement (NI) ‘hits’ are coded and sorted into domains then subdomains.
  • 5 domains – Assessment, Course Design, Outcomes, Staff, and Support, and 26 subdomains
  • Hit rate (comments allocated to a subdomain): 80%; allocation accuracy: 90%
  • BA + NI = Importance
  • BA / NI = Quality
  • Custom-tailored dictionary
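The two indices above are simple arithmetic over hit counts, and a small sketch makes the "hot spot" logic concrete. Only the formulas (Importance = BA + NI, Quality = BA / NI) come from the method described here; the counts and the subdomain in the example are made up for demonstration.

```python
# Illustrative computation of the two CEQuery indices from hit counts.
# The numbers below are invented; only the formulas come from the method.

def importance(ba_hits: int, ni_hits: int) -> int:
    """Total hits: how much students talk about a subdomain."""
    return ba_hits + ni_hits

def quality(ba_hits: int, ni_hits: int) -> float:
    """Ratio of praise to criticism; above 1 means mostly positive."""
    return ba_hits / ni_hits if ni_hits else float("inf")

# A hypothetical subdomain with many hits (high importance) but a low
# BA/NI ratio is flagged as an improvement 'hot spot'.
ba, ni = 120, 300
print(importance(ba, ni))  # 420
print(quality(ba, ni))     # 0.4
```

A subdomain scoring like this example (high importance, quality well below 1) is exactly the kind of "high-hit, low BA/NI" area singled out later in the presentation.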
CEQuery subdomains
  • Assessment
    • Expectations
    • Feedback
    • Marking
    • Relevance
    • Standards
CEQuery subdomains

Assessment: Expectations

Provision of clear assessment tasks and expectations on how to tackle and present them; clear submission deadlines, guidelines, rules and grading criteria. Provision of examples of work, to give an operational picture of the different grades and quality of work in each subject.

Typical NI comments

Expectations for assignments need to be clearer

Lack of clear criteria for marking

More explanations than just expecting us to know or guess

Better description of tasks

CEQuery subdomains

Assessment: Feedback

Promptness with which assignments are returned; use of staged deadlines; and quality of the feedback received, including the extent to which markers comment on what was done well, explicitly identify key areas for improvement, and say how improvements could have been achieved, with specific attention to the grading criteria distributed at the start of the subject.

Typical NI comments

I’m still trying to get back an assignment over 5 months old

When returning essays tutors should give more detailed feedback so students know exactly how to improve work

We only received one assessment back before the subject finished

CEQuery subdomains
  • Course Design
    • Flexibility
    • Learning methods
    • Practice-theory links
    • Relevance
    • Structure
  • Outcomes
    • Further learning
    • Intellectual
    • Interpersonal
    • Personal
    • Knowledge/skills
    • Work application
CEQuery subdomains
  • Staff
    • Accessibility
    • Practical experience
    • Quality & attitude
    • Teaching skills
  • Support
    • Infrastructure
    • Learning resources
    • Library
    • Social affinity
    • Student administration
    • Student services
More information on CEQuery

Scott, G. (2006). Accessing the student voice: Using CEQuery to identify what retains students and promotes engagement in productive learning in Australian higher education.

http://www.dest.gov.au/sectors/higher_education/publications_resources/profiles/access_student_voice.htm

The CEQuery study of comments from students in 14 universities: key implications for student retention and engagement
  • It is the total experience that counts.
  • Teaching is not learning.
  • Learning is a profoundly social experience.
  • More research is needed on how various forms of IT-enabled learning do and do not add value as part of a broader learning design.
  • Some 60 learning methods are valued, especially active and practice-oriented ones, depending on field of education (FOE) and level of study.
  • Traditional lectures and class-based methods must be seen as just one of the options, not the sole one.
Discussion
  • Why do very important CEQuery subdomains demonstrate patchy results in terms of quality?
    • The variety of factors shaping the student experience
    • The extent of multi-campus university operation

Six areas of student experience that warrant an improvement focus
(‘High-hit’ CEQuery subdomains with low BA / NI ratios)
  • Assessment (standards, marking, expectations management and feedback)
  • Student Administration
  • Course Structure
  • Staff: Quality and Attitude (at the overall university level)
  • Student Support: Infrastructure (course and subject level)
  • Student Support: Learning Resources (course level)
UWS improvement actions based on quantitative and qualitative data (analysed via CEQuery) from student feedback surveys
  • Introduction of online enrolment
  • Implementation of the online complaints resolution system
  • New assessment policy
  • Introduction of assessment focused self-teaching guides for each subject
  • A range of new, targeted transition support programs
  • A number of new, free study assistance workshops and programs
  • Use of the more interactive version of the online learning system
  • More opportunities for practice-based learning, e.g., through increased engagement with regional employers and industry bodies
  • Results: CEQ Overall Satisfaction (OS) improved by 10%, and retention by 4%, over three years
In summary, the systematic analysis of qualitative data helps:
  • Generate a more focused and evidence-based set of ‘good practice’ guidelines and areas for quality improvement down to the course and unit level
  • Ensure that course and subject design focus on what counts for students, as courses and units are implemented and reviewed
  • Inform what is and is not tracked in quantitative surveys, and validate the items in those surveys to ensure they cover what is really important to students
  • Assist in making staff development programs more relevant by providing BA and NI comments regarding each course and unit to relevant teaching and administrative staff
  • Complement the quantitative data that are typically used to inform decision-making for the area