
The International Student Experience


Presentation Transcript


    1. The International Student Experience. Background; Exit Survey Results; Transition to University from Year 12; English language abilities at admission and academic progress for postgraduate coursework students.

    3. Background Students by country of origin

    4. Background Students from over 80 countries Rapid growth in students undertaking postgraduate coursework programs We hope to see this expand with the return of AusAID scholarship holders

    5. Exit Survey Results 2003 and 2004 Reasons for selecting the University of Adelaide Perceptions of the University Opinion of the support services provided Plans post completion Recommendations for change or improvement

    6. Exit Survey Results 2003 and 2004 186 respondents in 2003 207 respondents in 2004 Mix of undergraduate, study abroad, exchange, postgraduate coursework and research students from over 20 different countries of origin completed the survey.

    7. Reasons for selecting the University

    8. Main Findings - Perceptions Perceptions pre and post study. Students were asked to rate their perceptions on 9 separate attributes, using a 3-point rating scale (good, average, poor): quality of teaching, quality of research education, quality of program, profile of academic staff, graduate employment prospects, prestige, student focus, impressive alumni, and location. Significant differences in student perception prior to and at completion of study in 2003 were found for profile of staff, student focus and location (all showed positive change). In 2004, quality of program and location showed positive change.

    9. Main Findings – Support Services 95% rated their knowledge of support services as average or good. (96% in 2004) 87% highlighted that they felt comfortable accessing support services when required. (89% in 2004) 87% indicated that they felt comfortable seeking academic advice from school staff. (87% in 2004) 75% participated in the International Student Orientation Program. (74% in 2004)

    10. Main Findings – Support Services 31% attended academic skills or personal development workshops. (38% in 2004) 14% rated computer facilities as poor. (10.6% in 2004; outranked by food outlets, which 17% rated as poor in 2004) In 2003, 85% indicated that they would recommend the University to others (9.4% said they would not). In 2004, 92.8% indicated that they would recommend the University to others.

    11. Circumstances impacting on academic performance

    12. Main Findings – Post Completion 52.8% indicated that they would consider further studies at the University. (64.3% in 2004) In 2003, 41.1% indicated that they would not undertake further studies; 7.8% indicated that their reasons were a result of dissatisfaction with the University. In 2004, 32.4% indicated that they would not undertake further studies; 7.2% indicated that their reasons were a result of dissatisfaction with the University.

    13. Main Findings – Recommendations for change (free-field comments). Computing facilities – increase internet and printing quotas (2003 and 2004) Student accommodation (2003 and 2004) Simplified enrolment processes (2003 only) AUU to provide more student activities and opportunities for increased interaction with local students (2003 and 2004)

    14. Main Findings Increase library opening hours (2003 and 2004) Access to student cards on arrival (2003 only) Availability of walk-in appointments at the Medical Centre (2003 and 2004) Need to raise the University profile off-shore (2004 only) Surprisingly few suggested lowering fees and increasing scholarships (2003 and 2004) In 2004, 27 respondents indicated appreciation for the support received.

    15. Transition to University from Year 12 125 international students started in 2004 (via SATAC) Individual Interviews Focus Groups – 40 students attended Challenges faced by students – student perceptions and staff perceptions Our challenges in engaging the students.

    16. Transition to University from Year 12 Semester 1, 2004 results: 40.5% of the students passed the entire course load; 16.5% failed one course, 20.0% failed two courses, 11.5% failed three courses, and 11.5% failed all courses they were enrolled in.

    17. Challenges – as told by students They worked hard in high school and took a break at uni, as relief from school studies. Students think they are familiar with the education system (overconfident). Need to highlight the academic differences between school and uni. Change the perception that orientation is about academic preparation, not just living in Australia.

    18. Challenges – as told by students Students identified the major differences between school and university: 21 students commented on the freedom, independence and need to manage their time. 11 commented on the difference in the academic environment, “it is a do-it-by-yourself effort at uni”. 7 commented on the lack of guidance compared with school, “in school the teachers are there to help us; the teacher pays more attention to the individual at school; there is no teacher to focus only on you anymore”.

    19. Challenges – staff perceptions Being an independent learner How to use non-contact time effectively Poor match between subjects undertaken at Year 12 and course selection at university Accommodation – many move out of supported arrangements to live independently with friends

    20. Challenges – staff perceptions Knowing when and how to seek help Problem solving (head in the sand) Interacting within the university community Freedom vs family expectations Developmental issues (as with domestic students)

    21. Challenges – engaging students Students do not see the need to attend orientation Many perceive that orientation is about living in Adelaide It competes with ‘home time’ (return flight bookings are often made well in advance, with the first day of lectures used as the booking date) Students follow poor advice

    22. Challenges – engaging students Services are designed to engage students on arrival, but students often feel confident that if they just turn up on the first day of lectures everything will be okay. We have tried (and are still trying) to engage this group after orientation with workshops and lunch-time events. This semester we ran workshops both in the mid-term break and in the first week back after the break (with free lunch). 130 invitations were sent direct to students; 7 attended.

    23. Back to the Research Of the 125 who commenced in 2004, 64 (51.2%) are enrolled in 2006. Often the first time we meet face-to-face with students who come to the University direct from school is after a semester or two of poor results.

    24. English language abilities at admission and academic progress for postgraduate coursework students

    25. Context English language entry requirements remained unchanged Changing programs, increasing numbers of PGCW students Creation of alternate entry methods Culture of attributing failure to English proficiency levels

    26. Grant A small research grant was applied for and awarded The original study looks at wider areas, including research students

    27. Research Aims The research aims of the study were to investigate the academic performance and outcomes for international postgraduate coursework (IPC) students with reference to their English language entry type and proficiency at admission. (This would then help inform policy.)

    28. Research Aims Six English language entry types were used in this study. They can be broadly conceptualised into two distinct groups.

    29. Standardised measures based on testing Type 1. IELTS – the International English Language Testing System Type 2. TOEFL – the Test of English as a Foreign Language. It is important to note that TOEFL scores have been converted to an overall IELTS equivalency score for the purpose of this study. Type 3. IELTS or TOEFL plus an English Language Program (ELP) – entry with a conditional requirement to undertake and pass a specialised English program, such as a Pre-enrolment English Program.

    30. English competence assumed Type 4. English is first language Type 5. 2/5 English – at least two of the student’s past five years of study were conducted in the English language. Type 6. English medium – the student has studied in the English language medium for at least five years in Australia, New Zealand, the United Kingdom, Canada, the United States of America, South Africa or the Republic of Ireland

    31. Research Questions (1) What is the relationship between English language entry type (IELTS, TOEFL, English medium, 2/5 study, English first language speakers) of IPC students and their academic performance at university as measured by GPA? This is the whole cohort.

    32. Research Questions (2) What is the relationship between English language proficiency (as measured by IELTS or IELTS-equivalent scores) of IPC students and their academic performance at university as measured by GPA? This is IELTS / TOEFL students only.

    33. Research Questions (3) What are the advantages, in terms of academic outcomes as measured by GPA, in raising the current minimum IELTS overall entry score of 6.0 for international postgraduate coursework programs? The first two questions are intended to provide information that can assist in answering this final research question.

    34. DATA COHORT To establish the data pool for IPC students the following cohorts were used: 2001: intakes from Semesters 1 and 2 and Trimesters 1, 2 and 3 2002: intakes from Semesters 1 and 2 and Trimesters 1, 2 and 3 2003: intakes from Semester 1 and Trimester 1

    35. DATA COHORT The data comprised records for 331 postgraduate coursework students. Sex: Male = 171 (51.7%), Female = 160 (48.3%) Age range: 21.1 to 53.5 years, with a mean of 28.8

    36. DATA COHORT Overall the cohort came from 46 countries with the top ten being: China (18.8%); Malaysia (10.1%); India (8.1%); United States (8.1%); Japan (7.5%); Indonesia (5.4%); Singapore (3.6%); Thailand (3.6%); United Kingdom (3.6%); Hong Kong (3.3%)

    37. Research Q1

    38. Research Q1 As at June 2005, 84% of the IPC cohort had successfully finished their programs, with 8% of IPC students continuing their study and another 8% having withdrawn or failed. DEST (2004) data on attrition rates provides a useful comparison. Attrition rates for the University used in the study are reported for both domestic and international students. An example given by DEST (2004, p. 2) helps define the use of the term: students who were enrolled in an award course in 2002 were not enrolled at that University in an award course in 2003, and their non-enrolment was not due to the completion of their course in 2002. This matches the definition of non-completion used in this research. The DEST reported data for 2001 and 2002, combined, states that 22.3% of domestic postgraduate students did not complete their programs.

    39. Research Q1 There are some limitations when interpreting this data. It is problematic to compare the mean GPA scores for the different study periods as each study period after the first is represented by a smaller group. In addition, no account has been taken here of programs of different lengths, so that a student with a GPA in study period 5, for example, may be studying a long program, or they may be taking longer for a number of other reasons.

    40. Research Q1 As can be seen from Table 9, there were differences between students depending on entry type. Students whose entry was based on the English Medium entry type had an average overall GPA of 5.8, compared with average overall GPA scores of 4.4 for those who entered through the ELP program, 4.8 for TOEFL and 5.0 for IELTS. It is useful to collapse the six English language entry types used in this study into the two distinct groups. The first grouping (types 1-3) represents standardised measures of English language proficiency based on testing, and the second grouping (types 4-6) represents different measures of linguistic and educational history in English from which a level of English competence or proficiency is assumed. The IPC students admitted on the basis of testing had an overall mean GPA of 4.83. The IPC students admitted on the basis of linguistic and educational history in English had an overall mean GPA of 5.17, which is 0.34 of a GPA point higher.

    41. Research Q1 Table 10 demonstrates that, while few students failed, they were more likely to fail if they were from the IELTS / TOEFL groups. Caution must be used when generalising these results, as the number of failing students is very small overall. Significant differences in GPA between the entry type groups in study periods 1 and 2 require further exploration. If a student has not been admitted via a standardised test, then the English language entry type distinguishes, to an extent, international students’ English language ability when they begin study, but does not measure it directly. It might be expected that GPA scores of the different groups would become more similar with study experience in Australia, and this may be part of the reason why the differences among the language entry groups after study period 2 were not significant. But the significant differences in GPA between the entry type groups in study periods 1 and 2 suggest that something is happening that may be beneficial. The research does not investigate what post-enrolment support the IELTS and TOEFL groups receive; it may be academic support or social support. The higher failure rate among IELTS / TOEFL entry groups (using overall GPA) also suggests that English language proficiency may be an important factor for these groups during their study in Australia. It would be interesting to retest students and see what results IELTS tests would produce after their first and second study periods.

    42. Research Q1 - Findings Among the coursework student cohort most students (93.5%) achieved an overall GPA of pass or higher. While few failed, they were more likely to do so if they were from the IELTS / TOEFL entry types. In the first two study periods, the mean GPA scores for coursework students whose first language was English were significantly higher than those for students who had entered via the IELTS or TOEFL path.

    43. Research Q2 Because of the changing student demographics we expect (and are currently experiencing) more students to enter using standardised measures based on testing

    44. Research Q2 The IELTS test provides individual subtest (band) scores for reading, writing, speaking and listening, as well as an overall score. For IPC students, scores for the four subtests were available for 89 students (23.3%). Overall IELTS scores for the IELTS students, and equivalent scores for TOEFL-admitted students (calculated from the TOEFL scores), were available for 152 coursework students. A conversion table enabled TOEFL scores to be expressed in terms of IELTS equivalency, increasing the sample size in the analysis to 152 students with an overall IELTS score. There is no equivalent conversion for subtest scores between IELTS and TOEFL.
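The table-lookup conversion described above can be sketched as a small function. The band boundaries below are illustrative placeholders only, not the conversion table the study actually used (which is not reproduced in this transcript); the idea is simply that each range of TOEFL scores maps to one IELTS-equivalent overall band.

```python
from bisect import bisect_right

# Hypothetical conversion table (illustrative only; NOT the study's table).
# Each entry maps the lowest TOEFL score of a band to an IELTS-equivalent
# overall score, so a lookup finds the highest band the score reaches.
TOEFL_TO_IELTS = [
    (523, 5.5),
    (550, 6.0),
    (587, 6.5),
    (600, 7.0),
    (627, 7.5),
]

def ielts_equivalent(toefl_score: int) -> float:
    """Return the IELTS-equivalent overall band for a TOEFL score."""
    thresholds = [t for t, _ in TOEFL_TO_IELTS]
    i = bisect_right(thresholds, toefl_score)
    if i == 0:
        return 5.0  # below the lowest band in this illustrative table
    return TOEFL_TO_IELTS[i - 1][1]

print(ielts_equivalent(560))  # falls in the 550-586 band -> 6.0
```

Only the overall score can be converted this way; as the slide notes, there is no subtest-level equivalence, which is why the subtest analysis was limited to the 89 IELTS-tested students.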

    45. Research Q2

    46. Research Q2

    47. Research Q2 While none of the differences were statistically significant and the numbers of non-successes were small, this finding suggests that IELTS / TOEFL-admitted students with entry scores below 7.5 may need extra support. If we compare Table 13 with Table 11 we can make the following observations. Students entering with an IELTS score of 6.0 had a rate of successful completion of 75.0%, compared with an overall completion rate for the whole cohort of 85.4%. It could be predicted from our data, using the current coursework cohort as a guide, that an IELTS entry score of 6.5 or above could raise the successful completion rate of students entering with an IELTS test to 86.7%, which is 1.2% higher than the current overall completion rate. This assumes that all students with IELTS scores below 6.5 would be excluded, which is not the case with the current cut-off score of 6.0. A more realistic suggestion from the data might be that students with lower English language proficiency scores at entry need more support than those with higher scores. The figures in Table 13 are also instructive when considered in a different way. At entry scores of 6.0 and below, 73.5% of students were successful, and at entry scores of 6.5 or higher, 13.3% of students were not successful. Overall, 40% of non-completions were among entry scores of 6.5 and higher. Hence, raising the entry score above 6.0 might eliminate more potentially successful students than potential failures, while retaining some that might not be successful. Clearly there is much more to success and failure than the English language proficiency scores with which students enter the University. Overall, the IELTS categories represent only about half of the IPC students in this study, and the outcomes of students entering via other pathways have not been considered in this discussion. Many variables other than entry score will impact on the success or otherwise of international students. For example, a changing country of origin for the majority of the University’s international student cohort will alone have an impact, because the student population represented in this study will not be the same as that in the future.

    48. Research Q3 This research shows, however, that if the IELTS or equivalent entry score were raised to 6.5 overall, there would be little gain in GPAs. On these figures, raising the entry score for these groups of students could improve their grade point averages by 0.2. Put another way, leaving the entry score at 6.0 would result in similar grade point averages.

    49. Research Q3 Raising the cut-off score from 6.0 to 6.5 would require alternative pathways for or deny entry to 46% of students among the current IPC student cohort who entered the University via a language test path (n=152). Hence raising the entry score above 6.0 might eliminate more potentially successful students than potential failures, while retaining some that might not be successful.

    50. Clearly there is much more to success and failure than the English language proficiency scores with which students enter the University.
