
IELTS and the Academic Reading Construct

IELTS and the Academic Reading Construct. Tony Green Cyril Weir Centre for Research in English Language Learning and Assessment. The researchers would like to acknowledge the support of the British Council in funding this study. Test validation from the user perspective.



Presentation Transcript


  1. IELTS and the Academic Reading Construct. Tony Green, Cyril Weir. Centre for Research in English Language Learning and Assessment. The researchers would like to acknowledge the support of the British Council in funding this study.

  2. Test validation from the user perspective. A CRELLA programme of research to explore how far the IELTS academic reading test reflects the reading practices of university students: • analysis of undergraduate texts vs IELTS academic reading texts • analysis of student tasks vs IELTS academic reading tasks • student reading processes vs IELTS academic reading test-taking processes CRELLA University of Bedfordshire

  3. Comparisons between IELTS and undergraduate reading. Weir et al. (2007) compared IELTS academic reading to student experiences, based on a survey of 1,000 UoB students. IELTS was said to under-represent: • expeditious reading skills (IELTS requires an average reading speed of only c. 60 wpm) • integration of information beyond the sentence level • information at the level of the whole text • information accessed across texts. The current study was intended to extend the self-report data to a larger sample of test takers in a variety of contexts.
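The c. 60 wpm figure can be sanity-checked with a back-of-envelope calculation. The word count below uses the typical ~800 words per Test Part quoted on a later slide; the split between reading and answering time is an illustrative assumption of ours, not a figure from the study.

```python
# Back-of-envelope check of the ~60 wpm average reading demand.
# The reading/answering split is an assumption, not study data.
words_per_text = 800   # typical Test Part text length (slide 4)
num_texts = 3          # IELTS academic reading has 3 parts
test_minutes = 60
answering_minutes = 20  # assumed time spent on answers rather than reading

reading_minutes = test_minutes - answering_minutes
total_words = words_per_text * num_texts
avg_wpm = total_words / reading_minutes
print(avg_wpm)  # 60.0 under these assumptions
```

A demand of roughly 60 wpm is slow enough to permit careful, linear reading throughout, which is why the test was said to under-represent expeditious (fast, selective) reading.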

  4. Instruments: IELTS academic reading test. IELTS academic reading has 3 parts; one Test Part has an input text of c. 800 (min 586, max 1036) words and 13 or 14 associated questions. Used 2 IELTS academic reading tests from C.U.P. Cambridge Practice Tests for IELTS: Volume 2 (released material that has passed through Cambridge ESOL test development procedures). These: • only employed currently approved Q types (see www.ielts.org) • required both explicit and implicit information sources • were judged to encourage both expeditious and careful reading types • contained texts well within typical IELTS ranges for readability, vocabulary range and syntactic complexity

  5. Instruments: Retrospection form. Groups of students were administered one Test Part (20 minutes). Test Part = 1 text + up to 4 Sections of different Q types = 13/14 Qs. Followed by a retrospection form eliciting: • Background information (age, gender, L1, nationality, previous IELTS, uni. subject) • Text preview – did test takers read the text before looking at the questions? • Strategies for responding – how did test takers go about looking for the answers? • Information base for the response – where did the test takers find the information they needed to answer the questions?
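The four elements elicited by the retrospection form can be sketched as one record per test taker. Field names and coding values below are assumptions for illustration; the actual form and its coding scheme are not reproduced in this presentation.

```python
from dataclasses import dataclass, field

# Hypothetical record for one retrospection-form response.
# Field names and category labels are our assumptions.
@dataclass
class RetrospectionResponse:
    age: int
    gender: str
    first_language: str
    nationality: str
    previous_ielts: bool
    subject: str
    text_preview: str                      # e.g. "careful", "selective", "none"
    strategies: list = field(default_factory=list)        # e.g. ["ST2", "ST10"]
    information_base: list = field(default_factory=list)  # e.g. ["L1", "L2"]

r = RetrospectionResponse(22, "F", "Chinese", "China", False, "Business",
                          "selective", ["ST2", "ST10"], ["L2"])
print(r.text_preview)  # selective
```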

  6. Participants: Background and score levels. 352 participants; 40-74 participants per Test Part; 16 languages (79% L1 Chinese, 4% Arabic, 4% Thai); 59% female; median age 22. Divided into 3 broad score levels, loosely interpreted (based on equivalences suggested at www.ielts.org) as representing: 0-5 points c. IELTS 5.5 or below; 6-8 points c. IELTS 6.0; 9+ points c. IELTS 6.5 or above
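The three-band split can be sketched as a simple mapping from a Test Part raw score (number correct out of 13-14 items, on our reading of the slide) to the broad IELTS equivalence:

```python
# Sketch of the slide's three broad score levels, assuming the raw
# score is the number of correct answers on one 13/14-item Test Part.
def score_band(raw_score: int) -> str:
    """Map a Test Part raw score to the slide's broad IELTS band."""
    if raw_score <= 5:
        return "c. IELTS 5.5 or below"
    elif raw_score <= 8:
        return "c. IELTS 6.0"
    else:
        return "c. IELTS 6.5 or above"

print(score_band(4))   # c. IELTS 5.5 or below
print(score_band(7))   # c. IELTS 6.0
print(score_band(11))  # c. IELTS 6.5 or above
```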

  7. Text Preview

  8. Text Preview • Over half of all test takers report quickly and selectively previewing the text • The highest scoring test takers were least likely to preview the text • The lowest scoring were most likely to preview slowly and carefully. (Chart legend: 1 = slowly, carefully; 2 = quickly, selectively; 3 = no preview)

  9. Response strategies

  10. Response strategies: Most and least popular strategies. 83% use ST2: quickly match words that appeared in the question with similar or related words in the text; 77% use ST10: read relevant parts of the text again; 76% use ST3: look for parts of the text that the writer indicates to be important; 8% use ST8: use my knowledge of grammar

  11. Response strategies: Differences by level. ANOVA reveals differences in strategy use by level. Used more often by higher scoring test takers: ST2 quickly match words that appeared in the question with similar or related words in the text; ST10 read relevant parts of the text again. Used more often by lower scoring learners: ST5 work out the meaning of a difficult word in the question
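The kind of comparison reported here can be illustrated with a one-way ANOVA over the three score levels. The frequency ratings below are invented for illustration, not the study's data:

```python
from statistics import mean

def one_way_anova_f(*groups):
    """One-way ANOVA F statistic: between-group variance over within-group variance."""
    grand = mean(x for g in groups for x in g)
    k = len(groups)                       # number of groups
    n = sum(len(g) for g in groups)       # total observations
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Invented 1-5 frequency ratings for one strategy, by score level
low    = [2, 3, 2, 4, 3, 2]   # 0-5 points
middle = [3, 4, 3, 4, 4, 3]   # 6-8 points
high   = [4, 5, 4, 5, 4, 5]   # 9+ points

print(round(one_way_anova_f(low, middle, high), 2))  # 11.97
```

With these invented numbers F is well above the conventional 5% critical value for df (2, 15), which is the pattern the slide reports for ST2 and ST10: reported use rises with score level.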

  12. Response strategies: Patterns by item type (Test Section). Example: ST3 look for parts of the text that the writer indicates to be important; ST4 read key parts of the text such as the introduction and conclusion. Both associated with higher scores on the following item set: Choose the most suitable heading for paragraphs A-G from the list of headings below. i Common objections ii Who's planning what iii This type sells best in the shops iv The figures say it all v Early trials vi They can't get in without these vii How does it work? viii Fighting corruption ix Systems to avoid x Accepting the inevitable

  13. Location of necessary information: L1 within a single sentence; L2 by putting information together across sentences; L3 by understanding how information in the whole text fits together; L4 without reading the text; L5 could not answer the question

  14. Location of necessary information: Test E

  15. Location of necessary information: Test F

  16. Conclusions • Response strategies cannot be assumed from item type or predicted with sufficient accuracy via expert judgement • Protocol forms are potentially of great value in routine piloting: they can highlight issues with particular items as part of the item QA process (e.g. ‘guessability’) and can help to confirm that the required range of reading skills is addressed in every test form • IELTS test takers do locate necessary information across sentences, but whole-text level understanding is not always required • Test takers use more expeditious reading strategies than predicted from Weir et al. (2007), but few items require these
