
Integrated Skills Assessment







  1. Integrated Skills Assessment BALEAP 2019 Conference, Leeds, 14th April 2019 Peter Davidson, Zayed University, Dubai

  2. Introduction • EAP teachers have begun to explore the use of integrated skills tests (Cumming, 2013) • However, many teachers are unsure of which skills to integrate, and they lack the confidence to write and implement integrated skills tests. • For many EAP teachers, an integrated skills test means a reading-into-writing task (Weigle, 2004) • But can it be more than that?

  3. Outline • What’s wrong with an impromptu EAP writing task? • What is an integrated-skills test? • Why use an integrated-skills test? • Cognitive processes of a writing task • Examples from testing bodies • Examples from institutions • Example from Zayed University • Challenges of using an integrated-skills test • Overcoming these challenges

  4. 1. What’s wrong with an impromptu EAP writing task? • not implemented at university • lacks the complexity of real university-level writing • lacks cognitive validity, i.e. the extent to which the writing task resembles a real-world academic writing event (Shaw & Weir, 2007) • lacks context validity (Weir, 2005) related to the performance conditions: • purpose • time available • length • specified addressee • marking criteria • linguistic demands

  5. 1. What’s wrong with an impromptu EAP writing task? • elicits ‘knowledge telling’ rather than ‘knowledge transforming’ • Moore & Morton (2005) found differences between IELTS Writing Task 2 and university essays. The university essay requires: • students to transform information from multiple sources • description and summarization • Weir (2016: 6) argues that “reading-into-writing summary task types … represent closely the cognitive processing and knowledge-base requirements for real-life writing activities beyond the test.”

  6. 2. What is an integrated-skills test? • reading → writing • listening → speaking • listening & reading → writing • listening & reading → writing & speaking • listening & reading & research → writing & speaking

  7. 3. Why use an integrated-skills test? • more authentic (the extent to which the test task replicates a real-life task in the target situation – Davidson, 2009) • has context validity • has cognitive validity • knowledge transforming rather than knowledge telling • focuses on the genre of writing that students actually write e.g. position essay, case study, report, reflection

  8. 3. Why use an integrated-skills test? • better representation of what communication is: With its communicative language activities and strategies, the CEFR replaces the traditional model of the four skills (listening, speaking, reading, writing), which has increasingly proved inadequate to capture the complex reality of communication. Moreover, organisation by the four skills does not lend itself to any consideration of purpose or macro-function. The organisation proposed by the CEFR is closer to real-life language use, which is grounded in interaction in which meaning is co-constructed. Activities are presented under four modes of communication: reception, production, interaction and mediation (CEFR Companion Volume, 2018: 30).

  9. 3. Why use an integrated-skills test? • better reflects what and how we teach • improves construct validity • broadens the testing construct • has positive consequential validity • has positive washback • aligns with Learning-Oriented Assessment • has better predictive validity

  10. 4. Cognitive processes of a writing task Cognitive processes involved in the design of an integrated writing task (Weir, 2014: 8): • task representation • macro-planning • reading source text • selecting • connecting • organizing • micro-planning • translating • monitoring and revising

  11. 5. Examples from testing bodies • TEAP (Test of English for Academic Purposes): Associated Examining Board (Weir, 1983) • IELTS, Writing Task 1: Cambridge English • TOEFL iBT: ETS • PTE Academic: Pearson • ISE (Integrated Skills in English) Test: Trinity College London • TEAP (2016): Eiken Foundation and Sophia University

  12. 6. Examples from institutions • TEEP (Test of English for Educational Purposes): University of Reading • BART-W (Bedfordshire Academic Reading-into-Writing Test): University of Bedfordshire • Sheffield University: integrated task-based assessment • Sheffield Hallam University: Reading into Writing • Queen Mary University of London: 2,000-word essay/report based on sources • Arts University Bournemouth: considering it • Zayed University: 4 integrated tasks

  13. 7. Example from Zayed University (2000) Listening to, and retelling, a lecture • listen to a 15-17 minute lecture and take notes • write a ‘retell’ of the lecture based on these notes Reading into Writing • read 3-4 short semi-academic texts and take notes • write a 500-600 word essay based on these notes

  14. 7. Example from Zayed University (2000) Academic Discussion • 10-12 minute academic discussion • in a group • on a topic that has previously been studied Research Paper • 1200-1500 word research paper • supporting an argument through independent research • defend this research in a 10-minute oral defense

  15. 8. Challenges of using an integrated-skills test • tradition • resistance • students' expectations • low face validity • UKVI 4 skills profile • reporting of results • diagnosing why a student has failed • reevaluation of testing constructs

  16. 8. Challenges of using an integrated-skills test • practicality • time-consuming • varied environmental test conditions • reliability • subjective rating / human raters • standardization • criteria / rubric • plagiarism / cheating • need for training

  17. 9. Overcoming these challenges • make students, teachers, and administrators aware of the benefits • broaden your testing constructs • if you must have separate listening and reading tests, integrate writing and speaking • ensure that test specifications are based directly on the curriculum specifications • have clear, unambiguous rating criteria (Chan, Inoue, & Taylor, 2015) • have detailed, descriptive reporting procedures

  18. 9. Overcoming these challenges • have stringent rating procedures (standardisation sessions, double marking, blind marking, monitoring of raters' performance), as in the sketch below • have a small pool of raters • consider using automated essay scoring • check for plagiarism (SafeAssign, Turnitin) • write your writing prompt / speaking prompt first, then find source texts • adapt source texts if necessary • train teachers in the new approach
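The rating-procedure bullets above can be made concrete with a small amount of tooling. Below is a minimal Python sketch of a double-marking workflow: two blind ratings per script are averaged when they agree closely, and wider discrepancies are flagged for third marking. The 0-9 band scale, the one-band discrepancy threshold, and all scores here are hypothetical illustrations, not part of the talk.

```python
# Minimal double-marking sketch. The band scale, the one-band
# discrepancy threshold, and all scores are hypothetical.

MAX_DISCREPANCY = 1.0  # more than one band apart: send to a third rater

def resolve(rater_a: float, rater_b: float) -> tuple[float | None, bool]:
    """Return (final_score, needs_third_marking)."""
    if abs(rater_a - rater_b) <= MAX_DISCREPANCY:
        return (rater_a + rater_b) / 2, False  # close agreement: average
    return None, True  # discrepant: route the script to a third rater

# Scripts blind double-marked by two raters (invented scores).
scripts = {"S001": (6.0, 6.5), "S002": (5.0, 7.5)}
for sid, (a, b) in scripts.items():
    score, needs_third = resolve(a, b)
    print(sid, "-> third marking" if needs_third else f"-> final {score}")
```

Logging which scripts go to third marking also yields the rater-monitoring data the slide mentions: raters who are repeatedly discrepant can be identified and re-standardised.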

  19. 9. Overcoming these challenges • weight writing more (a worked calculation follows below), e.g. Stephen Hughes (Sheffield Hallam University): • Writing (Reading into Writing sit-down assessment): 40% • Speaking (group seminar): 40% • Listening: 10% • Reading: 10% e.g. William Tweddle (Queen Mary University of London): • Writing 50% (reading-into-writing) • Presentation/Seminar Leadership 20% • Reading 15% • Listening 15%
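As a worked illustration of the weighting idea, this short Python sketch computes an overall grade under the Sheffield Hallam-style split quoted above (writing 40%, speaking 40%, listening 10%, reading 10%); the component scores themselves are invented.

```python
# Weighted overall grade using the 40/40/10/10 split quoted above.
# Component scores (out of 100) are invented for illustration.

weights = {"writing": 0.40, "speaking": 0.40, "listening": 0.10, "reading": 0.10}
scores = {"writing": 62.0, "speaking": 70.0, "listening": 55.0, "reading": 58.0}

assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must sum to 100%

overall = sum(weights[c] * scores[c] for c in weights)
print(f"Overall: {overall:.1f}/100")  # 0.4*62 + 0.4*70 + 0.1*55 + 0.1*58 = 64.1
```

Because writing and speaking dominate the weighting, integrated performance drives the overall result, while the separate listening and reading components still supply the 4-skills profile that bodies such as UKVI require.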

  20. 9. Overcoming these challenges • make "use of source texts" part of the assessment criteria (encoded as a rubric in the sketch below), e.g.: • Uses a sufficient selection of source materials to support ideas • Refers to relevant parts of source texts to support ideas • Synthesizes information from multiple source texts • Identifies similar information from different source texts (and cites appropriately) • Paraphrases information from the source texts • Uses direct quotations from the source texts appropriately • Follows in-text citation conventions
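One way to make "use of source texts" reliably rateable is to encode the descriptors above directly as a rubric data structure that raters score criterion by criterion. The Python sketch below is a hypothetical illustration: only the criteria come from the slide; the 0-4 band scale and the example bands are assumptions.

```python
# "Use of source texts" descriptors from the slide, encoded as a rubric.
# Each criterion is rated on a hypothetical 0-4 band scale.

SOURCE_USE_CRITERIA = [
    "Uses a sufficient selection of source materials to support ideas",
    "Refers to relevant parts of source texts to support ideas",
    "Synthesizes information from multiple source texts",
    "Identifies similar information from different source texts (cites appropriately)",
    "Paraphrases information from the source texts",
    "Uses direct quotations from the source texts appropriately",
    "Follows in-text citation conventions",
]

def source_use_score(bands: list[int], max_band: int = 4) -> float:
    """Average per-criterion bands into a 0-1 source-use score."""
    assert len(bands) == len(SOURCE_USE_CRITERIA)
    return sum(bands) / (max_band * len(bands))

# One rater's bands for a single script (invented values).
print(f"Source use: {source_use_score([3, 4, 2, 3, 3, 2, 4]):.0%}")  # 75%
```

Keeping the criteria in one canonical list also simplifies standardisation: every rater scores the same descriptors in the same order, and the per-criterion bands show exactly where a script fell short.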

  21. 9. Overcoming these challenges • Make your assessment as authentic as possible: • allow plenty of time for students to read the source texts • allow plenty of time for students to complete the task • use multiple source texts to require candidates to synthesize information (Horowitz, 1986) • make the assessment open book • allow students to ask questions about the source texts • allow students to take notes on their laptop • allow students access to internet, spell checker, thesaurus • Grammarly? Automated feedback?

  22. Conclusion Integrated assessment: • is more authentic • more accurately captures what students really do • has better context, cognitive, consequential, and predictive validity • has a major positive washback effect upon teaching and learning • better prepares students for the demands of baccalaureate study

  23. Conclusion • Need to reconsider key assessment principles such as validity, reliability, practicality, test design and control of environmental conditions. • But these challenges are not insurmountable. • The adoption of integrated-skills assessment in EAP has the potential to transform the way we think about EAP assessment. • Step back. Look at the big picture.

  24. References Chan, S. H. C., Inoue, C., & Taylor, L. (2015). Developing rubrics to assess the reading-into-writing skills: A case study. Assessing Writing, 26(1), 20-37. Cumming, A. (2013). Assessing integrated skills. In A. Kunnan (Ed.), The Companion to Language Assessment, Volume 1 (pp. 216-229). Davidson, P. (2009). Authentic assessment in EFL classrooms. In C. Coombe, P. Davidson & D. Lloyd (Eds.), The Fundamentals of Language Assessment: A Practical Guide for Teachers, 2nd Edition (pp. 213-224). Dubai: TESOL Arabia. Davidson, P. & Dalton, D. (2003). Multiple-measures assessment: Using 'visas' to assess students' achievement of learning outcomes. In C.A. Coombe & N. Hubley (Eds.), Assessment Practices (pp. 121-134). Virginia: TESOL.

  25. References Davidson, P. & Hobbs, A. (2003). Using academic discussions to assess higher order speaking skills. In S. Phipps (Ed.), Proceedings of the 8th Bilkent University School of English Language ELT Conference: Speaking in the Monolingual Classroom (pp. 200-216). Ankara: Bilkent University. Horowitz, D.M. (1986). What professors actually require: Academic tasks for the ESL classroom. TESOL Quarterly, 20, 445-460. Moore, T., & Morton, J. (2005). Dimensions of difference: A comparison of university writing and IELTS writing. Journal of English for Academic Purposes, 4, 43-66. Plakans, L. M., & Gebril, A. (2012). A close investigation into source use in L2 integrated writing tasks. Assessing Writing, 17(1), 18-34.

  26. References Shaw, S., & Weir, C.J. (2007). Examining writing: Research and practice in assessing second language writing. Studies in Language Testing, 26. Cambridge: Cambridge University Press and Cambridge ESOL. Weigle, S. C. (2004). Integrating reading and writing in a competency test for non-native speakers of English. Assessing Writing, 9(1), 27-55. Weir, C.J. (1983). The Associated Examining Board’s Test of English for Academic Purposes: An exercise in content validation. In A. Hughes & D. Porter (Eds.), Current Developments in Language Testing (pp. 147-153). London: Academic Press.

  27. References Weir, C.J. (2014). A Research Report on the Development of the Test of English for Academic Purposes (TEAP) Writing Test for Japanese University Entrants. Japan: Eiken Foundation. Yang, H. C., & Plakans, L. M. (2012). Second language writers' strategy use and performance on an integrated reading-listening-writing task. TESOL Quarterly, 46(1), 80-103.
