Presentation Transcript


  1. Developing Descriptors • Brian North • www.eurocentres.com; www.eaquals.org • bjnorth@eurocentres.com; bnorth@eaquals.org

  2. Stages in Developing Descriptors 1. Conceptualisation • Clarifying the construct. What are we describing? • Collecting relevant examples, systems • Deciding categories in the descriptive scheme • Clarifying key questions 2. Construction • Creating the descriptor pool • Editing, drafting – filling gaps in the description 3. Validation • Qualitative: through iterative workshops with teachers • Quantitative: through IRT scaling of use in assessment 4. Interpretation • Set thresholds between levels • Summarise developing proficiency

  3. Stages in Developing Descriptors 1. Conceptualisation • Clarifying the construct. What are we describing? • Collecting relevant examples, systems • Deciding categories in the descriptive scheme • Clarifying key questions 2. Construction • Creating the descriptor pool • Editing, drafting – filling gaps in the description 3. Validation • Qualitative: through iterative workshops with teachers • Quantitative: through IRT scaling of use in assessment 4. Interpretation • Set thresholds between levels • Summarise developing proficiency

  4. Conceptualisation: CEFR • CEFR descriptors: observable, functional outcomes; “competence” descriptors also mainly observable proficiency • Interaction (BICS: Basic Interpersonal Communicative Skills) / Production (CALP: Cognitive Academic Language Proficiency) • Illustrative videos of 16-18 yr olds: difficulty with BICS “C2” • LoS (Language of Schooling) more complex than modern languages • Language aspects / non-language aspects • Discourse emphasis: genres; cognitive skills • Developmental – linked to cognitive growth • Far less known about LoS than modern languages • 20 years’ experience with descriptors 1975-1995 • 20 years developing the descriptive scheme 1975-1995

  5. Characteristics of LoS or (C)ALP • We/you know it involves more: • specific, formal, abstract • explicit, detailed, conventionalised (= expectations) • cohesive and structured (e.g. sequencing) • coherent (goal-oriented) • planning, self-monitoring, internal feedback, editing • rhetorical skills and structures, strategies • BUT • How much is really known about academic discourse? • Reception of exposition by the teacher • Interaction in class • Production by the teacher • To what extent are skills transversal – a common core?

  6. Need for Collaboration & Research • Vollmer: • Pooling expertise and materials • Corpus of curricula and examination papers • Classroom observation and research • Interviews with teachers • US “No Child Left Behind” (Bailey & Butler): • Analyse and align existing content and language standards • Need for observation and research on Teacher Talk • Need for empirical analysis of performance of mother tongue and second language students • Interviews with teachers (expectations: Recep; Production) • Analyse textbooks

  7. Need for Collaboration & Research • CEFR – Preparatory Work: • Clarify concept: 1975 (Threshold) – 1992 (Proposal) • Experience with descriptors (BN: 1983-93) • Classroom discourse analysis (BN: 1984-9) • Involvement of stakeholders (Working Party 1992-6) • CEFR – Project Design: • Analyse and align existing systems • Interactive definition of categories with Authoring Group • Swiss National Research Project • Involvement of teachers in qualitative validation - Workshops

  8. Stages in Developing Descriptors 1. Conceptualisation • Clarifying the construct. What are we describing? • Collecting relevant examples, systems • Deciding categories in the descriptive scheme • Clarifying key questions 2. Construction • Creating the descriptor pool • Editing, drafting – filling gaps in the description 3. Validation • Qualitative: through iterative workshops with teachers • Quantitative: through IRT scaling of use in assessment 4. Interpretation • Set thresholds between levels • Summarise developing proficiency

  9. Key Questions I • Relationship to: • content standards • European Qualifications Framework • CEFR • Categories to be described: • Transversal categories as in Table 5 of ERDLE proposal (p52) • Subcategories of Reception, Interaction, Production, Interpretation, Evaluation, Mediation? • Cognitive skills & strategies from Situation analysis (Beacco et al) • What else? • Style: • concrete-salient features (CEFR-style) / abstract • Length – including assumptions (e.g. “Can make a complaint”: B1) • broad-holistic / atomistic-analytic / both (Fleming)

  10. Key Questions II • Thresholds to be described: • expected language proficiency levels • types of discourse • stages of cognitive development • strategies • How to deal with “difficult parts” (non-language), e.g. Bildung: • consideration of others • critical thinking, sound judgement and courage to express it? • flexibility in thinking and argumentation

  11. Stages in Developing Descriptors 1. Conceptualisation • Clarifying the construct. What are we describing? • Collecting relevant examples, systems • Deciding categories in the descriptive scheme • Clarifying key questions 2. Construction • Creating the descriptor pool • Editing, drafting – filling gaps in the description 3. Validation • Qualitative: through iterative workshops with teachers • Quantitative: through IRT scaling of use in assessment 4. Interpretation • Set thresholds between levels • Summarise developing proficiency

  12. Construction • Creating a classified bank of descriptors: • Collate / deconstruct all source systems • Eliminate doubles, redundancy • Identify gaps • Editing and drafting: • Confirm style • Harmonise use of verbs (not done in CEFR English!) • Harmonise formulations • Create variations (for missing levels) • Author missing categories • Organisation: • Classify with serial numbers • Translation to key languages / check translations with plurilinguals
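The Construction stage described above is essentially a collate-deduplicate-number pipeline. The sketch below is only an illustration of that idea, not the tooling actually used for the CEFR descriptor bank; the category names, the serial-number format and the whitespace-based duplicate check are assumptions introduced for the example.

```python
# Illustrative sketch of a classified descriptor bank: collate descriptors from
# source systems, eliminate doubles, and assign serial numbers per category.
# All names and formats here are assumptions, not the project's actual scheme.

from dataclasses import dataclass

@dataclass
class Descriptor:
    serial: str          # e.g. "INTE-001" (hypothetical numbering scheme)
    category: str        # e.g. "Interaction", "Production"
    level: str | None    # provisional level, None if still to be calibrated
    text: str
    source: str          # system the descriptor was collated from

def normalise(text: str) -> str:
    """Crude key for spotting doubles: lower-case, collapse whitespace."""
    return " ".join(text.lower().split())

def build_bank(raw_items: list[dict]) -> list[Descriptor]:
    """Collate raw descriptors, eliminate doubles, number them per category."""
    seen: set[str] = set()
    counters: dict[str, int] = {}
    bank: list[Descriptor] = []
    for item in raw_items:
        key = normalise(item["text"])
        if key in seen:                      # eliminate doubles / redundancy
            continue
        seen.add(key)
        cat = item["category"]
        counters[cat] = counters.get(cat, 0) + 1
        serial = f"{cat[:4].upper()}-{counters[cat]:03d}"
        bank.append(Descriptor(serial, cat, item.get("level"), item["text"], item["source"]))
    return bank

if __name__ == "__main__":
    raw = [
        {"category": "Interaction", "level": "B1", "source": "CEFR",
         "text": "Can enter unprepared into conversations on familiar topics."},
        {"category": "Interaction", "level": "B1", "source": "ELP checklist",
         "text": "Can enter unprepared into conversations  on familiar topics."},  # a double
        {"category": "Production", "level": None, "source": "Draft",
         "text": "Can explain a viewpoint on a topical issue."},
    ]
    for d in build_bank(raw):
        print(d.serial, d.level, d.text)
```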

  13. Stages in Developing Descriptors 1. Conceptualisation • Clarifying the construct. What are we describing? • Collecting relevant examples, systems • Deciding categories in the descriptive scheme • Clarifying key questions 2. Construction • Creating the descriptor pool • Editing, drafting – filling gaps in the description 3. Validation • Qualitative: through iterative workshops with teachers • Quantitative: through IRT scaling of use in assessment 4. Interpretation • Set thresholds between levels • Summarise developing proficiency

  14. Qualitative Validation • Analysis of teachers discussing proficiency: • Video of two learners • Who is better? Why? Justify your choice • “Repertory grid” analysis of the categories teachers use to compare quality • Sorting descriptors into categories: • Pile of (maximum 60) descriptors • Set of (maximum 4) envelopes labelled with the relevant categories • Discard envelope • Tick ones that are clear, relevant and useful • Sorting descriptors into levels: • Pile of (maximum 15) descriptors for the same category • Set of envelopes labelled with levels (6 for the CEFR) • Discard envelope / Tick ones that are clear, relevant and useful
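One way to make use of the envelope-sorting data afterwards is to tally, for each descriptor, which envelope teachers most often chose and how strongly they agreed. The sketch below is a minimal illustration of that tallying, not part of the original workshop procedure; the teacher codes, serial numbers and the 0.7 agreement threshold are assumptions.

```python
# Minimal sketch: tally workshop sorting results per descriptor and flag
# descriptors that were mostly discarded or sorted with low agreement.
# Data and the agreement threshold are illustrative assumptions.

from collections import Counter

# sortings[teacher][descriptor_serial] = chosen envelope ("Discard" allowed)
sortings = {
    "T1": {"INTE-001": "Interaction", "PROD-001": "Production"},
    "T2": {"INTE-001": "Interaction", "PROD-001": "Discard"},
    "T3": {"INTE-001": "Production",  "PROD-001": "Discard"},
}

def review(sortings: dict, agreement_threshold: float = 0.7) -> dict:
    """Return, per descriptor, the modal envelope, the share of teachers
    choosing it, and a flag for descriptors needing rewording or removal."""
    by_descriptor: dict[str, Counter] = {}
    for choices in sortings.values():
        for serial, envelope in choices.items():
            by_descriptor.setdefault(serial, Counter())[envelope] += 1
    report = {}
    for serial, counts in by_descriptor.items():
        envelope, n = counts.most_common(1)[0]
        share = n / sum(counts.values())
        flagged = envelope == "Discard" or share < agreement_threshold
        report[serial] = (envelope, round(share, 2), flagged)
    return report

if __name__ == "__main__":
    for serial, (envelope, share, flagged) in review(sortings).items():
        print(serial, envelope, share, "REVIEW" if flagged else "keep")
```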

  15. Quantitative Validation - Purpose • To construct a scale from the descriptors for the “core construct” • To bolt onto / link to this scale sets of descriptors for categories that prove to be less core areas • To find out / confirm what level specific descriptors belong to • To discover which descriptors do not work • To confirm commonality of the interpretation of the descriptors across: • Languages • Regions / countries / systems • Educational sectors

  16. Quantitative Validation - Steps • Identify the good/best descriptors from the pool after the qualitative validation • Confirm the supposed “level” of these descriptors • Create a set of overlapping checklists of c. 50 descriptors (like ELP checklists), each checklist targeted at a “level” • Define a rating scale: Yes/No or 0-4 for the descriptors • Identify classes at approximately the right level for each checklist • Arrange teacher assessment and/or self-assessment with the checklists • Collect a minimum of 150 examples of each checklist • IRT Rasch Model “Rating Scale Analysis” to build the scale • Eliminate descriptors with 80%+ or 20%- (Rasch problem)
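The IRT step on this slide was done with specialist Rasch software; the sketch below is only a toy illustration in Python of the underlying logic, not the project's actual analysis. It reads the 80%+/20%- rule as the proportion of "Yes" ratings per descriptor (an interpretation on my part), screens out the extreme descriptors, and calibrates the rest with a dichotomous Rasch model via a simple joint maximum likelihood routine; a real Rating Scale Analysis of 0-4 ratings would use a polytomous model, dedicated IRT software and far more data.

```python
# Toy Rasch calibration of Yes/No checklist ratings: screen extreme descriptors,
# then estimate descriptor difficulties with joint maximum likelihood.
# Data are simulated; this is an illustrative sketch, not the CEFR analysis.

import numpy as np

def calibrate(X: np.ndarray, iters: int = 100) -> np.ndarray:
    """X: persons x items matrix of 0/1 ratings. Returns item difficulties (logits)."""
    scores = X.sum(axis=1)
    X = X[(scores > 0) & (scores < X.shape[1])]     # drop all-No / all-Yes raters
    n_persons, n_items = X.shape
    theta = np.zeros(n_persons)                     # person "can do" levels
    b = np.zeros(n_items)                           # descriptor difficulties
    for _ in range(iters):
        P = 1.0 / (1.0 + np.exp(b[None, :] - theta[:, None]))
        W = np.clip((P * (1 - P)).sum(axis=1), 1e-6, None)
        theta += np.clip((X.sum(axis=1) - P.sum(axis=1)) / W, -1.0, 1.0)
        P = 1.0 / (1.0 + np.exp(b[None, :] - theta[:, None]))
        W = np.clip((P * (1 - P)).sum(axis=0), 1e-6, None)
        b += np.clip((P.sum(axis=0) - X.sum(axis=0)) / W, -1.0, 1.0)
        b -= b.mean()                               # anchor the scale: mean difficulty = 0
    return b

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_b = np.linspace(-2.0, 2.0, 10)             # 10 descriptors, easy to hard
    true_theta = rng.normal(0.0, 1.0, 200)          # 200 simulated checklist returns
    P = 1.0 / (1.0 + np.exp(true_b[None, :] - true_theta[:, None]))
    X = (rng.random(P.shape) < P).astype(float)

    rate = X.mean(axis=0)                           # endorsement rate per descriptor
    keep = (rate > 0.2) & (rate < 0.8)              # the 80%+ / 20%- screening rule
    print("kept", int(keep.sum()), "of", len(rate), "descriptors")
    print("estimated difficulties:", np.round(calibrate(X[:, keep]), 2))
```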

  17. Anchor Design: CEFR (North 2000)

  18. Recommended Design (after De Jong) Data Collection:
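The anchor-design and data-collection figures on slides 17-21 are not reproduced in this transcript. As a stand-in, here is a small sketch of the general idea behind such designs: each level-targeted checklist shares a block of anchor descriptors with the neighbouring levels, so that all the checklists can later be linked onto one common vertical scale. The checklist sizes and anchor block size are assumptions, not the actual CEFR or De Jong design.

```python
# Illustrative overlapping-checklist design: each level's checklist carries
# anchor descriptors from the adjacent levels to allow vertical linking.
# Level labels, bank contents and anchor_size are illustrative assumptions.

LEVELS = ["A1", "A2", "B1", "B2", "C1", "C2"]

def build_checklists(bank: dict[str, list[str]], anchor_size: int = 10) -> dict[str, list[str]]:
    """bank maps a provisional level to its descriptor serial numbers.
    Each checklist = all descriptors at its own level, plus the first
    `anchor_size` descriptors of the levels directly below and above."""
    checklists = {}
    for i, level in enumerate(LEVELS):
        items = list(bank.get(level, []))
        if i > 0:
            items += bank.get(LEVELS[i - 1], [])[:anchor_size]   # downward anchors
        if i < len(LEVELS) - 1:
            items += bank.get(LEVELS[i + 1], [])[:anchor_size]   # upward anchors
        checklists[level] = items
    return checklists

if __name__ == "__main__":
    # Toy bank: 30 provisional descriptors per level, labelled e.g. "B1-07".
    bank = {lvl: [f"{lvl}-{n:02d}" for n in range(1, 31)] for lvl in LEVELS}
    for level, items in build_checklists(bank).items():
        print(level, len(items), "descriptors, e.g.", items[0], "...", items[-1])
```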

  19. Vertical Scale of Descriptors

  20. Vertical Scale of Descriptors

  21. Extending the Core Scale I

  22. Quantitative Validation - Prerequisites • Construct is well-defined – common understanding of what is being described/rated/scaled • Descriptors are well-formulated, clear and relevant • Teachers/learners are capable of making judgements about the areas concerned • There is a solid anchor design in the data collection

  23. Stages in Developing Descriptors 1. Conceptualisation • Clarifying the construct. What are we describing? • Collecting relevant examples, systems • Deciding categories in the descriptive scheme • Clarifying key questions 2. Construction • Creating the descriptor pool • Editing, drafting – filling gaps in the description 3. Validation • Qualitative: through iterative workshops with teachers • Quantitative: through IRT scaling of use in assessment 4. Interpretation • Set thresholds between levels • Summarise developing proficiency

  24. Setting Thresholds between Levels • Marking out equal intervals on the scale • Identifying “jumps” in the content described, gaps between clusters of descriptors • Comparing to the original scale authors’ intentions • Comparing to Waystage, Threshold, Eurocentres, Cambridge exam levels • Fine-tuning for equal intervals • Checking for consistency, coherence
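Two of the moves listed above lend themselves to a simple illustration: marking out equal intervals on the calibrated scale, and locating "jumps", i.e. unusually wide gaps between clusters of descriptor difficulties, as candidate thresholds. The sketch below shows both; the numbers and the widest-gap heuristic are assumptions for illustration, not the procedure actually used to set the CEFR levels.

```python
# Illustrative threshold-setting helpers on a calibrated logit scale:
# equal-interval cut-offs, and cut-offs placed in the widest gaps between
# clusters of descriptor difficulties. Values are toy data.

def equal_interval_cutoffs(low: float, high: float, n_levels: int) -> list[float]:
    """n_levels bands between low and high -> n_levels - 1 internal cut-offs."""
    width = (high - low) / n_levels
    return [low + width * k for k in range(1, n_levels)]

def gap_cutoffs(difficulties: list[float], n_levels: int) -> list[float]:
    """Place cut-offs at the midpoints of the (n_levels - 1) widest gaps."""
    d = sorted(difficulties)
    gaps = [(d[i + 1] - d[i], (d[i + 1] + d[i]) / 2) for i in range(len(d) - 1)]
    widest = sorted(gaps, reverse=True)[: n_levels - 1]
    return sorted(mid for _, mid in widest)

if __name__ == "__main__":
    # Toy calibrated difficulties (logits) for a handful of descriptors.
    diffs = [-2.1, -1.9, -1.7, -0.6, -0.4, -0.3, 0.9, 1.0, 1.2, 2.4, 2.6]
    print("equal intervals:", [round(c, 2) for c in equal_interval_cutoffs(-2.5, 3.0, 4)])
    print("gap-based      :", [round(c, 2) for c in gap_cutoffs(diffs, 4)])
```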

  25. CEFR 3.6 Salient Characteristics: A2 • The majority of descriptors stating social functions: • greet people, ask how they are and react to news • handle very short social exchanges • discuss what to do, where to go and make arrangements • Descriptors on getting out and about: • make simple transactions in shops, banks etc. • get simple information about travel and services

  26. CEFR 3.6 Salient Characteristics: B1 • Maintain interaction and get across what you want to: • give or seek personal views and opinions • express the main point comprehensibly • keep going comprehensibly, even though pausing evident, especially in longer stretches • Cope flexibly with problems in everyday life: • deal with most situations likely to arise when travelling • enter unprepared into conversations on familiar topics

  27. CEFR 3.6 Salient Characteristics: B2 • Effective argument: • account for and sustain opinions in discussion by providing relevant explanations and arguments • explain a viewpoint on a topical issue giving the advantages and disadvantages of various options • Holding your own in social discourse: • interact with a degree of fluency and spontaneity that makes regular interaction with native speakers possible • adjust to changes of direction, style and emphasis • A new degree of language awareness: • make a note of "favourite mistakes" and monitor speech for them

  28. Stages in Developing Descriptors 1. Conceptualisation • Clarifying the construct. What are we describing? • Collecting relevant examples, systems • Deciding categories in the descriptive scheme • Clarifying key questions 2. Construction • Creating the descriptor pool • Editing, drafting – filling gaps in the description 3. Validation • Qualitative: through iterative workshops with teachers • Quantitative: through IRT scaling of use in assessment 4. Interpretation • Set thresholds between levels • Summarise developing proficiency

  29. Appendix • United States • “No Child Left Behind” • 2001-7

  30. US “No Child Left Behind” 2001-7 • States have a legal duty to provide the support to ensure that every child is proficient in the academic language they need to be successful at school. • Must test this. • There must be at least a grade above and a grade below “proficient” (not just the usual US master / non-master).

  31. US “No Child Left Behind” 2001-7 • No overall framework or common reference points • Testing-led: dozens of consortia • No time for research • No systematic definition of the construct ALP – either confused with “English Language Arts” (= creative writing for native speakers) or elaborated from language used in subject content standards • No definition of “proficient”: 15 significantly different interpretations • Some states 3, some 4, some 5 grades; all with different names, numbers, concepts = CHAOS

  32. Need for Collaboration & Research • Vollmer: • Pooling expertise and materials • Corpus of curricula and examination papers • Classroom observation and research • Interviews with teachers • US “No Child Left Behind” (Bailey & Butler): • Analyse and align existing content and language standards • Need for observation and research on Teacher Talk • Need for empirical analysis of performance of mother tongue and second language students • Interviews with teachers (expectations: Recep; Production) • Analyse textbooks

  33. US Experience • US “No Child Left Behind” (Bailey & Butler): • Analyse and align existing content and language standards • Need for observation and research on Teacher Talk • Need for empirical analysis of performance of mother tongue and second language students • Interviews with teachers (expectations: Recep; Production) • Analyse textbooks

  34. Analysing & Aligning Standards • Assumptions in Subject Standards: • Elementary School: observe, analyse, compare, describe, record • Middle School: identify, recognise, compose, explain • High School: recognise, describe, explain (Bailey and Butler 2003) • “Extracting the language features embedded in the content standards presented significant challenges …” • Bailey, Butler and Sato (2005) have been successful in developing standards-standards linkages that involve both language and content standards, BUT “procedures to establish such linkages … remain to this day in their infancy” (Chaloub-Deville 2008)

  35. US Experience • US “No Child Left Behind” (Bailey & Butler): • Analyse and align existing content and language standards • Need for observation and research on Teacher Talk • Need for empirical analysis of performance of mother tongue and second language students • Interviews with teachers (expectations: Recep; Production) • Analyse textbooks

  36. Classroom Research • From 2001: • Analysis of functions in science classrooms • Teachers • Students • Repair strategies (Bailey and Butler 2003) • BUT • All tests were produced before any research results were available – even in consortia aware of the problem (Chaloub-Deville 2008)

  37. US Experience • US “No Child Left Behind” (Bailey & Butler): • Analyse and align existing content and language standards • Need for observation and research on Teacher Talk • Need for empirical analysis of performance of mother tongue and second language students • Interviews with teachers (expectations: Recep; Production) • Analyse textbooks

  38. Teacher Expectations • Students must learn acceptable ways of presenting information to the teacher – not usually explicitly taught • Very little studied • “Teachers are rarely explicitly aware of their language expectations” • Dropped the idea of teacher interviews because they yield “anecdotal”, unreliable information (Bailey and Butler 2003)

  39. US Experience • US “No Child Left Behind” (Bailey & Butler): • Analyse and align existing content and language standards • Need for observation and research on Teacher Talk • Need for empirical analysis of performance of mother tongue and second language students • Interviews with teachers (expectations: Recep; Production) • Textbooks

  40. US “No Child Left Behind” 2001-7 • No overall framework or common reference points • Testing-led: dozens of consortia • No time for research • No systematic definition of the construct ALP – either confused with “English Language Arts” (= creative writing for native speakers) or elaborated from language used in subject content standards • No definition of “proficient”: 15 significantly different interpretations • Some states 3, some 4, some 5 grades; all with different names, numbers, concepts = CHAOS

  41. Stages in Developing Descriptors 1. Conceptualisation • Clarifying the construct. What are we describing? • Collecting relevant examples, systems • Deciding categories in the descriptive scheme • Clarifying key questions 2. Construction • Creating the descriptor pool • Editing, drafting – filling gaps in the description 3. Validation • Qualitative: through iterative workshops with teachers • Quantitative: through IRT scaling of use in assessment 4. Interpretation • Set thresholds between levels • Summarise developing proficiency
