
Assessment Use Argument

Nancy Powers

Chief of English Testing Section

SHAPE, Mons, Belgium

Sept 2013


Introduction

“Validity is an integrated evaluative judgment of the degree to which empirical evidence and theoretical rationales support the adequacy and appropriateness of inferences and actions based on test scores” (Messick, 1989: 13)


Assessment Use Argument

  • Based on Toulmin’s (2003) approach to practical reasoning

  • Justification

  • Accountability


According to Bachman & Palmer

  • Assessment development consists of two parallel processes that serve two purposes.

  • The assessment production process

  • The assessment justification process

    (Bachman & Palmer, 2010, p. 430)


Assessment Use Argument

  • Therefore…

  • An AUA is a theoretical framework that provides a rationale and set of procedures for justifying the intended uses of the assessment.


The nitty-gritty of an AUA

  • It comprises four parts:

  • 1. Claims

    • The beneficial consequences of an assessment

    • The decisions that are made

    • The interpretations that are made

    • The assessment records

  • 2. Warrant – statements that elaborate the claims


    The nitty-gritty of an AUA (cont’d)

    • Not everyone will agree with us

    • 3. Rebuttal – counterclaim

    • 4. Backing – evidence supporting the warrants

    • Includes feedback from stakeholders gathered through questionnaires, verbal protocols, observations, and interviews, as well as previous research and statistical analyses (a brief sketch of how the four parts fit together follows below)
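To make the relationship between the four parts concrete (claims elaborated by warrants, warrants challenged by rebuttals and supported by backing), here is a minimal sketch in Python, offered purely as an illustration; the class and field names are my own shorthand rather than anything prescribed by Bachman & Palmer beyond the four labels themselves.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Warrant:
    """A statement that elaborates a claim."""
    statement: str
    rebuttals: List[str] = field(default_factory=list)  # counterclaims challenging the warrant
    backing: List[str] = field(default_factory=list)    # evidence: questionnaires, interviews, research, statistics


@dataclass
class Claim:
    """One of the four AUA claims: consequences, decisions, interpretations, or assessment records."""
    statement: str
    warrants: List[Warrant] = field(default_factory=list)


@dataclass
class AssessmentUseArgument:
    """The chain of claims linking assessment performance to its intended uses."""
    assessment: str
    claims: List[Claim] = field(default_factory=list)
```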


    An AUA at work

    • Lots of theory…

    • Concrete example: Justifying the inclusion of videos in a listening test (condensed into a short sketch after the walkthrough below)


    Claim 1: The consequences are beneficial

    • I make the claim that

    • The consequences of using a video listening test are beneficial to the test developers and to the students.

    • So, what does this mean? I need to elaborate.


    Warrant

    • The consequences of using the VLT that are specific to the test developers and to the students will be beneficial.

      • The test developers will develop tests that are more authentic and better reflect the TLU domain

      • Students can use the visual cues to help with comprehension

      • The context will be clear, thereby reducing student anxiety


    Rebuttals

    • I disagree with you!

    • The consequences of using the VLT that are specific to the test developers and to the students will NOT be beneficial.

      • Videos will be distracting

      • Attending to multiple sources of stimulation is more tiring & demanding


    Backing: Collection of evidence that justifies your claims

    • The students who trialled the test reported that…

      • “The video aspect helped to ground the task, making it more authentic than just an audio test”

      • “It gave focus to me, therefore allowing me to listen. Often, when listening to audio-only, my mind wanders, i.e. I think of something else, therefore missing the listening text.”

      • “They [the videos] were relaxing; therefore there was no mental block to listening because of nervousness.”

  • The use of videos can be theoretically justified in that it introduces construct-relevant variance (Wagner, 2002, 2007)


    Backing (cont’d)

    • Wagner (2010) found that student performance on a listening test that included videos increased by 6.5%

    • If test task characteristics are similar to the TLU characteristics, then the test can be seen as having construct validity (Bachman & Palmer, 1996)
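To connect the walkthrough back to the framework, Claim 1 for the VLT could be recorded with the illustrative classes sketched above; the wording below is condensed from the slides, and only the structure is the point.

```python
claim_1 = Claim(
    statement="The consequences of using the VLT are beneficial to the test developers and to the students.",
    warrants=[
        Warrant(
            statement=("Developers produce more authentic tests that better reflect the TLU domain; "
                       "students use visual cues to aid comprehension, and a clear context reduces anxiety."),
            rebuttals=[
                "Videos will be distracting.",
                "Attending to multiple sources of stimulation is more tiring and demanding.",
            ],
            backing=[
                "Trial students reported that the videos grounded the task and reduced nervousness.",
                "Wagner (2010): performance on a listening test that included videos increased by 6.5%.",
                "Bachman & Palmer (1996): similarity between test tasks and TLU tasks supports construct validity.",
            ],
        ),
    ],
)
```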


    Claim 2: Decisions made

    • The decisions to award a proficiency level reflect existing educational and societal values and the content/task/accuracy statements in the NATO STANAG 6001 Language Proficiency Levels, and are equitable for those students who are placed at different proficiency levels. These decisions are made by the test developers and concern which proficiency level the students belong to. The individuals affected by these decisions are the students and the teachers of the MTCP program.


    Assessment Use Argument

    • Warrant: Values sensitivity

    • Relevant educational values of CDA are carefully considered in the proficiency level decisions that are made.

    • Rebuttal:

    • Relevant educational values of CDA are NOT carefully considered in the proficiency level decisions that are made.


    Assessment Use Argument

    • Backing:

    • CDA is governed by two documents: the Qualification Standard and the Foreign National Training Plan

    • The VLT respects the C/T/A (content/task/accuracy) statements for each proficiency level in STANAG 6001


    Assessment Use Argument

    • Warrant: Equitability

    • Test takers and teachers are fully informed about how the decision will be made.

    • Rebuttal:

    • Test takers and teachers are NOT fully informed about how the decision will be made.


    Assessment Use Argument

    • Backing:

    • The testing section conducts information sessions with teachers and testers when introducing new testing methods.

    • Candidate’s Guide


    Claim 3: Interpretations

    • The interpretations about the students’ ability to utilize verbal and non-verbal behaviour to comprehend the main idea, explicitly stated information, and implicit information are meaningful in terms of the construct definition of listening comprehension, impartial to all groups of test takers, generalizable to tasks that resemble the TLU, and relevant to and sufficient for the proficiency level decisions that are to be made.


    Warrant: Meaningful

    • The claim is meaningful in terms of listening to and comprehending general English with respect to the construct definition.

    • Rebuttal: The claim is NOT meaningful in terms of listening to and comprehending general English with respect to the construct definition.


    Backing: Meaningful

    • The construct definition is based on research on listening comprehension.

    • The items were developed according to the NATO STANAG 6001 Proficiency levels.

    • Item specs


    Warrant: Impartiality

    • Test takers are treated impartially during all aspects of the administration of the assessment

    • Rebuttal:

    • Test takers are NOT treated impartially during all aspects of the administration of the assessment


    Backing: Impartiality

    • Candidate’s Guide; all sessions are administered in the same way every time.

    • All students are given the same test with the same instructions, regardless of their country of origin, rank, gender, etc.

    • Generalizable

    • Relevant

    • Sufficient


    Claim 4: Assessment Records

    • The scores from the video listening test are consistent across different forms and administrations of the test, across students from different military trades, and across groups with different nationalities and first languages.


    Assessment Use Argument

    • Warrant: Inter-/intra-rater reliability

    • Scored the same way across administrations

    • Rebuttal: no rebuttal

    • Backing: this is a multiple-choice, computer-delivered test, so no inter-/intra-rater reliability is needed

    • Scoring is done by an internal algorithm in the computer program


    Conclusion: in a nutshell

    • Basically, you are saying something about the test that you have designed.

    • You make these claims clear by elaborating on what you mean.

    • Then, you address any perspective that goes against what you have claimed and gather evidence that supports your point of view.


    Assessment Use Argument

    Questions?

    Thank you


    References

    • Bachman, L. F., & Palmer, A. S. (1996). Language testing in practice. Oxford: Oxford University Press.

    • Bachman, L. F., & Palmer, A. S. (2010). Language assessment in practice. Oxford: Oxford University Press.

    • Hostetter, A. B. (2011). When do gestures communicate? A meta-analysis. Psychological Bulletin, 137(2), 297-315.

    • Kellerman, S. (1990). Lip service: The contribution of the visual modality to speech perception and its relevance to the teaching and testing of foreign language listening comprehension. Applied Linguistics, 11(3), 272-280.

    • Kellerman, S. (1992). “I see what you mean”: The role of kinesic behaviour in listening and implications for foreign and second language learning. Applied Linguistics, 13, 239-258.


    References (cont’d)

    • Ockey, G. J. (2007). Construct implications of including still image or video in computer-based listening tests. Language Testing, 24, 517-537.

    • Toulmin, S. E. (2003). The uses of argument (updated ed.). Cambridge: Cambridge University Press.

    • Wagner, E. (2002). Video listening tests: A pilot study. Working Papers in TESOL & Applied Linguistics, 2(1). Teachers College, Columbia University. Retrieved August 20, 2007, from http://journals.tc-library.org/index.php/tesol/article/viewFile/7/8

    • Wagner, E. (2007). Are they watching? Test-taker viewing behaviour during an L2 video listening test. Language Learning & Technology, 11, 67-86.

    • Wagner, E. (2010). The effect of the use of video texts on ESL listening test-taker performance. Language Testing, 27, 493-513.

