
Demystifying Second Language Assessment

Shawna Williams

BC TEAL Annual Conference

May 24, 2014

Agenda
  • Introduction
  • Assessment terminology – definitions
  • Principles of Assessment
Assessment Terminology

Assessment ≠ Testing

Assessment is…
  • “appraising or estimating the level or magnitude of some attribute of a person.”

(Mousavi, 2009)

  • “an ongoing process of collecting information about a given object of interest according to procedures that are systematic and substantively grounded.”

(Brown & Abeywickrama, 2010)

Assessment
  • “A good teacher never ceases to assess students, whether those assessments are incidental or intended.”

(Brown & Abeywickrama, 2010)

Assessment and Learning

Diagram: the relationship among teaching, assessment, measurement, tests, and evaluation

(Brown & Abeywickrama, 2010, p. 6)

Function of Assessment

Informal
  • Incidental, unplanned comments
  • Coaching
  • Impromptu feedback on homework
  • Nonjudgmental

Formal
  • Systematic and planned
  • Gives the teacher and students an appraisal of achievement
  • Tests and assignments
Function of Assessment

Formative
  • “evaluating students in the process of ‘forming’ their competencies and skills with the goal of helping them to continue that growth process”
  • feedback on performance
  • future continuation of learning

Summative
  • “aims to measure, or summarize, what the student has grasped”
  • end of course or unit
  • looking back and taking stock

(Brown & Abeywickrama, 2010, p. 7)

Five Principles of Assessment

How do you know if an assessment task is effective, appropriate, useful, or . . . “good”?

  • Practicality
  • Reliability
  • Validity
  • Authenticity
  • Washback

A PRACTICAL TEST . . .
  • stays within budgetary limits
  • can be completed within appropriate time constraints
  • has clear directions for administration
  • appropriately utilizes human resources
  • does not exceed available material resources
  • considers time and effort for design and scoring

(Brown & Abeywickrama, 2010, p. 26)

A RELIABLE TEST . . .
  • is consistent across two or more administrations
  • has clear directions for scoring/evaluation
  • has uniform rubrics for scoring/evaluation
  • lends itself to consistent application of rubrics by the scorer
  • contains items/tasks that are unambiguous to the test-taker

(Brown & Abeywickrama, 2010, p. 27)

Reliability (cont’d)
  • Rater Reliability
    • Inter-Rater Reliability
    • Intra-Rater Reliability
  • Student-Related Reliability
  • Test Administration Reliability
  • Test Reliability
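
The slides list the types of reliability without showing how rater reliability might be quantified. As an illustrative aside that is not part of the original presentation, the short Python sketch below shows one common way to put a number on inter-rater reliability: raw percent agreement and Cohen's kappa between two raters who scored the same set of essays. The score lists are hypothetical.

```python
# Illustrative sketch (not from the original slides): quantifying inter-rater
# reliability with raw percent agreement and Cohen's kappa.
from collections import Counter

def percent_agreement(rater_a, rater_b):
    """Proportion of items on which the two raters gave the same score."""
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

def cohens_kappa(rater_a, rater_b):
    """Agreement corrected for the agreement expected by chance alone."""
    n = len(rater_a)
    observed = percent_agreement(rater_a, rater_b)
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement: the probability that both raters assign the same
    # category at random, given each rater's own distribution of scores.
    expected = sum((counts_a[c] / n) * (counts_b[c] / n)
                   for c in set(counts_a) | set(counts_b))
    return (observed - expected) / (1 - expected)

# Hypothetical band scores (1-5) assigned by two raters to ten essays.
rater_1 = [3, 4, 2, 5, 3, 3, 4, 2, 5, 4]
rater_2 = [3, 4, 3, 5, 3, 2, 4, 2, 5, 4]
print(percent_agreement(rater_1, rater_2))  # 0.8
print(cohens_kappa(rater_1, rater_2))       # ~0.73
```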
A VALID TEST . . .
  • measures exactly what it proposes to measure
  • does not measure irrelevant or “contaminating” variables
  • relies on empirical evidence (performance)
  • involves performance that samples the test’s criterion (objective)
  • offers useful, meaningful information about the test-taker’s ability
  • is supported by a theoretical rationale or argument

(Brown & Abeywickrama, 2010, p. 30)

Validity (cont’d)
  • Content-Related Evidence
  • Criterion-Related Evidence
  • Construct-Related Evidence
  • Consequential Validity (Impact)
  • Face Validity
AN AUTHENTIC TEST . . .
  • uses language that is as natural as possible
  • contains items that are contextualized rather than isolated
  • includes meaningful, relevant, interesting topics
  • provides thematic organization
  • offers real-world tasks

(Brown & Abeywickrama, 2010, p. 37)

A TEST THAT PROVIDES BENEFICIAL WASHBACK . . .
  • positively influences teachers’ teaching
  • positively influences learners’ learning
  • gives learners a chance to adequately prepare
  • provides feedback that enhances language development
  • is more formative than summative
  • provides conditions for peak performance

(Brown & Abeywickrama, 2010, p. 38)

Applying Principles to Creation of Assessment Tools
  • Are the test procedures practical?
  • Is the test reliable?
  • Can rater reliability be ensured?
  • Does the test demonstrate content validity?
  • Has the impact of the test been accounted for?
  • Is the procedure “biased for best”?
  • Are the test tasks authentic?
  • Does the test offer beneficial washback?
  • See Brown & Abeywickrama, Chapter 2
Questions or Comments?

Shawna Williams

swilliams@listn.info