
RELTA in Practice


Presentation Transcript


  1. RELTA in Practice

  2. Overview
  • Test administration
  • Scoring processes
    - section scores and weighting
    - rating processes
  • Generation of results
  • Test maintenance: reliability
  • Compromises: theory vs practice

  3. Recap: Structure (Speaking, Listening)

  4. Recap: Delivery

  5. Administration requirements
  • On-site delivery
  • Separation of delivery and rating processes
  • Quality control: score reliability, training, data collation and storage
  • Minimise workload and human error
  • Maximise test security

  6. Administration: Roles
  • Administrators - test centre
  • Examiners = interlocutors - test centre
  • Raters x2 (or 3) – anywhere (local or RMIT)
  • RMIT Operations

  7. Roles by delivery mode (local administration, local examining, RMIT delivery)
  • Test centre roles: Administration, RELTA Examining
  • RMIT roles: RELTA Rating, Issuing of results

  8. Administration workflow
  Administration → Test versions dispatched → RELTA administered → Speech files uploaded to server → Speech file dispatched for independent rating → Speech files rated (x2 raters) → Rater 1 and Rater 2 scores combined and checked → Results issued to: Test Centre, Regulator, Candidates

  9. Administration: Processes
  • Register candidates
  • Administer RELTA Listening - in groups
  • Listening tests marked - database
  • Examiner delivers RELTA Speaking
  • Speaking performance recorded
  • Uploaded to RMIT server
  • Raters (1 and 2) log in to database and enter scores
  • Rater scores compared for reliability - 3rd rater used if scores are discrepant
  • Results calculated
  • Certificates generated
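
Purely to make the sequencing explicit, here is a minimal sketch that treats the process list above as an ordered pipeline and returns a candidate's next outstanding step. The step and function names are hypothetical, not taken from RELTA documentation.

```python
from typing import Optional

# Ordered administration steps, following the process list above.
PIPELINE = (
    "register_candidates",
    "administer_listening",      # delivered in groups
    "mark_listening",            # marked into the database
    "deliver_speaking",          # examiner-led, performance recorded
    "upload_speech_file",        # uploaded to the RMIT server
    "rate_speaking_r1_r2",       # Raters 1 and 2 log in and enter scores independently
    "check_rater_reliability",   # scores compared; 3rd rater on discrepancy
    "calculate_results",
    "generate_certificates",
)

def next_step(completed: set) -> Optional[str]:
    """Return the first outstanding step for a candidate, or None when finished."""
    for step in PIPELINE:
        if step not in completed:
            return step
    return None

# A candidate who has been registered and has sat the Listening test:
print(next_step({"register_candidates", "administer_listening"}))  # mark_listening
```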

  10. Administration: SRMS
  Scoring Record Management System (SRMS), developed to manage administration:
  • Distributes candidate speaking files to raters
  • Captures and stores rater scores
  • Allows for remote and independent rating
  • Identifies rater discrepancies
  • Generates results
  • Produces certificates
  • Stores candidate details and speaking files centrally
  Online access for:
  • Administrators (registering of candidates, accessing results)
  • Raters (entering scores for RELTA Speaking)
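
The actual SRMS schema is not shown in the slides, so the following is only a rough sketch of how a central scoring record supporting independent rating by two (or three) raters might be modelled. The class, field and key names are assumptions for illustration, not the SRMS design.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class ScoringRecord:
    """One candidate's entry in a hypothetical central score store."""
    candidate_id: str
    speaking_file: str                        # uploaded speech file (path or URL)
    listening_score: Optional[int] = None     # marked separately into the database
    # Scores entered independently by each rater, keyed by rater id and then
    # by whatever the rating sheet uses (sections or ICAO criteria).
    rater_scores: Dict[str, Dict[str, int]] = field(default_factory=dict)

    def enter_scores(self, rater_id: str, scores: Dict[str, int]) -> None:
        """Capture one rater's scores; a rater can only submit once."""
        if rater_id in self.rater_scores:
            raise ValueError(f"{rater_id} has already entered scores")
        self.rater_scores[rater_id] = dict(scores)

    def ready_for_results(self) -> bool:
        """True once the Listening mark and at least two speaking ratings exist."""
        return self.listening_score is not None and len(self.rater_scores) >= 2

# Example: register a speaking file, then two raters score it independently.
rec = ScoringRecord("C001", "uploads/C001_speaking.mp3", listening_score=78)
rec.enter_scores("rater_1", {"section_1": 4, "section_2": 4, "section_3": 5})
rec.enter_scores("rater_2", {"section_1": 4, "section_2": 3, "section_3": 5})
print(rec.ready_for_results())   # True
```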

  11. Scoring

  12. Scoring: Speaking

  13. Scoring: Rater 1 interface
  • Download rating scripts
  • Listen to candidate sound file
  • Enter scores for Sections 1, 2 and 3

  14. Scoring: Rater 2 interface
  • Download rating scripts
  • Listen to candidate sound file
  • Enter scores for Sections 1, 2 and 3

  15. Admin interface
  • Imports scores from R1 and R2
  • Imports Listening score
  • Identifies rater discrepancies
  • Calculates overall scores
  • Determines ICAO Level
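
The slides name these operations but not the rules behind them, so the sketch below is illustrative only: it averages the two raters per ICAO criterion, flags a discrepancy where the raters differ by more than one band (an assumed threshold), and takes the overall level as the lowest criterion level, following the general ICAO principle that the final rating is the lowest of the six criterion ratings. None of the numeric details are taken from RELTA documentation.

```python
import math
from typing import Dict, Tuple

# The six ICAO rating criteria for speaking proficiency.
CRITERIA = ("pronunciation", "structure", "vocabulary",
            "fluency", "comprehension", "interactions")
DISCREPANCY_THRESHOLD = 1   # assumed: flag where raters differ by more than one band

def combine_scores(r1: Dict[str, int], r2: Dict[str, int]
                   ) -> Tuple[Dict[str, float], Dict[str, bool], int]:
    """Combine two raters' criterion scores and derive an overall ICAO level."""
    combined, flagged = {}, {}
    for c in CRITERIA:
        combined[c] = (r1[c] + r2[c]) / 2
        flagged[c] = abs(r1[c] - r2[c]) > DISCREPANCY_THRESHOLD
    # ICAO principle: the overall level is the lowest of the six criterion levels.
    overall = min(math.floor(v) for v in combined.values())
    return combined, flagged, overall

# Two hypothetical rater score sets for the same candidate.
r1 = dict(pronunciation=4, structure=4, vocabulary=5,
          fluency=4, comprehension=5, interactions=4)
r2 = dict(pronunciation=4, structure=3, vocabulary=5,
          fluency=4, comprehension=5, interactions=4)
combined, flagged, overall = combine_scores(r1, r2)
print(overall)   # 3 - the weakest criterion sets the level
print(flagged)   # no criterion exceeds the assumed discrepancy threshold
```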

  16. Data collation

  17. Rating discrepancies

  18. SRMS: Rating discrepancy

  19. Inter-rater reliability

  20. Inter-rater reliability
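
The slides only label the inter-rater reliability checks. As one hedged illustration of how agreement between Rater 1 and Rater 2 might be monitored over a batch of candidates, the sketch below computes exact and adjacent (within one band) agreement rates; these are generic agreement statistics, not necessarily the measures RELTA reports.

```python
from typing import List, Tuple

def agreement_rates(pairs: List[Tuple[int, int]]) -> Tuple[float, float]:
    """Exact and adjacent (within one band) agreement between two raters.

    `pairs` holds (rater_1_level, rater_2_level) for the same candidates.
    """
    if not pairs:
        raise ValueError("no paired ratings supplied")
    exact = sum(a == b for a, b in pairs) / len(pairs)
    adjacent = sum(abs(a - b) <= 1 for a, b in pairs) / len(pairs)
    return exact, adjacent

# Hypothetical paired overall levels for eight candidates.
pairs = [(4, 4), (5, 4), (3, 3), (4, 4), (6, 5), (4, 3), (5, 5), (4, 4)]
exact, adjacent = agreement_rates(pairs)
print(f"exact agreement:    {exact:.0%}")     # 62%
print(f"adjacent agreement: {adjacent:.0%}")  # 100%
```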

  21. Rater re-accreditation

  22. Rater accreditation

  23. Examiner accreditation

  24. Manuals

  25. Summary
  Testing for ICAO compliance requires test instruments that are well designed and can be implemented effectively.
