
Performance based analysis for profiling skills and competencies for progress files

Performance based analysis for profiling skills and competencies for progress files. Carol Collins, Research Fellow in Teaching and Learning. Introduction: an exploratory item analysis of student performance on computer-based assessment (CBA) tests.


Presentation Transcript


  1. Performance based analysis for profiling skills and competencies for progress files Carol Collins, Research Fellow in Teaching and Learning

  2. Introduction • Exploratory item analysis into student performance on computer-based assessment (CBA) tests • Do stronger or weaker students perform better or worse on different question types? • Multiple-choice, multiple-response, hot-spot and selection questions • Profiling of students’ performance on CBA tests for PDP portfolios

  3. Example of Multiple Response

  4. Example of Hot Spot

  5. Example of Selection Question

  6. Methodology • Three examinations: • Two Personal Development Planning (PDP) examinations • One scheme module examination • Cohort totalling 325 students • Top 20% taken and divided into two groups for each examination • Question analysis measuring: • Average score • Facility • Discrimination
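The facility and discrimination statistics listed above can be sketched in code. A minimal illustration, assuming 0/1 item scores and a group-comparison approach like the one described on this slide (the function names and sample data are hypothetical, not from the study):

```python
# Item-analysis sketch for one test item.
# Facility: the proportion of available marks achieved on the item.
# Discrimination: the difference in mean item score between a
# stronger group and a weaker group of students.

def facility(scores):
    """Mean score on an item across all students (0..1 for 0/1 items)."""
    return sum(scores) / len(scores)

def discrimination(strong_scores, weak_scores):
    """Difference in mean item score between the two groups."""
    return (sum(strong_scores) / len(strong_scores)
            - sum(weak_scores) / len(weak_scores))

# Hypothetical 0/1 scores on a single hot-spot item.
strong = [1, 1, 1, 0, 1]
weak = [0, 1, 0, 0, 1]

print(facility(strong + weak))       # overall facility of the item
print(discrimination(strong, weak))  # between-group score difference
```

An item with facility near 1.0 is easy for everyone; a large positive discrimination value means the item separates stronger from weaker students, which is the property compared across question types in the results that follow.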

  7. Results • The hot-spot questions discriminate the most (1.04 difference in score per question) • Multiple-choice is second (0.97) • Multiple response (0.59) • Selection (0.34)

  8. Results Cont: The following three slides show the facility scores for each question type

  9. CONCLUSION • Stronger students perform better on hot spots than weaker students • Multiple-choice questions are effective at discriminating between students • Question types that have variable ratios between correct and incorrect responses should be constructed with caution so that they are not too easy or too hard • Item analysis can inform practitioners on the best ways of improving performance • Students can use feedback on performance to help improve practice
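The caution above about question types with variable ratios of correct to incorrect responses can be illustrated with a common partial-credit scoring rule for multiple-response items (an assumed scheme for illustration, not necessarily the one used in these examinations): each correct option selected earns a fraction of the mark, each incorrect selection deducts the same fraction, and the score is floored at zero.

```python
# Sketch of a partial-credit rule for a multiple-response question.
# With few correct options among many distractors, each wrong
# selection is heavily penalised; with many correct options, guessing
# broadly is rewarded - hence the need for careful construction.

def score_multiple_response(selected, correct, max_mark=1.0):
    """Score one multiple-response question with symmetric penalties."""
    selected, correct = set(selected), set(correct)
    hits = len(selected & correct)          # correct options chosen
    false_alarms = len(selected - correct)  # incorrect options chosen
    per_option = max_mark / len(correct)    # credit per correct option
    return max(0.0, (hits - false_alarms) * per_option)

# Two correct options among four: selecting both correct options plus
# one distractor yields half credit under this rule.
print(score_multiple_response({"A", "C"}, {"A", "C"}))       # full credit
print(score_multiple_response({"A", "B", "C"}, {"A", "C"}))  # half credit
```

Changing the number of correct options changes how harshly the same mistake is punished, which is one way such items can end up unintentionally too easy or too hard.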

  10. RECOMMENDATIONS • Further investigation into the performance of stronger and weaker students on item types • Sharing and pooling of resources (test items) • Build calibrated item banks for testing PDP skills and competencies that are interdisciplinary • Practitioners to use data to inform practice • Students to use test transcripts for PDP • Students to use data from item analysis reports for reflective practice and evaluation for the PDP portfolio

  11. Contact details • Carol Collins • email: carol.collins@luton.ac.uk • Teaching and Learning • University of Luton
