
Assessment of Learning: Test Design and Administration Factors That Affect Student Performance


Presentation Transcript


  1. Assessment of Learning: Test Design and Administration Factors That Affect Student Performance Rita Czaja and Scottie Barty, Northern Kentucky University, Highland Heights, KY

  2. Research Motivation • New AACSB accreditation standards • Unexpected results on department’s self-assessment test

  3. Table 1: Test Results for Juniors and Seniors

  4. Interpreting Percentile Ranks • Percentile ranks may be better than percent correct for comparisons across test sections if some sections have more difficult questions • Performance is an imprecise measure of difficulty • It reflects both the design of the test questions and students’ abilities
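To make slide 4 concrete, the short Python sketch below converts raw percent-correct scores into within-section percentile ranks. The section names and score lists are invented for illustration only; they are not the authors’ data.

```python
# A minimal sketch of percent correct vs. percentile rank, using made-up
# scores for two test sections of unequal difficulty.
from scipy import stats

# Hypothetical percent-correct scores (not the study's data)
financial_scores  = [45, 52, 58, 60, 63, 70, 72, 75, 80, 88]   # harder questions
managerial_scores = [60, 65, 68, 72, 75, 78, 80, 84, 90, 95]   # easier questions

def percentile_rank(section_scores, score):
    """Standing of one score within its own section's distribution."""
    return stats.percentileofscore(section_scores, score, kind="mean")

# The same raw 72% is a strong result in the harder section but only a
# middling one in the easier section, which is why raw percent correct can
# mislead when compared across sections.
print(percentile_rank(financial_scores, 72))
print(percentile_rank(managerial_scores, 72))
```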

  5. Evaluation of Test Questions • Rule for effective multiple-choice questions • Scored the design of each question: • 1 = Simple (Straightforward computation; basic terms; single topic) • 2 = Moderate (One less familiar term; two-step logic process; some extraneous data) • 3 = Complex (Problem set-up not obvious; unusual situation; complex formula, definition, or rule; multiple topics; much extraneous data)

  6. Hypotheses H1: Seniors’ average percent correct is negatively related to the difficulty of a question’s design. H2: Seniors’ average percent correct is positively related to the extent of repetition. H3: Seniors’ average percent correct is negatively related to the difficulty of recalling a question’s content.

  7. Descriptive Statistics • Question-Design • Financial: 2.37 average • Managerial: 1.91 average • Repetition (number and percent of questions covered in principles and intermediate) • Financial: 15 (39%) • Managerial: 16 (70%) • Ease-of-Recall • Financial: 2.03 average • Managerial: 1.74 average
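The per-section averages and repetition counts on slide 7 could be reproduced from question-level ratings with a short pandas summary such as the one below. The column names and the six-question data frame are hypothetical placeholders, not the study’s coding sheet.

```python
# Sketch: per-section descriptives from question-level ratings (toy data).
import pandas as pd

questions = pd.DataFrame({
    "section":  ["Financial"] * 3 + ["Managerial"] * 3,
    "design":   [3, 2, 2, 2, 1, 2],   # 1 = simple, 2 = moderate, 3 = complex
    "repeated": [1, 0, 1, 1, 1, 0],   # 1 = also covered in principles/intermediate
    "recall":   [2, 2, 3, 1, 2, 2],   # ease-of-recall rating
})

summary = questions.groupby("section").agg(
    design_avg=("design", "mean"),
    repeated_n=("repeated", "sum"),
    repeated_pct=("repeated", "mean"),   # share of questions repeated
    recall_avg=("recall", "mean"),
)
print(summary)
```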

  8. Table 3: Regression Results Seniors’ Percent Correct = a + b1 Design + b2 Repetition + b3 Ease-of-Recall Residual + b4 Test Section
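One plausible way to estimate the Table 3 model is sketched below with statsmodels. The data frame and column names (pct_correct, design, repetition, recall, section) are assumptions, and the ease-of-recall residual is constructed here by regressing the recall rating on design, which is one reading of the slide rather than the authors’ documented procedure.

```python
# Sketch of the slide's regression, under the assumptions stated above.
import pandas as pd
import statsmodels.formula.api as smf

def fit_table3(df: pd.DataFrame):
    # Assumed construction: orthogonalize ease-of-recall against design and
    # use the residual, so the two rated characteristics do not overlap.
    first_stage = smf.ols("recall ~ design", data=df).fit()
    df = df.assign(recall_resid=first_stage.resid)

    # Seniors' percent correct = a + b1*Design + b2*Repetition
    #                            + b3*Ease-of-recall residual + b4*Test section
    return smf.ols(
        "pct_correct ~ design + repetition + recall_resid + C(section)",
        data=df,
    ).fit()

# Usage (with a question-level data frame in hand):
#   results = fit_table3(questions_with_scores)
#   print(results.summary())
```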

  9. Lessons Learned • Be very clear about what you want to measure. This decision drives • the timing of testing • the nature of questions used • the level of performance that can reasonably be expected

  10. Table 1: Test Results for Juniors and Seniors

  11. Table 2: Panel A – Distribution Statistics

  12. Table 2: Panel B – Correlations
