
Reliability


Presentation Transcript


  1. Reliability • Consistent • Dependable • Replicable • Stable

  2. Two Administrations • Test/retest: the same test given to the same group twice • Parallel/alternate forms: different versions of the test given to the same group

  3. Two-administration procedure: • Administer the tests (in two sessions) • Convert to z scores (if necessary) • Correlate (Pearson or Spearman); see the sketch below
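
A minimal sketch of the two-administration procedure, assuming hypothetical score arrays `session1` and `session2` for the same group of examinees; Pearson fits interval-level scores, Spearman fits ordinal/ranked scores:

```python
import numpy as np
from scipy import stats

# Hypothetical scores for the same ten examinees in two sessions.
session1 = np.array([12, 15, 11, 18, 14, 16, 10, 17, 13, 15])
session2 = np.array([13, 14, 12, 17, 15, 16, 11, 18, 12, 14])

# Step 2: convert to z scores (optional; Pearson r is unchanged by it).
z1 = stats.zscore(session1)
z2 = stats.zscore(session2)

# Step 3: correlate -- Pearson for interval scores, Spearman for ranks.
r_pearson, _ = stats.pearsonr(z1, z2)
rho, _ = stats.spearmanr(session1, session2)
print(f"Pearson r = {r_pearson:.3f}, Spearman rho = {rho:.3f}")
```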

  4. Two-administration issues: • What problems can arise? • How long should the interval between administrations be? • What type of variable is being measured?

  5. One Administration • One test • One group • One administration

  6. One-administration procedure: • Administer the test to one group • Divide the items into two halves for scoring (split-half) • first/second or odd/even halves? • Correlate the scores from the two halves • Apply the Spearman-Brown formula to estimate reliability at full test length (sketch below)
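
A sketch of the split-half procedure with an odd/even split, using a hypothetical item-score matrix; the Spearman-Brown prophecy formula, r_full = 2·r_half / (1 + r_half), projects the half-test correlation onto the full-length test:

```python
import numpy as np
from scipy import stats

# Hypothetical item-score matrix: rows = examinees, columns = items.
rng = np.random.default_rng(0)
items = rng.integers(0, 2, size=(30, 20))  # 30 examinees, 20 dichotomous items

# Odd/even split: columns 0, 2, 4, ... versus columns 1, 3, 5, ...
odd_half = items[:, 0::2].sum(axis=1)
even_half = items[:, 1::2].sum(axis=1)

# Correlate the two half-test scores.
r_half, _ = stats.pearsonr(odd_half, even_half)

# Spearman-Brown prophecy formula for doubling the test length:
#   r_full = (2 * r_half) / (1 + r_half)
r_full = (2 * r_half) / (1 + r_half)
print(f"half-test r = {r_half:.3f}, Spearman-Brown corrected = {r_full:.3f}")
```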

  7. Uses of one administration: • Estimating the internal consistency of items • What types of tests is this appropriate for?

  8. Inter-item consistency • Statistical estimation • Kuder-Richardson Formula 20 (KR-20): dichotomous (right/wrong) items • Cronbach's alpha (alpha coefficient): items of any scoring format • (factor analysis); see the sketch below
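
A sketch of both estimates on a hypothetical dichotomous item matrix. KR-20 is the special case of Cronbach's alpha for 0/1 items, so with population (ddof=0) variances throughout, the two values coincide here:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total variance)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=0)      # population variances
    total_var = items.sum(axis=1).var(ddof=0)  # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def kr20(items: np.ndarray) -> float:
    """KR-20 for dichotomous (0/1) items: k/(k-1) * (1 - sum(p*q) / total variance)."""
    k = items.shape[1]
    p = items.mean(axis=0)                     # proportion answering each item correctly
    total_var = items.sum(axis=1).var(ddof=0)
    return (k / (k - 1)) * (1 - (p * (1 - p)).sum() / total_var)

# Hypothetical dichotomous item matrix: rows = examinees, columns = items.
rng = np.random.default_rng(1)
items = rng.integers(0, 2, size=(50, 10))
print(f"alpha = {cronbach_alpha(items):.3f}, KR-20 = {kr20(items):.3f}")
```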

  9. Reliability coefficient • ranges from 0.00 to 1.00 • higher is better • the value is relative, not absolute

  10. Inter-rater reliability • Consensus between raters • Percentage of agreement • Kappa statistic (Cohen's kappa for 2 raters; Fleiss' kappa for more); both indices are sketched below
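
A sketch of both indices for the two-rater case, using hypothetical category labels; Cohen's kappa corrects percent agreement for the agreement expected by chance (`sklearn.metrics.cohen_kappa_score` handles two raters):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Hypothetical categorical ratings of 12 subjects by two raters.
rater1 = np.array(["pass", "pass", "fail", "pass", "fail", "pass",
                   "pass", "fail", "pass", "pass", "fail", "pass"])
rater2 = np.array(["pass", "fail", "fail", "pass", "fail", "pass",
                   "pass", "pass", "pass", "pass", "fail", "pass"])

# Percentage of agreement: the share of cases where the raters match.
percent_agreement = np.mean(rater1 == rater2)

# Cohen's kappa: agreement corrected for chance (two raters).
kappa = cohen_kappa_score(rater1, rater2)
print(f"agreement = {percent_agreement:.2%}, kappa = {kappa:.3f}")
```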

  11. Threats to reliability • Construction • Administration (tester, testee, environment) • Scoring • Interpretation

  12. Project homework • Which approaches were used to determine the reliability of your measure? Why were those approaches selected? Could other approaches have been used? Why or why not?
