
Comprehensive Assessment System


Presentation Transcript


  1. Comprehensive Assessment System Webinar #6 December 14, 2011

  2. Session Topic: Validity & Reliability

  3. Session Objectives The purpose of this session is to: • Define validity and reliability • Distinguish between valid and invalid inferences • Understand how to apply knowledge of validity and reliability to the selection and development of assessments

  4. Defining Validity Validity refers to the accuracy of inferences drawn from an assessment. It is the degree to which the assessment measures what it is intended to measure.

  5. Types of Validity Construct validity: the assessment actually measures what it is designed to measure. A actually is A

  6. Types of Validity Concurrent validity: the assessment correlates with other assessments that measure the same construct. A correlates with B

  7. Types of Validity Predictive validity: the assessment predicts performance on a future assessment. A predicts B
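Concurrent and predictive validity are both typically estimated as a correlation between two sets of scores. A minimal sketch, using invented student scores (all names and numbers here are hypothetical, not from the webinar):

```python
# Hypothetical sketch: estimating concurrent validity as the Pearson
# correlation between scores on a new assessment (A) and an established
# assessment of the same construct (B). All data are invented.

def pearson_r(xs, ys):
    """Pearson correlation coefficient for two equal-length score lists."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Invented scores for ten students on assessments A and B
a_scores = [72, 85, 90, 60, 78, 88, 95, 55, 70, 82]
b_scores = [70, 80, 92, 65, 75, 85, 97, 58, 68, 84]

r = pearson_r(a_scores, b_scores)
print(round(r, 2))  # a value near 1.0 suggests the two assessments agree
```

A high positive r between A and a concurrent measure B supports concurrent validity; the same computation against a later outcome measure supports predictive validity.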

  8. Valid Inferences Validity is closely tied to the purpose or use of an assessment. DON’T ASK: “Is this assessment valid?” ASK: “Are the inferences I’m making based on this assessment valid for my purpose?”

  9. Evidence-Centered Design • Validity is about providing strong evidence • Evidence-centered design boosts validity • What do you want to know? • How would you know? • What should the assessment look like?

  10. Defining Reliability • Reliability refers to consistency and repeatability. • A reliable assessment provides a consistent picture of what students know, understand, and are able to do.

  11. Remember! An assessment that is highly reliable is not necessarily valid. However, for an assessment to be valid, it must also be reliable.

  12. Purchasing & Developing Assessments

  13. Considerations • Using what you have • Is it carefully aligned to your purpose? • Purchasing a new assessment • Is it carefully matched to your purpose? • Do you have the funds (for assessment, equipment, training)? • Developing a new assessment • Do you have the in-house content knowledge? • Do you have the in-house assessment knowledge? • Does your team have time for development? • Does your team have the knowledge and time needed for proper scoring?

  14. Improving Validity & Reliability • Ensure questions are based on taught curricula • Ensure questions are based on standards • Allow students to demonstrate knowledge/skills in multiple ways • Ensure a variety of item types (multiple-choice, constructed response) • Ask questions at varying Depth of Knowledge levels • Ensure accurate test administration • Include items that address the full range of standards • Include multiple items that assess the same standard • Review scorer reliability, when necessary
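The last bullet, reviewing scorer reliability, is often done by double-scoring a sample of student work and checking how often two raters agree. A minimal sketch with invented rubric scores (the 1-4 scale and all values are hypothetical, not from the webinar):

```python
# Hypothetical sketch: a simple scorer-reliability check for double-scored
# work. Computes exact-agreement and within-one-point ("adjacent")
# agreement rates for two raters' rubric scores. Data are invented.

def agreement_rates(rater1, rater2):
    """Return (exact, adjacent) agreement proportions for paired scores."""
    pairs = list(zip(rater1, rater2))
    exact = sum(a == b for a, b in pairs) / len(pairs)
    adjacent = sum(abs(a - b) <= 1 for a, b in pairs) / len(pairs)
    return exact, adjacent

# Invented rubric scores (1-4 scale) for ten double-scored compositions
teacher = [4, 3, 2, 4, 1, 3, 3, 2, 4, 2]
second  = [4, 3, 3, 4, 1, 2, 3, 2, 4, 3]

exact, adjacent = agreement_rates(teacher, second)
print(exact, adjacent)  # prints 0.7 1.0
```

Low agreement is a signal to revisit the rubric or retrain scorers before trusting the scores for inferences.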

  15. V&R: Student Learning Objectives What makes high-quality evidence for SLOs: • Aligned to the content standards (construct validity) • Used for the purpose for which it was designed • Administered properly

  16. Ex. SLO Objective: Students will demonstrate grade-level proficiency in reading, writing, and speaking French, including the accurate use of past and present tenses. How would you know if students were proficient in reading, writing, and speaking French?

  17. Ex. SLO Evidence: 1. Written final exam measuring reading comprehension, vocabulary, and conjugation/agreement in past and present tenses. 2. 300-word written composition in French, using past and present tense in a familiar content theme. 3. 5-minute conversation on one of 3 pre-selected topics, using past and present tense.

  18. Ex. SLO Administration & Scoring The exam and composition will be part of the written final, administered during the final exam period. • I will score the compositions using the Foreign Language Department level 2 writing rubric, which includes vocabulary, tense, subject-verb agreement, spelling, level of detail, etc. • Approximately 20% of the compositions will also be double-scored by the other French teacher. The oral assessment will be administered one-on-one in the last week of school, prior to the exam period. I will develop the rubric with the other French teacher and have it approved by the Department Chair. I will administer and score most oral exams myself, though I will schedule my Department Chair to sit in on and double-score the first 20%.

  19. Questions?

  20. Upcoming Webinars January 11th, 9:30-10:30: Cultural & Linguistic Demands of Assessment
