Statistics 101: Large-Scale Assessment


Presentation Transcript


1. Standards-Based Assessments: Understanding the Data
   David Chayer, Vice President of Psychometrics

2. Agenda
   • Goals and Objectives
   • Opportunities for Questions
   • My Background
   • Walk-through of the Reports

3. Goals and Objectives
   • Introduce Idaho to Data Recognition Corporation
   • Present the format and content of the DRC-generated score reports
   • Increase public awareness of the technical issues in measurement

4. Opportunities for Questions
   • Walkthrough of the presentation
   • Audio will be on mute except for the presenter
   • Questions may be submitted via the chat room at any time
   • Start your question with your district name
   • If necessary, your district can ask direct questions during the Question and Answer period
   • This presentation will be web-posted

5. My Background
   • Ten years in norm-referenced testing
   • Six years in licensure/certification
   • Computer-based and computer-adaptive testing
   • Eight years in large-scale assessment at Data Recognition Corporation

6. A quick tour of the Idaho score reports
   • Individual student reports
   • School rosters
   • District rosters

7. Questions from the previous WebEx Training Session
   • These reports do not contain:
     • Percentile rank scores
     • Lexile scores
   • The fall/winter assessment will contain:
     • Growth scores (current targets will be valid for this year)
     • Scale scores by standard
   • DRC will be conducting a focus group where stakeholders in the state of Idaho can provide input on the future direction of the fall/winter assessment

8. Individual Student Reports

13. Student Roster

15. District Proficiency Level Summary

16. Using computer simulations to explain error bands
   • Student level
   • School level

17. Using computer simulations to explain error bands
   • Student level (not an ISAT example!)
   • E.g., Proficient cutpoint at a raw score of 44
     • Total test: maximum of 70 points
     • Standard 1: maximum of 10 points
   • What is the expected range of scores for a raw score of 44? 34? 54?

18. Using computer simulations to explain error bands
   • A raw score of 44
   • Standard 1: 6 out of a possible 10

19. Using computer simulations to explain error bands
   • A raw score of 34
   • Standard 1: 4 out of a possible 10

20. Using computer simulations to explain error bands
   • A raw score of 54
   • Standard 1: 7 out of a possible 10
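A minimal simulation sketch of the idea behind slides 17-20, assuming a simple binomial error model in which a student's observed score varies around a true proportion-correct (the slides do not specify the actual simulation model; the item counts are the hypothetical ones from slide 17):

```python
import random

random.seed(0)

TOTAL_ITEMS = 70      # total-test maximum (slide 17)
STANDARD1_ITEMS = 10  # Standard 1 maximum (slide 17)
REPLICATIONS = 10_000

def score_band(raw_score, n_items, coverage=0.90):
    """Simulate repeated testings of a student whose true proportion
    correct is raw_score / TOTAL_ITEMS, and return the central
    `coverage` band of simulated scores on an n_items section."""
    p = raw_score / TOTAL_ITEMS
    sims = sorted(sum(random.random() < p for _ in range(n_items))
                  for _ in range(REPLICATIONS))
    lo = sims[int(REPLICATIONS * (1 - coverage) / 2)]
    hi = sims[int(REPLICATIONS * (1 + coverage) / 2) - 1]
    return lo, hi

for raw in (34, 44, 54):
    print(f"raw {raw}: total-test band {score_band(raw, TOTAL_ITEMS)}, "
          f"Standard 1 band {score_band(raw, STANDARD1_ITEMS)}")
```

The point of the exercise is that a single observed raw score is consistent with a fairly wide band of true performance, especially on a short 10-point standard.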

21. Using Computer Simulations (cont.)
   • Compare/contrast error bands associated with students and with groups of students
   • Classroom, school, district, and state
   • One example with 10 students
   • A second with 100 students

22. Using Computer Simulations (cont.)
   • 10 students
   • Total test: 44 to 47
   • Standard 1: 6 to 7

23. Using Computer Simulations (cont.)
   • 100 students
   • Total test: 44 to 45
   • Standard 1: 7 to 7
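Extending the same hypothetical binomial sketch to groups, as slides 21-23 do: the band for a group's average score narrows roughly with the square root of the number of students, which is why the 100-student band is tighter than the 10-student band.

```python
import random

random.seed(0)

TOTAL_ITEMS = 70
TRUE_P = 44 / 70     # every student placed at the cutpoint, for simplicity
REPLICATIONS = 2_000

def group_mean_band(n_students, coverage=0.90):
    """Simulate the average total-test score of a group of n_students,
    each answering TOTAL_ITEMS items with success probability TRUE_P,
    and return the central `coverage` band of the group means."""
    means = sorted(
        sum(sum(random.random() < TRUE_P for _ in range(TOTAL_ITEMS))
            for _ in range(n_students)) / n_students
        for _ in range(REPLICATIONS)
    )
    lo = means[int(REPLICATIONS * (1 - coverage) / 2)]
    hi = means[int(REPLICATIONS * (1 + coverage) / 2) - 1]
    return lo, hi

for n in (10, 100):
    lo, hi = group_mean_band(n)
    print(f"{n} students: mean-score band {lo:.1f} to {hi:.1f}")
```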

24. Score Reports
   • Individual Student Report
   • Class Level Reports
     • Class Student Roster
   • School Level Reports
     • School Student Roster
     • School Proficiency Level Summary (by subgroups)
   • District Level Reports
     • District Proficiency Level Summary (by subgroups)
     • District Report of School Proficiency
     • School Graduation Test Summary (by subgroups)

25. Delivery of Score Reports
   • Score reports will be available June 20, 2007.
   • District Test Coordinators will receive an email with information on how to access the score reports.
   • Reports are PDFs; no paper versions will be distributed.
   • Districts will receive ISRs (Individual Student Reports) for home-schooled students; these students are not included in any summary reports.

26. Questions
   • Please use the chat box on the right side of your computer screen to ask a question.
   • We can unmute your phone line if we need further clarification on your question.

28. Appendix: Psychometrics in Brief

29. How “number correct” scores translate into scale scores
   • Item calibration: converting p-values to Rasch difficulties
   • Equating: how gains/losses are calculated across years
   • Scaling: how the final reporting scale is established

30. Item calibration: converting p-values to Rasch difficulties
   • Raw score differences are not created equal
   • Rasch item difficulties are created equal (they fall on an equal-interval logit scale)
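As a rough illustration of the calibration step, a classical approximation (the PROX-style log-odds transform; not necessarily DRC's operational procedure) converts each item's p-value into a provisional logit difficulty and centers the set at zero:

```python
import math

def pvalue_to_logit(p):
    """Provisional Rasch difficulty in logits from a classical p-value
    (proportion correct): harder items (lower p) get higher logits."""
    return math.log((1 - p) / p)

# Hypothetical p-values for five items -- illustrative only, not ISAT data.
p_values = [0.85, 0.70, 0.55, 0.40, 0.25]

difficulties = [pvalue_to_logit(p) for p in p_values]

# Center so the mean item difficulty is 0, the usual Rasch convention.
mean_d = sum(difficulties) / len(difficulties)
for p, d in zip(p_values, difficulties):
    print(f"p = {p:.2f} -> difficulty {d - mean_d:+.2f} logits")
```

Equal differences in logits mean the same difficulty gap anywhere on the scale, which is what makes the Rasch metric better suited to comparisons than raw-score differences.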

31. Equating: how we establish growth or decline in state performance across years
   • P-values vs. Rasch item difficulties

32. P-values: Across years for common items
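A minimal sketch of common-item equating under the Rasch model, using a simple mean shift (the slides do not state DRC's exact equating procedure, and the difficulty values below are invented): the average change in the common items' difficulties between years estimates the shift needed to place year-2 results on the year-1 scale.

```python
# Hypothetical logit difficulties for the same common items calibrated
# separately in two years (invented values, not ISAT data).
year1 = [-1.2, -0.4, 0.1, 0.6, 1.3]
year2 = [-1.0, -0.2, 0.3, 0.8, 1.5]

# Mean-shift equating constant between the two scales.
shift = sum(b2 - b1 for b1, b2 in zip(year1, year2)) / len(year1)

# Place a year-2 ability estimate (theta) on the year-1 scale.
theta_year2 = 0.90
theta_on_year1 = theta_year2 - shift
print(f"equating shift: {shift:+.2f} logits")
print(f"year-2 theta {theta_year2:+.2f} -> {theta_on_year1:+.2f} on the year-1 scale")
```

Because the shift is estimated from items common to both years, any remaining difference in student performance can be read as real growth or decline rather than a change in test difficulty.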

33. Scaling: how the final reporting scale is established
   • Using z-scores to convert to a more easily interpreted score metric
     • Set the average at 0 in the baseline year
     • Set the standard deviation at 1 in the baseline year
   • Convert to any linear metric using a slope and intercept:
     (SCORE_x - MEAN_x) / SD_x = (SCORE_y - MEAN_y) / SD_y
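A small worked version of that identity, with invented scale constants (the slides do not give the actual ISAT reporting-scale parameters): standardize against the baseline-year mean and standard deviation, then apply a slope and intercept to reach the reporting metric.

```python
def to_scale_score(score, baseline_mean, baseline_sd, slope, intercept):
    """z-score against the baseline year (mean 0, SD 1), then map onto
    the reporting metric via a linear transformation."""
    z = (score - baseline_mean) / baseline_sd
    return slope * z + intercept

# Hypothetical constants: baseline raw-score mean/SD, and a reporting
# scale chosen to have mean 200 and SD 25 in the baseline year.
BASELINE_MEAN, BASELINE_SD = 44.0, 8.0
SLOPE, INTERCEPT = 25.0, 200.0

for raw in (34, 44, 54):
    s = to_scale_score(raw, BASELINE_MEAN, BASELINE_SD, SLOPE, INTERCEPT)
    print(f"raw {raw} -> scale score {s:.1f}")
```

Because the transformation is linear, the z-score identity on slide 33 holds: a score at the same relative position in either metric maps to the same reported scale score.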
