
South Carolina Alternate Assessment (SC-Alt) Advisory Committee


Presentation Transcript


  1. South Carolina Alternate Assessment (SC-Alt) Advisory Committee, September 28, 2011

  2. South Carolina Alternate Assessment (SC-Alt) Overview

  3. Advisory Committee Role and Purpose

  4. American Institutes for Research (AIR), SC-Alt Contractor

  5. AIR Staff • DeeAnn Wagner, Project Director • Jennifer Chou, Project Manager • Lynnett Wright, Alternate Assessment Specialist • Emma Hannon, Research Assistant

  6. 2011 Administration

  7. Discussion of New Procedures • Print Manipulatives • Packaging of Manipulatives and Test Booklets • Answer Folders and Security Affidavits

  8. Biology Operational Administration 2011

  9. Scoring Fidelity

  10. Videotaping of SC-Alt Administrations • Implemented to monitor test administration effectiveness and scoring consistency • Annual implementation allows monitoring of consistency across testing years as adjustments are made in training, as new tasks/items replace previous tasks, and as new content areas are added (e.g., high school biology).

  11. Videotaping Sampling Procedures for 2011 One student per sampled teacher was videotaped: • ELA only for elementary and middle school forms • ELA and biology for high school forms

  12. Videotaping Sampling Procedures • All districts were sampled. • Sampling implemented by teacher and student. • Teachers sampled according to proportions of students in their district. • Approximately 1/3 of teachers and 10% of students were sampled.
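The allocation on this slide ("teachers sampled according to proportions of students in their district") is not specified further; a minimal Python sketch of one way such a proportional sample could be drawn is below. The district names, function name, and rounding rule are illustrative assumptions, not the actual AIR procedure.

```python
import random

# Hypothetical sketch: allocate a statewide teacher sample to districts in
# proportion to each district's share of SC-Alt students, then pick teachers
# at random within each district. The rounding rule is an assumption.
def sample_teachers(district_teachers, district_students, total_sample):
    total_students = sum(district_students.values())
    sampled = {}
    for district, teachers in district_teachers.items():
        share = district_students[district] / total_students
        k = min(len(teachers), round(total_sample * share))
        sampled[district] = random.sample(teachers, k)
    return sampled

# Example: district A holds 60% of the students, so it receives 3 of the
# 5 sampled teacher slots.
picked = sample_teachers(
    {"A": ["t1", "t2", "t3", "t4", "t5", "t6"], "B": ["u1", "u2", "u3"]},
    {"A": 60, "B": 40},
    total_sample=5,
)
print({d: len(ts) for d, ts in picked.items()})  # {'A': 3, 'B': 2}
```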

  13. Review and Analysis of Videotaped Administrations • All recordings were reviewed by trained AIR raters for: • Fidelity of administration • Accuracy of scoring • Teacher score is used for reporting purposes • A 10% sample was reviewed by the AIR alternate assessment specialist

  14. Videotaping Results • Previous results have indicated consistently high rates of scoring agreement at all three form levels (elementary, middle, and high school). • For 2011, the average item agreement statistics for the ELA videotaped samples were: Elementary form – 96.0%, Middle School form – 94.9%, and High School form – 95.9%. The item agreement statistic for High School biology was 94.3%. • These results are consistent with the scoring consistency results for previous years and confirm a high level of scoring consistency for the new High School biology assessment.
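The slides report "item agreement statistics" without defining them; a plausible reading is simple percent exact agreement between the teacher's item scores and the reviewer's item scores. A minimal Python sketch under that assumption (not the actual AIR computation):

```python
# Assumed definition: percent of items on which the two raters assigned the
# same score. This is a plausible reconstruction of the statistic, since the
# slides do not spell out the formula.
def item_agreement(teacher_scores, rater_scores):
    if len(teacher_scores) != len(rater_scores):
        raise ValueError("score lists must cover the same items")
    matches = sum(t == r for t, r in zip(teacher_scores, rater_scores))
    return 100.0 * matches / len(teacher_scores)

# Two raters agreeing on 4 of 5 items gives 80% agreement.
print(item_agreement([2, 1, 0, 3, 2], [2, 1, 1, 3, 2]))  # 80.0
```

Averaging this statistic over the sampled administrations would yield form-level figures like the 96.0% reported for the Elementary form.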

  15. Second Rater Pilot Study: 2011 Administration

  16. Second Rater Pilot • A second rater procedure may also be used to obtain data on scorer fidelity. • A pilot of the second rater procedure was conducted for the 2011 administration. • Participation in the pilot was voluntary. • We are seeking feedback and suggestions from you today as we review the outcomes of the pilot study.

  17. Second Rater Pilot Procedures • The DTC-Alt volunteered district participation. • The second rater pilot was limited to elementary ELA administrations. • The DTC-Alt was allowed to select a teacher (and the specified student) identified for videotaping to implement the pilot second rater session. • The second rater procedure was used in lieu of videotaping for that teacher. • The second rater could also serve as the test administration monitor.

  18. Second Rater Pilot Procedures • The participating district was only required to implement the procedure with one teacher. • The second rater scored the student’s responses on a separate answer document marked Second Rater and submitted it to AIR separately. • Second rater pilot participants (teachers, second raters, and DTCs-Alt) were asked to complete a brief questionnaire.

  19. Second Rater Qualifications • Must meet the test administrator criteria: • certified teacher • administrator (e.g., school administrator, district level special education consultant, or other administrator) • related services personnel • Must participate in test administration training.

  20. Second Rater Pilot Study Outcomes

  21. Second Rater Scoring Consistency Results • The second rater observer scores were compared to the teacher scores to calculate scoring agreement in the same manner as was used for the videotape data. • Since both the second rater and videotaping procedures were used for samples of Elementary ELA administrations, the results of the two methods could be compared. • Analyzable data were obtained for 48 second rater administrations and 70 videotaped administrations.

  22. Second Rater Scoring Consistency Results • The average item agreement statistics were 96.9% for the second rater data and 96.0% for the videotape data. • The comparable results for these two procedures support the effectiveness of the second rater procedure.

  23. Second Rater Pilot Questionnaire Results • The questionnaires completed by the pilot participants were used to obtain information on the experience of the teachers and observers, the staff positions of the observers, and the recommendations and preferences in regard to the second rater procedure from the three groups of respondents (teachers, observers, and DTCs-Alt). • Survey responses were obtained for 41 teachers, 45 observers, and 8 DTCs-Alt. • Since the number of districts participating in the pilot was 25, the participation rate for DTCs-Alt was low.

  24. Second Rater Pilot Questionnaire Results • The survey respondents reported a very high level of preference for using the second rater procedure over the videotaping procedure. • 93% of the teachers and 87% of the observers responded that they preferred the second rater procedure. • 5% and 9%, respectively, indicated no preference, with only 2% and 4% indicating a preference for videotaping. • These results did not differ by teacher or observer experience or by observer staff position.

  25. Second Rater Pilot Questionnaire Results • 75% of the DTCs-Alt reported a preference for the second rater procedure over the videotaping procedure (6 of the 8 respondents). • 25% (2 DTCs-Alt) indicated a preference for videotaping.

  26. Questionnaire Results: Problems Encountered • Students using eye-gaze responses were difficult to observe and rate. • A few districts reported some planning issues, e.g., determining who should be the second rater. • Because this was a pilot, all materials were sent to the DTC-Alt.

  27. Questionnaire Results: Reported Benefits • The second rater was able to observe some administration problems related to teacher preparation. • Teachers reported the procedure to be less stressful than videotaping. • Teachers reported the procedure was less distracting to students than videotaping.

  28. Questionnaire Results: Suggestions to Improve the Process • Provide a test booklet to the second rater. • Identify second rater teachers prior to TA training. • Include documentation of mode of response on the second rater answer folder. • Other: second rater materials, packaging, and return procedures

  29. 2011 Student Participation and Performance

  30. Alternate Assessment Participation 2006 – 2011* *PACT-Alt/HSAP-Alt 2006; SC-Alt 2007-2011

  31. Changes in Rates of Participation • The overall number of students increased by 233, a 7.9% increase; last year’s increase was 6.6%. • Autism students increased by 14.0%, compared to an 18.3% increase last year. • Mild MD students increased by 17.0%, compared to a 7.4% increase last year. • The percentages of increase from 2007 to 2011 have been 72% for Autism and 55% for Mild MD, compared to an increase since 2007 of 7% for all other students.
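The figures above follow from ordinary percent-change arithmetic. For instance, a gain of 233 students is a 7.9% increase only on a base of roughly 2,950 students; that base is back-calculated here for illustration, as the slide gives only the gain and the percentage.

```python
# Plain percent-change arithmetic behind the participation figures.
def pct_increase(previous, current):
    return round(100.0 * (current - previous) / previous, 1)

# Hypothetical totals consistent with the slide: +233 students on a base
# of 2,949 is a 7.9% increase.
print(pct_increase(2949, 2949 + 233))  # 7.9
```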

  32. SC-Alt 2011 Participation by Primary Disability

  33. ELA Performance for All Students, 2007 - 2011

  34. Math Performance for All Students, 2007 - 2011

  35. Science Performance for All Students, 2007 - 2011

  36. Social Studies Performance for All Students, 2008 - 2011

  37. Increases in SC-Alt Participation: A Continuing Concern • The SC-Alt was designed for students with the “most significant cognitive disabilities.” It was not intended for higher-level autism or even higher-level moderate MD students. Only a very small number of students classified as “mild MD” would be expected to be included. • There is evidence that some districts and schools are gaming the accountability system by identifying SC-Alt students inappropriately.

  38. 2011 SC-Alt Students with Previous PASS Scores • A search for previous PASS scores was implemented for the 2009 and 2010 PASS data files. • 2010 PASS scores were identified for 203 students. • 2009 PASS scores were identified for an additional 142 students. • These numbers approximate the student increases from 2009 to 2010 and from 2010 to 2011 administrations.
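The search described above amounts to matching 2011 SC-Alt student IDs against the prior-year PASS data files, most recent year first. A minimal sketch under that assumption; the data structures and function name are hypothetical, not the actual matching procedure.

```python
# Hypothetical sketch: look up each SC-Alt student's most recent prior PASS
# score, checking the newest data file first (2010, then 2009).
def find_prior_pass(alt_ids, pass_by_year):
    found = {}
    for year in sorted(pass_by_year, reverse=True):
        scores = pass_by_year[year]
        for sid in alt_ids:
            if sid not in found and sid in scores:
                found[sid] = (year, scores[sid])
    return found

hits = find_prior_pass(
    ["s1", "s2", "s3"],
    {2009: {"s1": 610}, 2010: {"s1": 620, "s2": 600}},
)
print(hits)  # {'s1': (2010, 620), 's2': (2010, 600)}
```

Students found only in the older file (like the 142 matched to 2009 scores) are those who appear in 2009 but not 2010.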

  39. PASS Test Grade Prior to SC-Alt Placement

  40. Score Reports • Supporting Student Learning • Parent Suggestions

  41. 2012 Administration

  42. Discussion and Feedback • District Level Training • Scoring Worksheets • Other

  43. Office of Exceptional Children Update

  44. Common Core State Standards Assessment Consortia

  45. Peer Review

  46. AA-MAS Update on Projects

  47. http://www.ed.sc.gov • Suzanne Swaffield, sswaffie@ed.sc.gov, 803-734-8274 • Douglas Alexander, dgalexan@ed.sc.gov, 803-734-3923
