A Statistical Investigation of the Effects of Computer Disruptions on Student and School Scores

June 25, 2014

Computer Disruptions
  • In 2013, several states experienced computer problems during student testing.
    • Minnesota’s Comprehensive Assessment
    • Kentucky's high school end-of-course exams
    • Indiana’s ISTEP exam
    • Oklahoma Student Testing Program
  • Thousands of students were affected by widespread computer disruptions
    • Kicked off the test
    • Long load times
  • States were concerned that these disruptions could detrimentally impact student scores
Research Question
  • Minnesota and Oklahoma:
    • Commissioned a study to examine the impact of computer disruptions on student scores.
    • The type of disruption differed between the two states, but we used the same statistical approach.
  • “On average, would student scores have been different if the computer disruptions had not occurred?”
  • Things to consider:
    • Minnesota allowed students the opportunity to review items or retake the test, but did not keep a record of who took advantage of this opportunity.
    • Oklahoma allowed students that were kicked off to retake the test.
Defining Disruption
  • Oklahoma: Computer issues occurred on April 28 and April 30
    • Students were kicked off the test in the middle of the exam.
    • The vendor had a record of every student who was kicked off.
  • Minnesota: Computer issues occurred on April 16 and 23
    • Long item latencies – the amount of time it takes an online page to load
    • Abnormal restarts – students were unexpectedly logged out and required to log back in to resume the test
    • Administrative pauses – test paused by a teacher or test proctor
    • No clear record of who was or was not interrupted
Research Approach
  • Basic approach:
    • Based on the premise that student scores are consistent over time.
    • Match disrupted students to similarly performing non-disrupted students.
      • Using propensity score matching (prior-year scores, gender, ethnicity, FRP lunch, LEP, school-level FRP lunch, school-level achievement)
    • Examine differences in 2013 test scores between the two groups.
  • Student-level Statistical Investigation:
    • Are there mean differences in scores between the disrupted and non-disrupted groups?
    • Does disruption add to the prediction of the 2013 test scores?
    • Are there differences in prediction between the disrupted and non-disrupted groups?
    • Would disrupted students have performed differently under normal testing conditions?
    • Compare the distribution of differences between the disrupted and non-disrupted samples.
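The matching step above can be sketched in code. This is a simplified illustration with invented scores: it matches each disrupted student to the nearest non-disrupted student on a single covariate (prior-year score), whereas the study matched on a propensity score built from all of the listed covariates.

```python
# Simplified 1:1 nearest-neighbor matching on prior-year score only.
# The real analysis used a propensity score combining prior scores,
# gender, ethnicity, FRP lunch, LEP, and school-level variables;
# all data here are invented for illustration.

def match_nearest(disrupted, pool):
    """Match each disrupted student's prior score to the closest
    non-disrupted prior score, without replacement."""
    available = list(pool)
    matches = []
    for score in disrupted:
        best = min(available, key=lambda s: abs(s - score))
        available.remove(best)
        matches.append(best)
    return matches

disrupted_prior = [510, 495, 530]
non_disrupted_prior = [500, 512, 491, 533, 520]
print(match_nearest(disrupted_prior, non_disrupted_prior))  # -> [512, 491, 533]
```

With matched pairs in hand, the 2013 scores of the two groups can be compared directly.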
Student-level Statistical Investigation

1. Are there mean differences in scores between the disrupted and non-disrupted groups?

Student-level Statistical Investigation

2. Does disruption add to the prediction of the 2013 test scores?

Student-level Statistical Investigation

3. Are there differences in prediction between the disrupted and non-disrupted groups?

Student-level Statistical Investigation

4. Would disrupted students have performed differently under normal testing conditions?

    • For each disrupted student, apply the non-disrupted prediction equation and compute the predicted score.
    • Take the difference between the predicted score and the observed score (the score obtained under the disrupted conditions).
    • For the non-disrupted sample, compute the difference between predicted scores and observed scores.
  • A large number of students with notable differences between observed and predicted scores would provide another piece of evidence about the impact of the computer disruptions.
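The procedure above can be sketched as follows, with a one-predictor least-squares line standing in for the prediction equation (the study's actual equation used the full set of matching covariates) and invented scores.

```python
# Sketch of step 4: fit a prediction equation on the non-disrupted
# sample, then compute predicted-minus-observed differences for the
# disrupted sample. A one-predictor OLS line stands in for the real
# prediction equation; all scores are invented.

def ols_fit(x, y):
    """Return (slope, intercept) of the least-squares line y ~ x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

# Non-disrupted sample: prior-year scores and observed 2013 scores
prior_nd = [480, 500, 520, 540]
obs_nd = [485, 505, 525, 545]
b1, b0 = ols_fit(prior_nd, obs_nd)

# Disrupted sample: predicted score (normal conditions) minus the
# score observed under the disrupted conditions
prior_d = [490, 530]
obs_d = [480, 535]
diffs = [(b0 + b1 * p) - o for p, o in zip(prior_d, obs_d)]
print(diffs)  # -> [15.0, 0.0]
```

A large positive difference flags a student who scored well below what the non-disrupted prediction equation expects.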
Student-level Statistical Investigation

5. Compare the distribution of differences between the disrupted and non-disrupted samples.

  • Determine the difference between observed and predicted scores at the 5th, 10th, 90th, and 95th percentiles for the non-disrupted group.
  • Apply these cuts to the disrupted group and determine what percent of students fall at or above the 90th and 95th cuts and what percent fall at or below the 5th and 10th cuts.
  • If more than 5 and 10 percent of the disrupted sample fall beyond the corresponding cuts, that would provide evidence that the computer disruption may have impacted scores.
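The cut-score comparison above can be sketched as follows, using a nearest-rank percentile rule (the study's exact percentile method is an assumption here) and invented difference distributions.

```python
import math

def pct(sorted_vals, p):
    """Nearest-rank percentile: smallest value with at least p% of
    the sample at or below it."""
    k = max(1, math.ceil(p * len(sorted_vals) / 100))
    return sorted_vals[k - 1]

# Non-disrupted predicted-minus-observed differences (invented)
nd = sorted(range(-10, 10))           # 20 values: -10 .. 9
p5, p10 = pct(nd, 5), pct(nd, 10)     # lower cuts
p90, p95 = pct(nd, 90), pct(nd, 95)   # upper cuts

# Share of disrupted students falling at or beyond the cuts
d = [-12, -9, 0, 3, 7, 9, 11]
share_hi = sum(x >= p90 for x in d) / len(d)
share_lo = sum(x <= p10 for x in d) / len(d)
print(p10, p90, round(share_hi, 3), round(share_lo, 3))
```

If `share_hi` or `share_lo` noticeably exceeds the nominal 10 percent, that mirrors the evidence rule described in the slide.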
Student-level Statistical Investigation

Percent of Disrupted Students with Predicted and Observed Score Differences at the 5th, 10th, 90th, and 95th Percentiles of Non-Disrupted Students

School-Level Score Differences
  • School-level Statistical Investigation:
    • Would school-level scores differ if disrupted students were excluded?
    • Does disruption add to the prediction of 2013 school-level means?
    • Would there be differences in % proficient if predicted scores were used in place of observed scores?
School-Level Score Differences

Would school-level scores differ if disrupted students were excluded?

  • All students included (baseline)
  • Students in disrupted sample excluded
    • School-level means increased slightly (d = -.02 to .22)
  • Students that tested on April 16 or 23 excluded
    • School-level means dropped on average by .09 theta points (d = .01 to .26)
  • Predicted scores used in place of observed scores for students in the disrupted sample
    • School-level means increased for some grades and decreased for others (d = .002 to .09)
School-Level Score Differences

Does disruption add to the prediction of 2013 school-level means?

  • Multiple Regression:
    • 2012 school-level means
    • Percent of students disrupted
    • Interaction between 2012 school-level mean and % of students disrupted
  • Results:
    • % of students disrupted and the interaction term were not significant predictors
    • ΔR² was small
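A minimal sketch of this school-level regression in plain Python, fitting ordinary least squares via the normal equations; the school-level means and disruption percentages below are invented, so the resulting ΔR² is only illustrative.

```python
# Does disruption add to the prediction of 2013 school-level means?
# Fit a baseline model (2012 mean only) and a full model that adds
# % disrupted and the interaction term, then compare R-squared.
# All school-level data below are invented for illustration.

def ols_r2(X, y):
    """R-squared of a least-squares fit of y on the columns of X
    (an intercept is added), solved via the normal equations."""
    rows = [[1.0] + list(r) for r in X]
    k = len(rows[0])
    A = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    c = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]
    for i in range(k):                      # Gaussian elimination
        piv = max(range(i, k), key=lambda m: abs(A[m][i]))
        A[i], A[piv] = A[piv], A[i]
        c[i], c[piv] = c[piv], c[i]
        for m in range(i + 1, k):
            f = A[m][i] / A[i][i]
            for j in range(i, k):
                A[m][j] -= f * A[i][j]
            c[m] -= f * c[i]
    b = [0.0] * k
    for i in reversed(range(k)):            # back-substitution
        b[i] = (c[i] - sum(A[i][j] * b[j] for j in range(i + 1, k))) / A[i][i]
    yhat = [sum(bj * rj for bj, rj in zip(b, row)) for row in rows]
    ybar = sum(y) / len(y)
    ss_res = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))
    ss_tot = sum((yi - ybar) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

mean_2012 = [500, 510, 520, 505, 515]
pct_disrupted = [0.0, 0.1, 0.3, 0.05, 0.2]
mean_2013 = [502, 511, 523, 506, 517]

r2_base = ols_r2([[m] for m in mean_2012], mean_2013)
r2_full = ols_r2([[m, p, m * p] for m, p in zip(mean_2012, pct_disrupted)],
                 mean_2013)
print(round(r2_full - r2_base, 4))  # the small delta-R-squared of interest
```

Because the models are nested, ΔR² is non-negative by construction; the study's finding was that this gain was small and the disruption terms were not significant.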
School-Level Score Differences

Would there be differences in % proficient if predicted scores were used in place of observed scores?

Conclusions

Student-level Summary:

“The evidence shows that there were disruption effects; however, the effects on students’ scores were neither widespread nor large. In addition, the evidence shows that disruptions were not consistently detrimental, but at times were beneficial.”

School-Level Summary:

“School-level analyses suggest that there may be a small impact on school-level means for schools that experienced disruption, but the direction of the impact is not consistent and adjusting school-level scores based on this data would be beneficial for some schools and detrimental to others.”