
Interactive graphics Understanding OLS regression


Presentation Transcript


  1. Interactive graphics: understanding OLS regression; normal approximation to the Binomial distribution

  2. General stats software (example: OLS regression; example: Poisson regression), as well as specialized software

  3. Specialized software. Testing: • Classical test theory – ITEMIN • Item response theory – BILOG-MG, PARSCALE, MULTILOG, TESTFACT

  4. Specialized software Structural equation modeling (SEM)

  5. Specialized software Hierarchical linear modeling (HLM)

  6. Open data

  7. Run simple linear regression

  8. Analyze  Regression  Linear

  9. Enter the DV and IV

  10. Check for confidence intervals

  11. Output: Age accounts for about 37.9% of the variability in Gesell score. The regression model is significant, F(1,19) = 13.202, p = .002. The regression equation is Y’ = 109.874 - 1.127X. Age is a significant predictor, t(19) = -3.633, p = .002. As age in months at first word increases by 1 month, the Gesell score is estimated to decrease by about 1.127 points (95% CI: -1.776, -.478).
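
For readers who want to reproduce this kind of output outside the menu-driven software shown in the previous slides, a minimal Python sketch with statsmodels is given below. The file name (gesell.csv) and column names (age for months at first word, score for the Gesell score) are assumptions for illustration, not taken from the slides.

```python
# Minimal sketch: simple linear regression with statsmodels.
# Assumes a CSV with hypothetical columns 'age' (months at first word)
# and 'score' (Gesell score), mirroring the slide's example.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("gesell.csv")        # hypothetical file name

model = smf.ols("score ~ age", data=df).fit()
print(model.summary())                # coefficients, t tests, F, R-squared
print(model.conf_int(alpha=0.05))     # 95% confidence intervals, as in the slide output
```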

  12. Enter the data and click to execute. Fit a Poisson loglinear model: log(Y/pop) = α + β₁(Fredericia) + β₂(Horsens) + β₃(Kolding) + β₄(Age)
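
The slides fit this model in specialized software; a rough equivalent in Python with statsmodels might look like the sketch below. The data frame and its columns (cases, pop, city with levels such as Fredericia, Horsens, and Kolding, and age) are assumptions for illustration, and the log-population offset plays the role of the log(Y/pop) rate formulation.

```python
# Minimal sketch: Poisson loglinear model with a log-population offset,
# i.e. log(E[Y]/pop) = alpha + beta1*Fredericia + beta2*Horsens
#                      + beta3*Kolding + beta4*Age.
# Column names ('cases', 'pop', 'city', 'age') are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("lung_cancer.csv")   # hypothetical file name

m1 = smf.glm("cases ~ C(city) + age", data=df,
             family=sm.families.Poisson(),
             offset=np.log(df["pop"])).fit()

print(m1.summary())
print("G^2 =", m1.deviance, "on", m1.df_resid, "df")
```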

  13. G² = 46.45, df = 19, p < .01. City doesn’t seem to be a significant predictor, whereas Age does.
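
As a quick check of how a deviance of this size maps to a p-value, the upper-tail chi-square probability can be computed directly; the numbers below are the ones quoted on the slide.

```python
# Upper-tail chi-square probability for the reported deviance.
from scipy.stats import chi2

g2, df_resid = 46.45, 19
print(chi2.sf(g2, df_resid))          # comes out well below .01, consistent with the slide
```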

  14. Plot of the observed vs. fitted values: obviously the model does not fit
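
A plot like this one can be recreated from the fitted model; continuing the statsmodels sketch above, a matplotlib version might be:

```python
# Observed vs. fitted counts for the first Poisson model (m1 above).
# Points far from the 45-degree reference line indicate lack of fit.
import matplotlib.pyplot as plt

fitted = m1.fittedvalues              # expected counts from the GLM
plt.scatter(fitted, df["cases"])
plt.plot([0, df["cases"].max()], [0, df["cases"].max()])  # 45-degree reference line
plt.xlabel("Fitted counts")
plt.ylabel("Observed counts")
plt.show()
```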

  15. Fit another Poisson model: log(Y/pop) = α + β₁(Fredericia) + β₂(Horsens) + β₃(Kolding) + β₄(Age) + β₅(Age)². Both the Age and (Age)² terms are significant predictors.
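
Continuing the same sketch, the quadratic age term can be added directly in the model formula:

```python
# Second model: city indicators plus age and age squared.
m2 = smf.glm("cases ~ C(city) + age + I(age**2)", data=df,
             family=sm.families.Poisson(),
             offset=np.log(df["pop"])).fit()

print(m2.summary())
print("G^2 =", m2.deviance, "on", m2.df_resid, "df")
```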

  16. Plot of the observed vs. fitted values: model fits better

  17. Fit a third Poisson model (simpler): log(Y/pop) = α + β₁(Fredericia) + β₂(Age) + β₃(Age)². All three predictors are significant.

  18. Plot of the observed vs. fitted values: much simpler model
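
Continuing the sketch once more, the simpler third model and a side-by-side comparison of the three fits might look like this; the fredericia indicator column is constructed here and is hypothetical.

```python
# Third model: only a Fredericia indicator plus age and age squared,
# then a quick deviance/AIC comparison of the three fits.
df["fredericia"] = (df["city"] == "Fredericia").astype(int)

m3 = smf.glm("cases ~ fredericia + age + I(age**2)", data=df,
             family=sm.families.Poisson(),
             offset=np.log(df["pop"])).fit()

for name, m in [("city + age", m1),
                ("city + age + age^2", m2),
                ("Fredericia + age + age^2", m3)]:
    print(f"{name}: G^2 = {m.deviance:.2f} on {int(m.df_resid)} df, AIC = {m.aic:.1f}")
```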

  19. Item Response Theory: person ability vs. item difficulty (easy item, hard item). A low-ability person facing an easy item has about a 50% chance of success.

  20. Item Response Theory: person ability vs. item difficulty. A low-ability person facing a moderately difficult item has about a 10% chance of success; a high-ability person facing the same item has about a 90% chance.

  21. Item Response Theory: item characteristic curve, with probability of success (0% to 100%) on the y-axis and item difficulty / person ability (roughly -3 to 3) on the x-axis.
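
The relationships sketched on these three slides follow from the logistic item characteristic curve. Below is a minimal one-parameter (Rasch-type) sketch in Python; with this parameterization and the illustrative ability/difficulty values chosen here, the "moderately difficult item" probabilities come out near 12% and 88% rather than exactly the 10% and 90% quoted above, since those figures would also depend on a discrimination parameter the slides do not state.

```python
# One-parameter (Rasch-type) item characteristic curve:
# P(correct) = 1 / (1 + exp(-(ability - difficulty))).
# The ability/difficulty values below are illustrative only.
import numpy as np

def p_correct(ability, difficulty):
    """Probability of success for a person on an item (1PL logistic)."""
    return 1.0 / (1.0 + np.exp(-(ability - difficulty)))

print(p_correct(-2.0, -2.0))   # low-ability person, easy item: 0.50
print(p_correct(-2.0,  0.0))   # low-ability person, moderately difficult item: ~0.12
print(p_correct( 2.0,  0.0))   # high-ability person, moderately difficult item: ~0.88
```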
