
Moving further

Moving further: categorical count data (word counts, speech error counts, metaphor counts, active construction counts) modelled with Poisson regression, and binary categorical data modelled with logistic regression, with examples from Winter & Grawunder (2012) and Bentz & Winter (2013).


Presentation Transcript


  1. Moving further: categorical count data (word counts, speech error counts, metaphor counts, active construction counts)

  2. Hissing Koreans. Winter & Grawunder (2012)

  3. No. of Cases. Bentz & Winter (2013)

  4. Poisson Model

  5. The Poisson Distribution. Siméon Poisson; 1898: Ladislaus Bortkiewicz. Army corps with few horses: few deaths, low variability. Army corps with lots of horses: many deaths, high variability.
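
A small R sketch (not from the slides; the rates 0.6 and 6 are made up) of the property the horse-kick example illustrates: a Poisson variable's variance equals its mean, so low-rate counts show low variability and high-rate counts show high variability.

    set.seed(1)
    few_deaths  <- rpois(1000, lambda = 0.6)    # hypothetical low rate
    many_deaths <- rpois(1000, lambda = 6)      # hypothetical high rate
    c(mean = mean(few_deaths),  var = var(few_deaths))    # both close to 0.6
    c(mean = mean(many_deaths), var = var(many_deaths))   # both close to 6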

  6. Poisson Regression = generalized linear model with Poisson error structure and log link function

  7. The Poisson Model: Y ~ exp(b0 + b1*X1 + b2*X2), i.e. the log of the predicted mean rate is b0 + b1*X1 + b2*X2

  8. In R (lme4): glmer(my_counts ~ my_predictors + (1|subject), data = mydataset, family = "poisson")
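
A hedged, self-contained sketch of such a model fit: the dataset and the variable names (my_counts, my_predictor, subject) are invented for illustration, not the slides' data. In current lme4, mixed models with a Poisson error structure are fitted with glmer().

    library(lme4)

    set.seed(42)
    mydataset <- data.frame(
      subject      = factor(rep(1:10, each = 20)),
      my_predictor = rnorm(200)
    )
    # simulate counts whose log mean depends on the predictor plus a by-subject intercept
    subj_int <- rnorm(10, sd = 0.3)[as.integer(mydataset$subject)]
    mydataset$my_counts <- rpois(200, lambda = exp(0.5 + 0.4 * mydataset$my_predictor + subj_int))

    poisson_model <- glmer(my_counts ~ my_predictor + (1 | subject),
                           data = mydataset, family = "poisson")
    summary(poisson_model)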

  9. Poisson model output: exponentiate the log values to get the predicted mean rate
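
Continuing the sketch above: the fixed-effect estimates are on the log scale, so exponentiating them gives predicted mean rates and rate ratios.

    exp(fixef(poisson_model))
    # exp(intercept) = predicted mean count when my_predictor is 0
    # exp(slope)     = multiplicative change in the mean count per unit of my_predictor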

  10. Poisson Model

  11. Moving further: binary categorical data (focus vs. no-focus, yes vs. no, dative vs. genitive, correct vs. incorrect)

  12. Case yes vs. no ~ Percent L2 speakers Bentz & Winter (2013)

  13. Logistic Regression = generalized linear model with binomial error structure and logit link function

  14. The Logistic Model: p(Y) ~ logit⁻¹(b0 + b1*X1 + b2*X2)

  15. In R (lme4): glmer(binary_variable ~ my_predictors + (1|subject), data = mydataset, family = "binomial")
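
A hedged sketch of a binomial GLMM on invented data (binary_variable, my_predictor and subject are placeholders); the logit link is the default for family = "binomial".

    library(lme4)

    set.seed(42)
    mydataset <- data.frame(
      subject      = factor(rep(1:10, each = 20)),
      my_predictor = rnorm(200)
    )
    # simulate a binary outcome whose log odds depend on the predictor plus a by-subject intercept
    log_odds <- 0.5 - 1.2 * mydataset$my_predictor +
                rnorm(10, sd = 0.5)[as.integer(mydataset$subject)]
    mydataset$binary_variable <- rbinom(200, size = 1, prob = plogis(log_odds))

    logistic_model <- glmer(binary_variable ~ my_predictor + (1 | subject),
                            data = mydataset, family = "binomial")
    summary(logistic_model)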

  16. Probabilities and Odds: probability of an event = (cases where the event occurs) / (all cases); odds of an event = (cases where the event occurs) / (cases where it does not occur)

  17. Intuition about Odds: What are the odds that I pick a blue marble? N = 12 (2 blue, 10 not blue). Answer: 2/10
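
The same marble count worked through in R (the 2-blue / 10-other split is implied by the slide's answer of 2/10):

    n_blue  <- 2
    n_other <- 10
    p_blue    <- n_blue / (n_blue + n_other)        # probability = 2/12, about 0.17
    odds_blue <- n_blue / n_other                   # odds        = 2/10 = 0.2
    all.equal(odds_blue, p_blue / (1 - p_blue))     # TRUE: odds = p / (1 - p)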

  18. Log odds = logit function: logit(p) = log(p / (1 - p))
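
As a one-line R sketch (the name logit is our own definition, not a base R function; base R's qlogis() is equivalent):

    logit <- function(p) log(p / (1 - p))
    logit(0.5)   #  0      (even odds, 1 to 1)
    logit(0.8)   #  1.39   (odds of 4 to 1)
    logit(0.2)   # -1.39   (odds of 1 to 4)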

  19. Representative values

  20. Snijders & Bosker (1999: 212)

  21. Bentz & Winter (2013)

  22. Case yes vs. no ~ Percent L2 speakers
                    Estimate  Std. Error  z value  Pr(>|z|)
      (Intercept)     1.4576      0.6831    2.134   0.03286
      Percent.L2     -6.5728      2.0335   -3.232   0.00123
      The intercept is the log odds when Percent.L2 = 0.

  23. Bentz & Winter (2013)

  24. Case yes vs. no ~ Percent L2 speakers
                    Estimate  Std. Error  z value  Pr(>|z|)
      (Intercept)     1.4576      0.6831    2.134   0.03286
      Percent.L2     -6.5728      2.0335   -3.232   0.00123
      The slope is how much the log odds decrease for each unit increase in Percent.L2 (coded as a proportion, so one unit spans 0% to 100% L2 speakers).

  25. Bentz & Winter (2013)

  26. Case yes vs. no ~ Percent L2 speakers
                    Estimate  Std. Error  z value  Pr(>|z|)
      (Intercept)     1.4576      0.6831    2.134   0.03286
      Percent.L2     -6.5728      2.0335   -3.232   0.00123
      Logits (“log odds”) can be exponentiated to give odds, or transformed by the inverse logit to give probabilities.

  27. Case yes vs. no ~ Percent L2 speakers
                    Estimate  Std. Error  z value  Pr(>|z|)
      (Intercept)     1.4576      0.6831    2.134   0.03286
      Percent.L2     -6.5728      2.0335   -3.232   0.00123
      Exponentiate the slope to go from logits (“log odds”) to odds: exp(-6.5728). The inverse logit would instead take logits to probabilities.

  28. Case yes vs. no ~ Percent L2 speakers
                    Estimate  Std. Error  z value  Pr(>|z|)
      (Intercept)     1.4576      0.6831    2.134   0.03286
      Percent.L2     -6.5728      2.0335   -3.232   0.00123
      exp(-6.5728) = 0.001397878: the slope on the odds scale. The inverse logit would instead take logits (“log odds”) to probabilities.

  29. Odds: if the numerator is more likely, the odds are > 1 = the event happens more often than not; if the denominator is more likely, the odds are < 1 = the event is more likely not to happen.

  30. Case yes vs. no ~ Percent L2 speakers
                    Estimate  Std. Error  z value  Pr(>|z|)
      (Intercept)     1.4576      0.6831    2.134   0.03286
      Percent.L2     -6.5728      2.0335   -3.232   0.00123
      exp(-6.5728) = 0.001397878; the other route goes from logits (“log odds”) to probabilities, via the inverse logit.
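
Reading that number as an odds ratio, under the assumption (suggested by slide 33's use of 0.3 for 30%) that Percent.L2 is coded as a proportion between 0 and 1:

    exp(-6.5728)          # 0.0014: the odds are multiplied by ~0.0014 going from 0% to 100% L2 speakers
    exp(-6.5728 * 0.01)   # 0.94:   the odds are multiplied by ~0.94 for each additional percentage point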

  31. Case yes vs. no ~ Percent L2 speakers
                    Estimate  Std. Error  z value  Pr(>|z|)
      (Intercept)     1.4576      0.6831    2.134   0.03286
      Percent.L2     -6.5728      2.0335   -3.232   0.00123
      From logits (“log odds”) to probabilities with the inverse logit: logit.inv(1.4576) = 0.81.

  32. About 80% (makes sense). Bentz & Winter (2013)

  33. Case yes vs. no ~ Percent L2 speakers
                    Estimate  Std. Error  z value  Pr(>|z|)
      (Intercept)     1.4576      0.6831    2.134   0.03286
      Percent.L2     -6.5728      2.0335   -3.232   0.00123
      logit.inv(1.4576) = 0.81; logit.inv(1.4576 + -6.5728*0.3) = 0.37.
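
The same two predictions in R; plogis() is base R's inverse logit and is equivalent to the logit.inv function defined on slide 37:

    plogis(1.4576)                  # 0.81: predicted probability of case marking at 0% L2 speakers
    plogis(1.4576 + -6.5728 * 0.3)  # 0.37: predicted probability of case marking at 30% L2 speakers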

  34. Bentz & Winter (2013)

  35. logit(p) = log(p / (1 - p)) = logit function; logit⁻¹(x) = exp(x) / (1 + exp(x)) = inverse logit function

  36. This is the famous “logistic function”: logit⁻¹ = inverse logit function

  37. Inverse logit function: logit.inv <- function(x) { exp(x) / (1 + exp(x)) } (this defines the function in R; it transforms log odds back to probabilities)
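
A quick sanity check that logit.inv really is the inverse of the logit, and that it matches base R's plogis() (the logit helper is our own definition):

    logit.inv <- function(x) { exp(x) / (1 + exp(x)) }
    logit     <- function(p) log(p / (1 - p))

    logit.inv(logit(0.37))                         # 0.37
    all.equal(logit.inv(1.4576), plogis(1.4576))   # TRUE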

  38. General Linear Model → Generalized Linear Model → Generalized Linear Mixed Model

  39. General Linear Model → Generalized Linear Model → Generalized Linear Mixed Model

  40. General Linear Model → Generalized Linear Model → Generalized Linear Mixed Model

  41. Generalized Linear Model = “generalizing” the General Linear Model to cases that don’t involve continuous response variables (in particular categorical ones). Consists of two things: (1) an error distribution, (2) a link function.

  42. Generalized Linear Model = “generalizing” the General Linear Model to cases that don’t involve continuous response variables (in particular categorical ones). Consists of two things: (1) an error distribution (logistic regression: binomial distribution; Poisson regression: Poisson distribution), (2) a link function (logistic regression: logit link function; Poisson regression: log link function).

  43. Generalized Linear Model = “generalizing” the General Linear Model to cases that don’t involve continuous response variables (in particular categorical ones). Consists of two things: (1) an error distribution (logistic regression: binomial distribution; Poisson regression: Poisson distribution), (2) a link function (logistic regression: logit link function; Poisson regression: log link function). In R: lm(response ~ predictor); glm(response ~ predictor, family = "binomial"); glm(response ~ predictor, family = "poisson").
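
A minimal, self-contained sketch of the three calls on invented data (names and coefficients are made up):

    set.seed(1)
    d <- data.frame(predictor = rnorm(100))
    d$continuous <- 2 + 0.5 * d$predictor + rnorm(100)          # continuous response
    d$binary     <- rbinom(100, 1, plogis(0.5 * d$predictor))   # binary response
    d$counts     <- rpois(100, exp(0.2 + 0.5 * d$predictor))    # count response

    lm(continuous ~ predictor, data = d)                        # general linear model
    glm(binary ~ predictor, data = d, family = "binomial")      # logistic regression
    glm(counts ~ predictor, data = d, family = "poisson")       # Poisson regression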
