Logistic Regression

Logistic regression is used for classification when the dependent variable is dichotomous. It gives the probability of an outcome and evaluates risk (odds). It is preferred over discriminant analysis and multiple regression because those methods can produce predicted probabilities outside the range [0, 1] and rest on normality and homoscedasticity assumptions that a binary outcome violates.


Presentation Transcript


  1. Logistic Regression

  2. Logistic Regression • When? • Just like multiple regression, but when the dependent variable is dichotomous. • E.g., improved or not improved; successful or not successful. • Why? • Logistic regression can be used for classification purposes (it includes the χ² statistic). • It gives the probability of an effect (outcome) and evaluates the risk (odds). • Why not perform a discriminant analysis? • Probability of success outside [0, 1] • Normality • Why not perform a multiple regression? • Probability of success outside [0, 1] • Homoscedasticity • Normality

  3. Logistic Regression • Example: • Suppose we want to predict whether someone has coronary disease (DV) using age in years (IV). • It is customary to code a binary DV as either 0 or 1.

  4. Logistic Regression • The logistic curve • [Figure: the logistic curve, showing its roughly linear middle part and its nonlinear parts near the tails]

  5. Logistic Regression • The logistic curve

  6. Logistic Regression • Example: • Suppose we want to predict whether someone has coronary disease (DV) using age in years (IV). • It is customary to code a binary DV as either 0 or 1.

  7. Logistic Regression • The logistic curve: P = e^(b0 + b1·X) / (1 + e^(b0 + b1·X)) • where P is the probability of a 1, e is the base of the natural logarithm (about 2.718), and b0 and b1 are the parameters of the model. b1 adjusts how quickly the probability changes when X increases by a single unit. Because the relationship between X and P is nonlinear, b1 does not have a straightforward interpretation in this model, contrary to ordinary linear regression.
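
A minimal numeric sketch of this curve, assuming illustrative weights (b0 = -5.31 is not given in the transcript; b1 = 0.111 is implied by the odds ratio of about 1.117 quoted later):

```python
import numpy as np

def logistic(x, b0, b1):
    """P = exp(b0 + b1*x) / (1 + exp(b0 + b1*x)), written in its numerically stable form."""
    return 1.0 / (1.0 + np.exp(-(b0 + b1 * x)))

# Illustrative weights only; the transcript does not show the fitted values.
ages = np.array([30.0, 40.0, 50.0, 60.0, 70.0])
print(logistic(ages, b0=-5.31, b1=0.111))  # probabilities rise smoothly from ~0.12 to ~0.92
```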

  8. Logistic Regression • (Where does it come from?) • Suppose we only know a person's age and we want to predict whether that person has coronary disease or not. We can talk about the probability of having the disease, or we can talk about the odds of having the disease. Let's say that the probability of not having the disease for a given age is .95. Then the odds of not having the disease are .95/.05 = 19. • Now the odds of having the disease would be .05/.95, or 1/19, or 0.0526. This asymmetry is unappealing, because the odds of having the disease ought to be the opposite of the odds of not having the disease.

  9. Logistic Regression • (Where does it come from?) • We can take care of this asymmetry by using the natural logarithm, ln. The natural log of 19 is 2.9444 (ln(0.95/0.05) = 2.9444). The natural log of 1/19 is -2.9444 (ln(0.05/0.95) = -2.9444), so the log odds of having coronary disease are exactly the opposite of the log odds of not having the disease. • In terms of odds: ln(P/(1 - P)) = b0 + b1·X. • Solving for P, in terms of probability: P = e^(b0 + b1·X) / (1 + e^(b0 + b1·X)).
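
The slide's odds arithmetic, reproduced directly in a few lines of Python:

```python
import math

p = 0.95                       # probability of NOT having the disease (from the slide)
odds_no = p / (1 - p)          # 19
odds_yes = (1 - p) / p         # 1/19 ~ 0.0526: asymmetric around 1
print(odds_no, odds_yes)

# Taking natural logs restores the symmetry: the two log odds are exact opposites.
print(math.log(odds_no), math.log(odds_yes))   # 2.9444 and -2.9444
```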

  10. Logistic Regression • Finding the regression weights. • In multiple regression, we wanted to minimize the residual sum of squares. • With the logistic curve, there is no mathematical solution that will produce least squares estimates of the parameters. We use instead the maximum (log) likelihood. • A likelihood is a conditional probability: P(Y|X), the probability of Y given X. The idea is to choose the regression weights that give the maximum (log) likelihood between the data and the logistic curve. • Maximum likelihood: L = Π_i P_i^(y_i) · (1 - P_i)^(1 - y_i). • Maximum log likelihood: ln L = Σ_i [y_i·ln(P_i) + (1 - y_i)·ln(1 - P_i)].
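
A short sketch of that log likelihood, evaluated on a tiny made-up sample (ages and 0/1 outcomes invented for illustration):

```python
import numpy as np

def log_likelihood(b0, b1, x, y):
    """Sum over observations of y*ln(P) + (1 - y)*ln(1 - P), with P the logistic curve at x."""
    p = 1.0 / (1.0 + np.exp(-(b0 + b1 * x)))
    return np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

x = np.array([35.0, 42.0, 55.0, 61.0, 68.0])   # made-up ages
y = np.array([0, 0, 1, 1, 1])                  # made-up disease status
print(log_likelihood(-5.31, 0.111, x, y))      # the weights maximizing this are the estimates
```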

  11.–13. Logistic Regression • Finding the regression weights. • The maximum of this expression can then be found numerically using an optimization algorithm.
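
A sketch of that numerical fit, assuming made-up age/disease data and a general-purpose optimizer (scipy's BFGS); the original deck does not say which algorithm it used:

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_likelihood(beta, x, y):
    """Negative log likelihood of the logistic model; minimizing it maximizes the likelihood."""
    b0, b1 = beta
    p = 1.0 / (1.0 + np.exp(-(b0 + b1 * x)))
    p = np.clip(p, 1e-12, 1 - 1e-12)           # guard against log(0)
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

# Made-up data for illustration only.
x = np.array([29.0, 34.0, 38.0, 45.0, 50.0, 52.0, 57.0, 60.0, 64.0, 69.0])
y = np.array([0, 0, 0, 0, 1, 0, 1, 1, 1, 1])

result = minimize(neg_log_likelihood, x0=np.zeros(2), args=(x, y), method="BFGS")
b0, b1 = result.x
print(b0, b1, np.exp(b1))   # intercept, slope, and odds ratio per year of age
```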

  14. Logistic Regression • Hypothesis testing • The idea is to compare the full model against the constant-only model using a chi-square statistic: χ² = -2·(LL(constant) - LL(full)), with df = 1 here because there is only one predictor. • A significant χ² indicates that age can reliably distinguish people who have coronary disease from those who do not.
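
A sketch of this likelihood-ratio test; the two log likelihoods below are placeholders, since the transcript does not show the fitted values:

```python
from scipy.stats import chi2

ll_constant = -68.33   # placeholder: log likelihood of the constant-only model
ll_full = -53.68       # placeholder: log likelihood of the constant + age model

g = -2 * (ll_constant - ll_full)   # likelihood-ratio chi-square
df = 1                             # one predictor (age)
print(g, chi2.sf(g, df))           # a tiny p-value => age reliably improves the model
```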

  15. Logistic Regression • Hypothesis testing • We can use the same idea to build a regression model. • Also, the Wald statistic can be used (a z test): z = b / SE(b), where the standard error of b comes from the inverse of the Fisher information matrix.
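
A sketch of the Wald z test, with placeholder values for the coefficient and the standard error (which would come from the inverse Fisher information matrix):

```python
from scipy.stats import norm

b, se = 0.111, 0.024            # placeholder coefficient and standard error
z = b / se                      # Wald statistic
print(z, 2 * norm.sf(abs(z)))   # z and its two-sided p-value
```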

  16. Logistic Regression • Hypothesis testing • Also, the Wald statistic can be used. • [Table: Wald tests for the constant and the age IV in the coronary disease model]

  17. Logistic Regression • Explained variability • There are three popular measures that approximate the variance interpretation found in linear regression (R²): Cox and Snell's R², Nagelkerke's R², and McFadden's R².
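
A sketch of those three measures, assuming the standard Cox and Snell, Nagelkerke, and McFadden formulas and placeholder log likelihoods:

```python
import numpy as np

def pseudo_r2(ll_constant, ll_full, n):
    """Cox & Snell, Nagelkerke, and McFadden pseudo-R2 from the two log likelihoods."""
    cox_snell = 1 - np.exp(2 * (ll_constant - ll_full) / n)
    nagelkerke = cox_snell / (1 - np.exp(2 * ll_constant / n))   # rescales Cox & Snell to [0, 1]
    mcfadden = 1 - ll_full / ll_constant
    return cox_snell, nagelkerke, mcfadden

print(pseudo_r2(ll_constant=-68.33, ll_full=-53.68, n=100))   # placeholder inputs
```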

  18. Logistic Regression • Odds Ratio (OR) • The odds ratio is the increase (or decrease) in the odds of being in one outcome category when the value of the predictor increases by one unit. • If the odds are the same across groups, then OR = 1. • If the OR is greater than 1, there is an increased probability of being classified into the category. • If the OR is smaller than 1, there is a decreased probability of being classified into the given category. • Thus, at each of my birthdays I multiply my odds of having coronary disease by about 1.12. In other words, each year I increase my risk of developing coronary disease by about 12 percent.
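
The arithmetic behind that interpretation, with b = 0.111 as a placeholder slope consistent with the OR the deck quotes:

```python
import math

b = 0.111                                   # placeholder slope per year of age
odds_ratio = math.exp(b)                    # ~1.12: each extra year multiplies the odds by this
print(odds_ratio, 100 * (odds_ratio - 1))   # OR and the ~12 percent increase per year
```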

  19. Logistic Regression • Odds Ratio (OR) • For a 5-year age difference, say, the increase is exp(b)^5 = 1.117315^5 ≈ 1.74, or a 74% increase in the odds. • Classification table (cutoff = 0.5): with the constant only, the total correct percentage = 57; with all predictors, the total correct percentage = 74.
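
A sketch of both computations; the predicted probabilities and labels for the classification table are made up, so the accuracy printed will not match the deck's 57/74:

```python
import numpy as np

# Odds ratio for a 5-year age difference, from the deck's exp(b) = 1.117315.
print(1.117315 ** 5)   # ~1.74, i.e. a 74% increase in the odds

# Classification at cutoff 0.5: compare dichotomized predictions with the true labels.
p_hat = np.array([0.21, 0.35, 0.48, 0.55, 0.62, 0.71, 0.80, 0.90])   # made-up fitted probabilities
y = np.array([0, 0, 1, 0, 1, 1, 1, 1])                               # made-up true labels
y_hat = (p_hat >= 0.5).astype(int)
print("Total correct percentage =", 100 * np.mean(y_hat == y))
```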

  20. Logistic Regression • Prediction • If I am (x' =) 50 years old, what is my probability of having coronary disease?
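
A sketch of this prediction; b1 = 0.111 matches the OR quoted earlier, while the intercept b0 = -5.31 is an illustrative value the transcript does not supply:

```python
import math

def predict(x, b0, b1):
    """Predicted probability of disease at age x under the fitted logistic curve."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))

print(predict(50, b0=-5.31, b1=0.111))   # ~0.56 with these illustrative weights
```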

  21. Logistic Regression • Confidence intervals • CI=0.95
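
A sketch of a 95% confidence interval for b and for the odds ratio, with placeholder estimates; note that the interval is symmetric on the log-odds scale but asymmetric once exponentiated:

```python
import math

b, se = 0.111, 0.024                    # placeholder coefficient and standard error
zc = 1.96                               # critical z for a 95% interval

lo, hi = b - zc * se, b + zc * se
print(lo, hi)                           # CI for the coefficient
print(math.exp(lo), math.exp(hi))       # CI for the odds ratio (asymmetric)
```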

  22. Logistic Regression • Confidence bands • CI=0.95

  23. Logistic Regression • Recoding a continuous variable into a dichotomous variable • Cutoff at 55 • Contingency table
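
A sketch of the recoding and the resulting contingency table, on made-up ages and outcomes:

```python
import numpy as np

age = np.array([34, 40, 46, 52, 56, 58, 61, 63, 67, 70])   # made-up ages
y = np.array([0, 0, 0, 1, 0, 1, 1, 1, 1, 1])               # made-up disease status

old = (age >= 55).astype(int)           # dichotomize at the cutoff of 55
table = np.zeros((2, 2), dtype=int)
for g, d in zip(old, y):
    table[g, d] += 1
print(table)   # rows: age < 55 / age >= 55; columns: no disease / disease
```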

  24. Logistic Regression • Recoding a continuous variable into a dichotomous variable • Cutoff at 55 • Regression weights • Wald test

  25. Logistic Regression • Recoding a continuous variable into a dichotomous variable • Cutoff at 55 • Explained variability

  26. Logistic Regression • Recoding a continuous variable into a dichotomous variable • Cutoff at 55 • Classification table: with the constant only, the total correct percentage = 57; with all predictors, the total correct percentage = 72.

  27. Logistic Regression • Recoding a continuous variable into a dichotomous variable • Cutoff at 55 • Odds ratio • If I am 55 years old or older, my odds of having coronary disease are about 8 times greater.
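
A sketch of the cross-product odds ratio; the 2x2 cell counts below are illustrative, chosen so the OR lands near the deck's value of 8:

```python
# Rows: age < 55, age >= 55; columns: no disease, disease (illustrative counts).
a, b = 51, 22   # age < 55: no disease, disease
c, d = 6, 21    # age >= 55: no disease, disease

odds_ratio = (a * d) / (b * c)   # cross-product ratio
print(odds_ratio)                # ~8.1
```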

  28. Logistic Regression • Recoding a continuous variable into a dichotomous variable • Cutoff at 55 • Confidence intervals • The 95% CI is asymmetric. It suggests that the odds of coronary disease are between 2.9 and 22.9 times greater for people aged 55 and up.
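
A sketch of the usual log-odds (Woolf) interval, using the same illustrative counts as above; it yields roughly the 2.9 to 22.9 range the deck reports:

```python
import math

a, b, c, d = 51, 22, 6, 21                 # same illustrative 2x2 counts as above
log_or = math.log((a * d) / (b * c))
se = math.sqrt(1/a + 1/b + 1/c + 1/d)      # standard error of the log odds ratio

lo, hi = math.exp(log_or - 1.96 * se), math.exp(log_or + 1.96 * se)
print(lo, hi)                              # ~2.9 to ~22.9, asymmetric around ~8.1
```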
