Logistic Regression (Chapter 8). Aims: When and why do we use logistic regression? Binary and multinomial. The theory behind logistic regression. Assessing the model. Assessing predictors. Things that can go wrong. Interpreting logistic regression.
Click First, then click Change (see p 279).
Identify any categorical Covariates (Predictors).
If a categorical predictor has more than two categories, you can either code the control category with the highest number and select Last for the indicator contrast, or code it with the lowest number and select First. In this data set 1 = cured and 0 = not cured (our control category), so we select First as the reference (see p 279).
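Behind the dialog box, an indicator contrast just turns the categories into dummy variables, with the reference category scoring zero on all of them. A quick Python sketch (the three category labels below are made up for illustration):

```python
# Minimal sketch of indicator (dummy) coding with the FIRST category
# as the reference/control. Category labels are hypothetical.
def indicator_code(value, categories):
    """One dummy variable per non-reference category; the first is the reference."""
    return [1 if value == c else 0 for c in categories[1:]]

groups = ["control", "low_dose", "high_dose"]  # assumed labels
indicator_code("control", groups)     # reference category -> all zeros
indicator_code("high_dose", groups)   # one dummy switched on
```

Selecting Last instead of First would simply move the all-zeros pattern to the final category.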
You can specify main effects and interactions.
Highlight both predictors, then click the >a*b> button to add their interaction.
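Under the hood, the a*b interaction term is just the product of the two predictors, entered alongside their main effects. A sketch:

```python
# Sketch: one design row for a model with main effects a and b plus
# their interaction. The a*b term is simply the product of the two.
def design_row(a, b):
    return [1.0, a, b, a * b]  # intercept, main effect a, main effect b, a*b
```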
If you don’t have previous literature to guide entry of predictors, choose Stepwise Forward: LR (LR = Likelihood Ratio).
The Hosmer-Lemeshow goodness-of-fit test assesses how well the model fits the data.
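The Hosmer-Lemeshow test ranks cases by predicted probability, splits them into groups (usually 10), and compares observed with expected event counts in each group. A minimal pure-Python sketch of the statistic (the p-value, from a chi-square with g − 2 df, is omitted for brevity):

```python
# Sketch: Hosmer-Lemeshow chi-square statistic.
# Small values (relative to a chi-square with g-2 df) indicate good fit.
def hosmer_lemeshow(y, p, g=10):
    pairs = sorted(zip(p, y))            # rank cases by predicted probability
    n = len(pairs)
    stat = 0.0
    for i in range(g):
        group = pairs[i * n // g:(i + 1) * n // g]
        if not group:
            continue
        observed = sum(outcome for _, outcome in group)
        expected = sum(prob for prob, _ in group)
        m = len(group)
        p_bar = expected / m
        if 0.0 < p_bar < 1.0:
            stat += (observed - expected) ** 2 / (m * p_bar * (1.0 - p_bar))
    return stat
```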
Look for outliers: standardized residuals beyond +/- 2 SD.
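A standardized (Pearson) residual for a logistic model can be sketched as below; cases beyond roughly +/- 2 deserve a closer look:

```python
import math

# Sketch: standardized (Pearson) residual for one case --
# (observed - predicted) scaled by the SD of the prediction.
def pearson_residual(y, p):
    return (y - p) / math.sqrt(p * (1.0 - p))

# A case the model got badly wrong stands well outside +/- 2:
pearson_residual(0, 0.9)
```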
Request the 95% CI for the odds ratio (the change in the odds of Y occurring for a unit change in the predictor).
Initially the model will always predict the category with the highest frequency; in this case it selects the intervention (treated).
Large values for -2 Log Likelihood (-2 LL) indicate a poorly fitting model. The -2 LL gets smaller as the fit improves.
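The -2 LL is computed from the observed outcomes and the model's predicted probabilities; a short sketch:

```python
import math

# Sketch: -2 log-likelihood for 0/1 outcomes y and predicted
# probabilities p. Smaller values mean a better-fitting model.
def neg2_log_likelihood(y, p):
    return -2.0 * sum(yi * math.log(pi) + (1 - yi) * math.log(1 - pi)
                      for yi, pi in zip(y, p))
```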
Using only the constant, the model above predicts a 57% probability of Y occurring.
See p 288 for an example of using the equation to compute a predicted probability.
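The equation in question is the standard logistic function; a generic sketch (the coefficients here are placeholders, not the book's fitted values):

```python
import math

# Sketch: P(Y) = 1 / (1 + e^-(b0 + b1*X1 + ... + bn*Xn)).
def predict_probability(b0, bs, xs):
    z = b0 + sum(b * x for b, x in zip(bs, xs))
    return 1.0 / (1.0 + math.exp(-z))
```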
We can say that the odds of a patient who is treated being cured are 3.41 times higher than those of a patient who is not treated, with a 95% CI of 1.561 to 7.480.
The important thing about this confidence interval is that it doesn’t cross 1 (both values are greater than 1). Values greater than 1 mean that as the predictor variable(s) increase, so do the odds of (in this case) being cured; values less than 1 mean the opposite: as the predictor increases, the odds of being cured decrease.
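The odds ratio is exp(b) and its 95% CI is exp(b ± 1.96 × SE). Working backwards from the interval quoted above (1.561 to 7.480) gives b ≈ 1.229 and SE ≈ 0.400; these reconstructed values are approximate, not the book's printed output:

```python
import math

# Sketch: odds ratio exp(b) and its 95% CI exp(b +/- z*se).
def odds_ratio_with_ci(b, se, z=1.96):
    return math.exp(b), math.exp(b - z * se), math.exp(b + z * se)

# Approximate values reconstructed from the reported CI (1.561, 7.480):
or_, lo, hi = odds_ratio_with_ci(1.2288, 0.3997)
```

Because the lower limit stays above 1, the direction of the effect is trustworthy across the interval.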
Removing Intervention from the model would have a significant effect on its predictive ability; in other words, removing it would make the model much worse.
The further a predicted probability is from .5, the better. The .5 line represents a coin toss: a 50/50 chance.
If the model fits the data, then the histogram should show all of the cases for which the event has occurred on the right hand side (C), and all the cases for which the event hasn’t occurred on the left hand side (N).
This model is better at predicting cured cases than non-cured cases, as the non-cured cases sit closer to the .5 line.
Use the Case Summaries function to create a table of the first 15 cases showing the values of Cured, Intervention, Duration, the predicted probability (PRE_1) and the predicted group membership (PGR_1).
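Predicted group membership (SPSS's PGR_1) is just the predicted probability (PRE_1) cut at .5; a sketch, with made-up probabilities:

```python
# Sketch: classify each case by cutting its predicted probability at 0.5,
# mirroring how SPSS derives PGR_1 from PRE_1.
def predicted_group(p, cutoff=0.5):
    return 1 if p >= cutoff else 0

probs = [0.72, 0.43, 0.57, 0.12]              # hypothetical PRE_1 values
groups = [predicted_group(p) for p in probs]  # corresponding PGR_1 values
```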