
A (poor) Gibbs Sampling Approach to Logistic Regression


Data

- Simulated from known parameter values (one covariate, 'dose').
- 'Rats' given different doses of an imaginary chemical: 4 dose groups with 25 rats in each group.
- Data generated three times under different parameter values; three chains run for each data set.
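The simulation described above can be sketched as follows; the dose levels and true beta values here are illustrative assumptions, since the actual values used in the talk are not shown on the slide.

```python
import numpy as np

rng = np.random.default_rng(0)

# 4 dose groups, 25 'rats' per group; dose levels and true betas are
# illustrative assumptions, not the values used in the presentation.
doses = np.array([0.0, 1.0, 2.0, 3.0])
x = np.repeat(doses, 25)                    # covariate 'dose', one value per rat
p = 1.0 / (1.0 + np.exp(-(1.0 - 0.5 * x)))  # inverse-logit success probability
y = rng.binomial(1, p)                      # 0/1 outcome for each of the 100 rats
print(x.shape, int(y.sum()))
```

Repeating this with three different (beta1, beta2) pairs yields the three data sets used in the study.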

Gibbs Sampling For Logistic Data?

- Traditionally: binomial likelihood, prior on the logit.
- The full conditionals have no closed form.
- Attractive, however, because it eliminates the need to reject iterations (as Metropolis-Hastings does).
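Concretely, with a flat prior the posterior for the group-level (binomial) formulation is:

```latex
% r_i successes out of n_i rats at dose x_i, logit link
\pi(\beta_1,\beta_2 \mid r) \;\propto\; \prod_{i=1}^{N}
  p_i^{\,r_i}\,(1-p_i)^{\,n_i-r_i},
\qquad
p_i = \frac{\exp(\beta_1+\beta_2 x_i)}{1+\exp(\beta_1+\beta_2 x_i)} .
```

Neither beta's full conditional is a recognizable density here, so a direct Gibbs draw is unavailable without the latent-variable construction of the next section.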

Algorithm

- Groenewald and Mokgatlhe, 2005
- Create uniform latent variables U[i,j] based on Y[i,j] = 0, 1
- Draw from the joint posterior of the betas and the U[i,j]
- pi[i] = P(Uniform(0,1) <= logit^-1(Beta * x[i]))

- Written in R, refined in Python
- Very inefficient
- A new latent draw for each Y[i,j] at each iteration
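The steps above can be sketched as a single-site Gibbs sampler. This is not the speaker's actual R/Python code; it is a hedged reconstruction of the Groenewald-Mokgatlhe latent-uniform scheme for one covariate plus intercept, with the flat prior truncated to [-20, 20] (my assumption, to keep each uniform draw proper).

```python
import numpy as np

def logit(p):
    return np.log(p / (1.0 - p))

def inv_logit(z):
    return 1.0 / (1.0 + np.exp(-z))

def gibbs_logistic(x, y, n_iter=500, rng=None):
    """Latent-uniform Gibbs for a logistic regression with intercept + slope."""
    rng = rng if rng is not None else np.random.default_rng(0)
    X = np.column_stack([np.ones(len(x)), np.asarray(x, dtype=float)])
    beta = np.zeros(2)
    draws = np.empty((n_iter, 2))
    for t in range(n_iter):
        eta = X @ beta
        p = inv_logit(eta)
        # Latent step: U[i] | beta, Y is Uniform(0, p_i) if Y=1, Uniform(p_i, 1) if Y=0
        u = np.where(y == 1, rng.uniform(0.0, p), rng.uniform(p, 1.0))
        c = logit(u)  # constraints: X@beta >= c where Y=1, X@beta <= c where Y=0
        for k in range(2):
            rest = eta - X[:, k] * beta[k]      # contribution of the other coordinate
            lo, hi = -20.0, 20.0                # truncated flat prior (assumption)
            for i in range(len(y)):
                xk = X[i, k]
                if xk == 0.0:
                    continue
                bound = (c[i] - rest[i]) / xk
                if (y[i] == 1) == (xk > 0.0):
                    lo = max(lo, bound)         # constraint gives a lower bound on beta_k
                else:
                    hi = min(hi, bound)         # constraint gives an upper bound on beta_k
            beta[k] = rng.uniform(lo, hi)       # uniform full conditional: no rejection
            eta = rest + X[:, k] * beta[k]
        draws[t] = beta
    return draws

# Tiny demo on simulated dose data (true values chosen for illustration)
rng = np.random.default_rng(1)
x = np.repeat([0.0, 1.0, 2.0, 3.0], 25)
y = rng.binomial(1, inv_logit(0.5 + 1.0 * x))
draws = gibbs_logistic(x, y, n_iter=500, rng=rng)
print(draws[250:].mean(axis=0))
```

Each beta's full conditional given the latent uniforms is itself uniform over an interval, which is why no iterations are rejected; the inner per-observation loop also makes plain why the method is so inefficient.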

Implementation

- Three data sets
- Three chains per set
- 1 million iterations per chain
- Last 500,000 iterations sent to CODA
- 9 million total iterations, 4.5 million analyzed

WinBUGS Model

- Y[i,j]'s given a binomial (instead of Bernoulli) likelihood
- Betas regressed on the logit of the group proportions
- Locally uniform priors on beta1 and beta2

WinBUGS Model

    model {
        for (i in 1:N) {
            r[i] ~ dbin(p[i], n[i])
            logit(p[i]) <- beta1 + beta2 * (x[i] - mean(x[]))
            r.hat[i] <- p[i] * n[i]
        }
        beta1 ~ dflat()
        beta2 ~ dflat()
        beta1nocenter <- beta1 - beta2 * mean(x[])
    }

WinBUGS Output: Beta0 (1,0)

WinBUGS Output: Beta0 (1,1)

WinBUGS Output: Beta0 (1,-2)

WinBUGS Wins

- Uses group proportions instead of individual Y[i,j]'s
- Convergence is better
- WinBUGS appears more precise (more trials needed to confirm)
- Also much faster

Resources

- Groenewald, Pieter C.N., and Lucky Mokgatlhe. "Bayesian computation for logistic regression." Computational Statistics & Data Analysis 48 (2005): 857-868. <http://www.sciencedirect.com/>.
- Professor Cowles
