
Bayesian Linear Regression




  1. Bayesian Linear Regression
     In regression analysis, we look at the conditional distribution of the response variable at different levels of a predictor variable.

  2. Bayesian Linear Regression
     • Response variable
       - Also called the “dependent” or “outcome” variable
       - What we want to explain or predict
       - In simple linear regression, the response variable is continuous
     • Predictor variables
       - Also called “independent” variables or “covariates”
       - In simple linear regression, the predictor variable is usually also continuous
     • Which variable we define as the response and which as the predictor depends on our research question

  3. Quick review of linear functions
     Y = β0 + β1X
     Y is a response variable that is a linear function of the predictor variable X
     β0: intercept; the value of Y when X = 0
     β1: slope; how much Y changes when X increases by 1 unit

  4. Intro to Bayesian simple linear regression
     Likelihood: yi ~ Normal(β0 + β1xi, σ²)
     Prior distributions on β0, β1, and σ²
     The posterior distribution is not straightforward to obtain in closed form
     We have to implement MCMC techniques with WinBUGS
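     A minimal WinBUGS-style sketch of this model might look as follows; the data names y, x, N and the vague prior values are illustrative assumptions, not taken from the slides:

         model {
           for (i in 1:N) {
             y[i] ~ dnorm(mu[i], tau)        # likelihood: normal response
             mu[i] <- beta0 + beta1 * x[i]   # linear predictor
           }
           beta0 ~ dnorm(0, 0.001)           # vague prior on the intercept (precision 0.001)
           beta1 ~ dnorm(0, 0.001)           # vague prior on the slope
           tau ~ dgamma(0.01, 0.01)          # vague prior on the residual precision
           sigma <- 1 / sqrt(tau)            # residual standard deviation, for reporting
         }

     MCMC samples drawn from this model in WinBUGS approximate the joint posterior of β0, β1, and σ.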

  5. Examples
     Willingness to Pay for Environmental Improvements in a Large City
     For example, we can study the social benefits of a set of environmental and urban improvements planned for the waterfront of the City of Valencia (Spain):
     Response variable: How much are you willing to pay for this policy?
     Covariates: Sex, Age, Income
     Data: 80 individuals

  6. Examples

  7. Discrete choice experiment
     Random Utility Model
     Probit Model
     Logit Model

  8. Objectives:
     • Revealed preference models use random utilities
     • Probit models assume that the utilities are multivariate normal
     • Probit MCMC generates latent, random utilities
     • Logit models assume that the random utilities have extreme value distributions
     • Logit MCMC uses the Metropolis-Hastings algorithm

  9. Random Utility Model
     Utility for alternative m is:
     Ui,j,m = Vi,j,m + εi,j,m
     where there are n subjects (the sample), M+1 alternatives in the choice set, and Ji choice occasions for subject i

  10. Subject picks alternative k if Ui,j,k > Ui,j,m for all m ≠ k
      The probability of selecting k is P(Ui,j,k > Ui,j,m for all m ≠ k)
      Statistical Models:
      {εi,j,m} are Normal → Probit Model
      {εi,j,m} are Extreme Value → Logit Model
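      A note on why the two error assumptions matter: with extreme value errors the selection probability has the familiar closed form below, while with normal errors it is a multivariate normal integral with no closed form, which is why the probit MCMC of slide 8 works with the latent utilities directly. Writing Vi,j,m for the systematic part of utility, the logit choice probability is

          P(subject i picks k on occasion j) = exp(Vi,j,k) / Σm exp(Vi,j,m)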

  11. Logit model in WinBUGS
      Likelihood:
      Prior distributions:
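      A sketch of what this likelihood and these priors typically look like in WinBUGS for a binary (Yes/No) choice; the names y, x, N, K and the prior precisions are assumptions for illustration:

          model {
            for (i in 1:N) {
              y[i] ~ dbern(p[i])                      # observed Yes/No choice
              logit(p[i]) <- inprod(beta[], x[i, ])   # linear predictor on the logit scale
            }
            for (k in 1:K) {
              beta[k] ~ dnorm(0, 0.001)               # vague normal priors on the coefficients
            }
          }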

  12. Logit model in WinBUGS
      Example: Discrete choice experiment
      To study the value that car consumers place upon environmental concerns when purchasing a car
      Response variable: Yes/No
      Attributes: safety (Yes/No), carbon dioxide emissions, acceleration from 0 to 100 km/h (< 10 sec. and < 7.5 sec.), second hand, and annual cost (900€, 1400€, 2000€)
      Sample size: 150

  13. Logit model in WinBUGS

  14. Probit model in WinBUGS
      Likelihood:
      Prior distributions:
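      A comparable WinBUGS-style sketch for the probit version; WinBUGS can use the probit link directly, which is equivalent to the latent-utility formulation mentioned in slide 8. Again, y, x, N, K and the prior values are illustrative assumptions:

          model {
            for (i in 1:N) {
              y[i] ~ dbern(p[i])                      # observed choice
              probit(p[i]) <- inprod(beta[], x[i, ])  # normal link instead of the logistic link
            }
            for (k in 1:K) {
              beta[k] ~ dnorm(0, 0.001)               # vague normal priors on the coefficients
            }
          }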

  15. Probit model in WinBUGS

  16. Hierarchical Logit
      The hierarchical logistic regression model is a very easy extension of the standard logit.
      Likelihood:
      yij ~ Bernoulli(pij), logit(pij) <- b1j + b2j x2ij + … + bkj xkij
      Priors:
      bkj ~ N(Bkj, Tk) for all j, k
      Bkj <- γk1 + γk2 zj2 + … + γkm zjm
      γqr ~ N(0, .001) for all q, r
      Tk ~ Gamma(.01, .01)
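      The same hierarchy written as a WinBUGS-style model sketch, assuming (purely for illustration) three observation-level coefficients, one group-level covariate z2, J groups, and n[j] observations in group j:

          model {
            for (j in 1:J) {                          # groups
              for (i in 1:n[j]) {                     # observations within group j
                y[i, j] ~ dbern(p[i, j])
                logit(p[i, j]) <- b[1, j] + b[2, j] * x2[i, j] + b[3, j] * x3[i, j]
              }
              for (k in 1:3) {
                b[k, j] ~ dnorm(B[k, j], T[k])        # group-specific coefficients
                B[k, j] <- gamma[k, 1] + gamma[k, 2] * z2[j]   # group-level regression
              }
            }
            for (k in 1:3) {
              for (r in 1:2) {
                gamma[k, r] ~ dnorm(0, 0.001)         # vague priors on the hyper-coefficients
              }
              T[k] ~ dgamma(0.01, 0.01)               # vague priors on the precisions
            }
          }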
