
Simple Linear Regression


Presentation Transcript


  1. Simple Linear Regression AMS 572 11/29/2010

  2. Outline • Brief History and Motivation – Zhen Gong • Simple Linear Regression Model – Wenxiang Liu • Ordinary Least Squares Method – Ziyan Lou • Goodness of Fit of LS Line – Yixing Feng • OLS Example – Lingbin Jin • Statistical Inference on Parameters – Letan Lin • Statistical Inference Example – Emily Vo • Regression Diagnostics – Yang Liu • Correlation Analysis – Andrew Candela • Implementation in SAS – Joseph Chisari

  3. Brief History and Introduction Legendre published the earliest form of regression, the method of least squares, in 1805; Gauss published the same method in 1809. Francis Galton extended the method in the 19th century to describe a biological phenomenon. Karl Pearson and Udny Yule extended it to a more general statistical context around the turn of the 20th century.

  4. Motivation for Regression Analysis • Regression analysis is a statistical methodology to estimate the relationship of a response variable to a set of predictor variables. • When there is just one predictor variable, we use simple linear regression; when there are two or more predictor variables, we use multiple linear regression. • Goal: given a newly observed predictor value x, predict the response Y.

  5. Motivation for Regression Analysis 2010 Milan: Horsepower at 6000 rpm: 175; Highway gasoline consumption: 0.0326 gallon per mile. 2010 Fusion: Horsepower at 6000 rpm: 263; Highway gasoline consumption: ? 2010 Camry: Horsepower at 6000 rpm: 169; Highway gasoline consumption: 0.03125 gallon per mile. Response variable (Y): Highway gasoline consumption. Predictor variable (X): Horsepower at 6000 rpm.

  6. Simple Linear Regression Model • A summary of the relationship between a dependent variable (or response variable) Y and an independent variable (or covariate) X. • Y is assumed to be a random variable, while X, even if it is a random variable, is conditioned on (treated as fixed). Essentially, we are interested in the behavior of Y given that we know X = x.

  7. Good Model • Regression models attempt to minimize the vertical distance between each observation point and the model line (or curve). • The length of this vertical segment is called the residual, modeling error, or simply error. • Requiring only that the negative and positive errors cancel out (zero overall error) is not enough: many lines satisfy this criterion.

  8. Good Model

  9. Probabilistic Model • In simple linear regression, the population regression line is given by E(Y) = β0 + β1x. • The actual values of Y are assumed to be the sum of the mean value, E(Y), and a random error term ∊: Y = E(Y) + ∊ = β0 + β1x + ∊. • At any given value of x, the dependent variable Y ~ N(β0 + β1x, σ²).
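
  To make the probabilistic model concrete, here is a minimal simulation sketch in Python; the parameter values (β0 = 10, β1 = 2, σ = 1.5) and variable names are illustrative assumptions, not from the slides:

      import numpy as np

      rng = np.random.default_rng(0)

      beta0, beta1, sigma = 10.0, 2.0, 1.5        # illustrative parameter values
      x = np.linspace(0.0, 10.0, 50)              # x is conditioned on (treated as fixed)
      eps = rng.normal(0.0, sigma, size=x.size)   # random error term, N(0, sigma^2)
      y = beta0 + beta1 * x + eps                 # Y = E(Y) + error = beta0 + beta1*x + error

  At each fixed x, the simulated Y values are normal with mean beta0 + beta1*x and variance sigma^2, exactly as the slide states.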

  10. Least Squares (LS) Fit Boiling Point of Water in the Alps

  11. Least Squares (LS) Fit Find the line ŷ = β̂0 + β̂1x that represents the "best" linear relationship.

  12. Least Squares (LS) Fit • Problem: the data do not fall exactly on a line. • We are looking for the line that minimizes the sum of squared vertical deviations: Q = Σ [yi − (β0 + β1xi)]².

  13. Least Squares (LS) Fit • To find the parameters that minimize the sum of squared differences, take the partial derivative of Q with respect to each parameter and set it equal to zero: ∂Q/∂β0 = −2 Σ (yi − β0 − β1xi) = 0 and ∂Q/∂β1 = −2 Σ xi(yi − β0 − β1xi) = 0.

  14. Least Squares (LS) Fit Solving these two equations (the normal equations) gives β̂1 = Σ (xi − x̄)(yi − ȳ) / Σ (xi − x̄)² and β̂0 = ȳ − β̂1x̄.
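
  These two formulas translate directly into a short Python sketch (the function name and the example data are my own placeholders):

      import numpy as np

      def ls_fit(x, y):
          """Least squares estimates for the simple linear regression line."""
          xbar, ybar = x.mean(), y.mean()
          beta1_hat = np.sum((x - xbar) * (y - ybar)) / np.sum((x - xbar) ** 2)
          beta0_hat = ybar - beta1_hat * xbar
          return beta0_hat, beta1_hat

      # Example usage with made-up data
      x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
      y = np.array([2.1, 4.3, 5.9, 8.2, 9.8])
      b0, b1 = ls_fit(x, y)
      print(f"fitted line: yhat = {b0:.3f} + {b1:.3f} x")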

  15. Least Squares (LS) Fit To simplify, we introduce the notation Sxx = Σ (xi − x̄)², Syy = Σ (yi − ȳ)², and Sxy = Σ (xi − x̄)(yi − ȳ), so that β̂1 = Sxy/Sxx. • The resulting equation ŷ = β̂0 + β̂1x is known as the least squares line, which is an estimate of the true regression line.

  16. Goodness of Fit of the LS Line The fitted values are ŷi = β̂0 + β̂1xi, and the residuals are ei = yi − ŷi. The residuals are used to evaluate the goodness of fit of the LS line.

  17. Goodness of Fit of the LS Line The error sum of squares is SSE = Σ (yi − ŷi)² = Σ ei². The total sum of squares is SST = Σ (yi − ȳ)². The regression sum of squares is SSR = Σ (ŷi − ȳ)². These satisfy SST = SSR + SSE.

  18. Goodness of Fit of the LS Line • The coefficient of determination is r² = SSR/SST = 1 − SSE/SST, which is always between 0 and 1. • The sample correlation coefficient between X and Y is r = Sxy / √(Sxx Syy). For simple linear regression, the square of the sample correlation coefficient equals the coefficient of determination.

  19. Estimation of the variance σ² The variance σ² measures the scatter of the Yi around their means μi = β0 + β1xi. An unbiased estimate of σ² is given by s² = SSE/(n − 2) = Σ ei²/(n − 2). This estimate of σ² has n − 2 degrees of freedom.
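
  Continuing the sketch above, the goodness-of-fit quantities and the variance estimate take only a few lines of Python (the function and variable names are again my own):

      import numpy as np

      def goodness_of_fit(x, y, b0, b1):
          """R^2, sample correlation r, and the unbiased variance estimate s^2."""
          yhat = b0 + b1 * x                 # fitted values
          e = y - yhat                       # residuals
          sse = np.sum(e ** 2)               # error sum of squares
          sst = np.sum((y - y.mean()) ** 2)  # total sum of squares
          r2 = 1.0 - sse / sst               # coefficient of determination
          r = np.sign(b1) * np.sqrt(r2)      # sample correlation (same sign as slope)
          s2 = sse / (len(x) - 2)            # unbiased estimate of sigma^2, n-2 df
          return r2, r, s2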

  20. Implementing the OLS method on Problem 10.4 The time between eruptions of the Old Faithful geyser in Yellowstone National Park is random but is related to the duration of the last eruption. The table below shows these times for 21 consecutive eruptions.

  21. Implementing the OLS method on Problem 10.4 A scatter plot of NEXT vs. LAST

  22. Implementing the OLS method on Problem 10.4

  23. Implementing the OLS method on Problem 10.4 When x = 3, ŷ ≈ 60. We could say that LAST is a good predictor of NEXT.

  24. Statistical Inference on β0 and β1 • Final Result: β̂0 and β̂1 are normally distributed, with β̂1 ~ N(β1, σ²/Sxx) and β̂0 ~ N(β0, σ²[1/n + x̄²/Sxx]).

  25. Statistical Inference on β0 and β1 • Derivation: Treat the xi's as fixed and write β̂1 = Σ ci Yi with ci = (xi − x̄)/Sxx. Thus β̂1 is a linear combination of the independent normal random variables Yi, and so is itself normally distributed.

  26. Statistical Inference on β0 and β1 • Derivation (continued): E(β̂1) = Σ ci E(Yi) = Σ ci (β0 + β1xi) = β1, using Σ ci = 0 and Σ ci xi = 1.

  27. Statistical Inference on β0 and β1 • Derivation (continued): Var(β̂1) = Σ ci² Var(Yi) = σ² Σ ci² = σ²/Sxx. The mean and variance of β̂0 follow similarly from β̂0 = Ȳ − β̂1x̄.

  28. Statistical Inference on β0 and β1 Since σ² is unknown, we estimate it by s². • Pivotal Quantities (P.Q.): (β̂1 − β1)/SE(β̂1) ~ t(n−2) with SE(β̂1) = s/√Sxx, and (β̂0 − β0)/SE(β̂0) ~ t(n−2) with SE(β̂0) = s√(1/n + x̄²/Sxx). • Confidence Intervals (CI's): β̂1 ± t(n−2, α/2) SE(β̂1) and β̂0 ± t(n−2, α/2) SE(β̂0) are 100(1 − α)% CI's for β1 and β0.

  29. Statistical Inference on β0 and β1 • Hypothesis tests: Reject H0: β1 = β1⁰ at level α if |t| = |β̂1 − β1⁰| / SE(β̂1) > t(n−2, α/2). • A useful application is to test whether there is a linear relationship between x and y: H0: β1 = 0 vs. H1: β1 ≠ 0. Reject H0 at level α if |t| = |β̂1| / SE(β̂1) > t(n−2, α/2). • One-sided alternative hypotheses can be tested using a one-sided t-test. A sketch of the two-sided test in code follows below.
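
  A minimal sketch of the slope test in Python, reusing the hypothetical ls_fit and goodness_of_fit helpers from the earlier sketches (scipy's t distribution supplies the critical value; the names are my own):

      import numpy as np
      from scipy import stats

      def slope_test(x, y, alpha=0.05):
          """t-test of H0: beta1 = 0 against H1: beta1 != 0."""
          n = len(x)
          b0, b1 = ls_fit(x, y)                       # LS estimates (earlier sketch)
          _, _, s2 = goodness_of_fit(x, y, b0, b1)    # s^2 = SSE/(n-2)
          sxx = np.sum((x - x.mean()) ** 2)
          se_b1 = np.sqrt(s2 / sxx)                   # standard error of beta1-hat
          t = b1 / se_b1                              # test statistic
          t_crit = stats.t.ppf(1 - alpha / 2, n - 2)  # two-sided critical value
          return t, t_crit, abs(t) > t_crit           # reject H0 if |t| > t_crit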

  30. Analysis of Variance (ANOVA) • Mean Square: a sum of squares divided by its degrees of freedom, e.g. MSR = SSR/1 and MSE = SSE/(n − 2). • Under H0: β1 = 0, the ratio F = MSR/MSE ~ F(1, n−2) gives an equivalent test of the linear relationship (F = t²).

  31. Analysis of Variance (ANOVA) • ANOVA Table:
      Source        SS     df       MS                   F
      Regression    SSR    1        MSR = SSR/1          MSR/MSE
      Error         SSE    n − 2    MSE = SSE/(n − 2)
      Total         SST    n − 1
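
  As a sketch, the table's entries can be computed as follows (reusing the hypothetical ls_fit helper from earlier; scipy supplies the F distribution):

      import numpy as np
      from scipy import stats

      def anova_table(x, y):
          """SSR, SSE, SST and the F statistic for simple linear regression."""
          n = len(x)
          b0, b1 = ls_fit(x, y)
          yhat = b0 + b1 * x
          ssr = np.sum((yhat - y.mean()) ** 2)   # regression sum of squares, 1 df
          sse = np.sum((y - yhat) ** 2)          # error sum of squares, n-2 df
          msr, mse = ssr / 1.0, sse / (n - 2)
          f = msr / mse                          # F ~ F(1, n-2) under H0: beta1 = 0
          p = stats.f.sf(f, 1, n - 2)            # p-value
          return ssr, sse, ssr + sse, f, p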

  32. Statistical Inference Example – Testing for Linear Relationship • Problem 10.4 At α = 0.05, is there a linear trend between the time to the NEXT eruption and the duration of the LAST eruption? H0: β1 = 0 vs. H1: β1 ≠ 0. Reject H0 if |t| > t(19, .025) = 2.093, where t = β̂1 / SE(β̂1) (here n = 21, so n − 2 = 19 degrees of freedom).

  33. Statistical Inference – Hypothesis Testing Solution: Since |t| > 2.093, we reject H0 and therefore conclude that there is a linear relationship between NEXT and LAST.

  34. Statistical Inference Example – Confidence and Prediction Intervals • Problem 10.11 from Tamhane & Dunlop, Statistics and Data Analysis 10.11 (a) Calculate a 95% PI for the time to the next eruption if the last eruption lasted 3 minutes.

  35. Problem 10.11 – Prediction Interval Solution: The formula for a 100(1 − α)% PI for a future observation Y at x = x* is given by ŷ* ± t(n−2, α/2) s √(1 + 1/n + (x* − x̄)²/Sxx), where ŷ* = β̂0 + β̂1x*.

  36. Problem 10.11 - Prediction Interval

  37. Problem 10.11 – Confidence Interval 10.11 (b) Calculate a 95% CI for the mean time to the next eruption for a last eruption lasting 3 minutes. Compare this confidence interval with the PI obtained in (a).

  38. Problem 10.11 – Confidence Interval Solution: The formula for a 100(1 − α)% CI for the mean response at x = x* is given by ŷ* ± t(n−2, α/2) s √(1/n + (x* − x̄)²/Sxx), where ŷ* = β̂0 + β̂1x*. The 95% CI is [57.510, 63.257]. The CI is shorter than the PI because it accounts only for the uncertainty in estimating the mean response, not the additional variability of a single future observation.
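
  A sketch of both intervals in Python, again building on the hypothetical ls_fit and goodness_of_fit helpers from the earlier sketches (xstar is the new predictor value, e.g. 3 minutes here):

      import numpy as np
      from scipy import stats

      def ci_and_pi(x, y, xstar, alpha=0.05):
          """CI for the mean response and PI for a future observation at xstar."""
          n = len(x)
          b0, b1 = ls_fit(x, y)
          _, _, s2 = goodness_of_fit(x, y, b0, b1)
          s, sxx = np.sqrt(s2), np.sum((x - x.mean()) ** 2)
          yhat = b0 + b1 * xstar                      # point estimate at xstar
          t_crit = stats.t.ppf(1 - alpha / 2, n - 2)
          half_ci = t_crit * s * np.sqrt(1 / n + (xstar - x.mean()) ** 2 / sxx)
          half_pi = t_crit * s * np.sqrt(1 + 1 / n + (xstar - x.mean()) ** 2 / sxx)
          return (yhat - half_ci, yhat + half_ci), (yhat - half_pi, yhat + half_pi)

  The extra "1 +" under the square root is exactly what makes the PI wider than the CI.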

  39. Regression Diagnostics • Checking the Model Assumptions: • the mean of Y is a linear function of x • the error variance σ² is the same for all x • the errors are normally distributed • the errors are independent (for time series data) • Checking for Outliers and Influential Observations

  40. Checking the Model Assumptions • Residuals: ei = yi − ŷi • The ei can be viewed as the "estimates" of the random errors ∊i.

  41. Checking for Linearity • If the regression of Y on x is linear, then the plot of the residuals ei vs. the xi should exhibit random scatter around zero, as in the sketch below.
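
  A minimal matplotlib sketch of this check (assuming the fitted line comes from the hypothetical ls_fit helper defined earlier):

      import matplotlib.pyplot as plt

      def plot_residuals(x, y):
          """Residuals vs. x; a linear model should show random scatter around 0."""
          b0, b1 = ls_fit(x, y)
          e = y - (b0 + b1 * x)              # residuals
          plt.scatter(x, e)
          plt.axhline(0.0, linestyle="--")   # reference line at zero
          plt.xlabel("x")
          plt.ylabel("residual e")
          plt.show()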

  42. Checking for Linearity (figure: scatter plot of y vs. x for the Tire Wear Data)

  43. Checking for Linearity (figure: plot of residual vs. x for the Tire Wear Data)

  44. Checking for Linearity • Data Transformation: if the residual plot shows curvature, a transformation of x and/or y (e.g. log or reciprocal) can restore linearity.

  45. Checking for Constant Variance • If the constant variance assumption is correct, the dispersion of the residuals ei is approximately constant with respect to the fitted values ŷi.

  46. Checking for Constant Variance (figure: plot of residual e vs. fitted value, Example 10.21 from the textbook)

  47. Checking for Normality • We can use the residuals to make a normal plot. (figure: normal plot of residuals, Example 10.21 from the textbook)

  48. Checking for Outliers • Definition: An outlier is an observation that does not follow the general pattern of the relationship between x and y. • A large residual indicates an outlier!

  49. Checking for Influential Observations • An observation can be influential because it has an extreme x-value, an extreme y-value, or both. • A large leverage hii indicates an influential observation! A common rule of thumb flags observations with hii > 2(k + 1)/n, where k = # of predictors. A sketch of the computation follows below.
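
  For simple linear regression (k = 1), the leverages have a closed form, hii = 1/n + (xi − x̄)²/Sxx; a minimal sketch (function name is mine):

      import numpy as np

      def leverages(x):
          """Leverage h_ii for simple linear regression: h = 1/n + (x-xbar)^2/Sxx."""
          n = len(x)
          sxx = np.sum((x - x.mean()) ** 2)
          h = 1.0 / n + (x - x.mean()) ** 2 / sxx
          flagged = h > 2 * (1 + 1) / n      # rule of thumb with k = 1 predictor
          return h, flagged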

  50. Checking for Influential Observations
