
Chapter 13



Presentation Transcript


  1. Chapter 13 Simple Linear Regression Analysis

  2. Simple Linear Regression Analysis 13.1 The Simple Linear Regression Model and the Least Squares Point Estimates 13.2 Model Assumptions and the Standard Error 13.3 Testing the Significance of Slope and y-Intercept 13.4 Confidence and Prediction Intervals 13.5 Simple Coefficients of Determination and Correlation

  3. Simple Linear Regression Analysis Continued 13.6 Testing the Significance of the Population Correlation Coefficient (Optional) 13.7 An F Test for the Model 13.8 The QHIC Case 13.9 Residual Analysis (Optional) 13.10 Some Shortcut Formulas (Optional)

  4. LO 1: Explain the simple linear regression model. 13.1 The Simple Linear Regression Model and the Least Squares Point Estimates • The dependent (or response) variable is the variable we wish to understand or predict • The independent (or predictor) variable is the variable we will use to understand or predict the dependent variable • Regression analysis is a statistical technique that uses observed data to relate the dependent variable to one or more independent variables • The objective is to build a regression model that can describe, predict and control the dependent variable based on the independent variable

  5. LO 2: Find the least squares point estimates of the slope and y-intercept. The Least Squares Point Estimates • Estimation/prediction equation: ŷ = b0 + b1x • Least squares point estimate of the slope β1: b1 = SSxy / SSxx, where SSxy = Σ(xi − x̄)(yi − ȳ) and SSxx = Σ(xi − x̄)2 • Least squares point estimate of the y-intercept β0: b0 = ȳ − b1x̄
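As a concrete sketch of these formulas (using a small made-up data set, not one from the chapter), the least squares point estimates can be computed directly from the deviation sums of squares:

```python
# Toy data (hypothetical, for illustration only).
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(x)
x_bar = sum(x) / n
y_bar = sum(y) / n

# SS_xy and SS_xx from the definitions on the slide
ss_xy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
ss_xx = sum((xi - x_bar) ** 2 for xi in x)

b1 = ss_xy / ss_xx        # least squares slope estimate
b0 = y_bar - b1 * x_bar   # least squares intercept estimate
print(f"b1 = {b1:.4f}, b0 = {b0:.4f}")
```

The same estimates can be checked against any regression routine; the point of the sketch is that only the two sums of squares are needed.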

  6. LO 3: Describe the assumptions behind simple linear regression and calculate the standard error. 13.2 Model Assumptions and the Standard Error • Mean of Zero Assumption: At any given value of x, the population of potential error term values has a mean equal to zero • Constant Variance Assumption: At any given value of x, the population of potential error term values has a variance that does not depend on the value of x • Normality Assumption: At any given value of x, the population of potential error term values has a normal distribution • Independence Assumption: Any one value of the error term ε is statistically independent of any other value of ε

  7. LO3 Sum of Squares • Sum of squared errors: SSE = Σ(yi − ŷi)2 • Mean square error: s2 = SSE / (n − 2), the point estimate of the residual variance σ2 • Standard error: s = √s2, the point estimate of the residual standard deviation σ
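Continuing with the same made-up data, SSE, the mean square error, and the standard error can be sketched as follows (the fitted line is recomputed so the snippet stands alone):

```python
import math

# Toy data (hypothetical, for illustration only).
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n
ss_xx = sum((xi - x_bar) ** 2 for xi in x)
b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / ss_xx
b0 = y_bar - b1 * x_bar

# SSE: sum of squared residuals around the fitted line
sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
mse = sse / (n - 2)       # mean square error, point estimate of sigma^2
s = math.sqrt(mse)        # standard error, point estimate of sigma
print(f"SSE = {sse:.4f}, s = {s:.4f}")
```

Note the divisor n − 2: two degrees of freedom are used up estimating b0 and b1.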

  8. LO 4: Test the significance of the slope and y-intercept. 13.3 Testing the Significance of the Slope and y-Intercept • A regression model is not likely to be useful unless there is a significant relationship between x and y • To test significance, we use the null hypothesis H0: β1 = 0 • versus the alternative hypothesis Ha: β1 ≠ 0

  9. LO3 Testing the Significance of the Slope #2 • Test statistic: t = b1 / sb1, where sb1 = s / √SSxx • Reject H0: β1 = 0 in favor of Ha: β1 ≠ 0 if |t| > tα/2, where tα/2 is based on n − 2 degrees of freedom
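The slope test can be sketched numerically on the same hypothetical data; the critical value 3.182 below is t.025 with 3 degrees of freedom, taken from a standard t table:

```python
import math

# Toy data (hypothetical, for illustration only).
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n
ss_xx = sum((xi - x_bar) ** 2 for xi in x)
b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / ss_xx
b0 = y_bar - b1 * x_bar
sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
s = math.sqrt(sse / (n - 2))

s_b1 = s / math.sqrt(ss_xx)   # standard error of the slope estimate
t = b1 / s_b1                 # t statistic with n - 2 degrees of freedom
# Reject H0: beta1 = 0 at alpha = 0.05 if |t| > 3.182 (t table, 3 df)
print(f"t = {t:.2f}, reject H0: {abs(t) > 3.182}")
```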

  10. LO 5: Calculate and interpret a confidence interval for a mean value and a prediction interval for an individual value. 13.4 Confidence and Prediction Intervals • The point on the regression line corresponding to a particular value x0 of the independent variable x is ŷ = b0 + b1x0 • It is unlikely that this value will equal the mean value of y when x equals x0 • Therefore, we need to place bounds on how far the predicted value might be from the actual value • We can do this by calculating a confidence interval for the mean value of y and a prediction interval for an individual value of y
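Both intervals can be sketched at a hypothetical value x0 = 4 on the same made-up data. The "distance value" 1/n + (x0 − x̄)2/SSxx measures how far x0 is from the center of the x data; the prediction interval adds 1 to it, which is why it is always wider:

```python
import math

# Toy data (hypothetical, for illustration only).
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n
ss_xx = sum((xi - x_bar) ** 2 for xi in x)
b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / ss_xx
b0 = y_bar - b1 * x_bar
s = math.sqrt(sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y)) / (n - 2))

x0 = 4.0
y_hat = b0 + b1 * x0
t_crit = 3.182  # t.025 with n - 2 = 3 degrees of freedom (t table)

dist = 1 / n + (x0 - x_bar) ** 2 / ss_xx      # distance value
ci_half = t_crit * s * math.sqrt(dist)        # for the mean value of y
pi_half = t_crit * s * math.sqrt(1 + dist)    # for an individual value of y
print(f"CI: {y_hat:.3f} +/- {ci_half:.3f}")
print(f"PI: {y_hat:.3f} +/- {pi_half:.3f}")
```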

  11. LO 6: Calculate and interpret the simple coefficients of determination and correlation. 13.5 Simple Coefficient of Determination and Correlation • How useful is a particular regression model? • One measure of usefulness is the simple coefficient of determination • It is represented by the symbol r2 • This section may be read anytime after reading Section 13.1
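Computing r2 for the same made-up data, as the fraction of total variation explained by the fitted line:

```python
# Toy data (hypothetical, for illustration only).
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n
ss_xx = sum((xi - x_bar) ** 2 for xi in x)
b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / ss_xx
b0 = y_bar - b1 * x_bar

ss_yy = sum((yi - y_bar) ** 2 for yi in y)                     # total variation
sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))  # unexplained variation
r2 = 1 - sse / ss_yy   # simple coefficient of determination
print(f"r2 = {r2:.4f}")
```

An r2 near 1 means the line explains almost all of the variation in y; an r2 near 0 means it explains almost none.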

  12. LO 7: Test hypotheses about the population correlation coefficient (optional). 13.6 Testing the Significance of the Population Correlation Coefficient (Optional) • The simple correlation coefficient (r) measures the linear relationship between the observed values of x and y from the sample • The population correlation coefficient (ρ) measures the linear relationship between all possible combinations of observed values of x and y • r is an estimate of ρ
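A sketch of the test of H0: ρ = 0 on the same hypothetical data. The resulting t statistic is, algebraically, identical to the slope t statistic of Section 13.3, so the two tests always agree in simple regression:

```python
import math

# Toy data (hypothetical, for illustration only).
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n
ss_xx = sum((xi - x_bar) ** 2 for xi in x)
ss_yy = sum((yi - y_bar) ** 2 for yi in y)
ss_xy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))

r = ss_xy / math.sqrt(ss_xx * ss_yy)   # sample correlation coefficient
# Test H0: rho = 0 with a t statistic on n - 2 degrees of freedom
t = r * math.sqrt(n - 2) / math.sqrt(1 - r ** 2)
print(f"r = {r:.4f}, t = {t:.2f}")
```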

  13. LO 8: Test the significance of a simple linear regression model by using an F test. 13.7 An F Test for the Model • For simple regression, this is another way to test the null hypothesis H0: β1 = 0 • This is the only test we will use for multiple regression • The F test tests the significance of the overall regression relationship between x and y
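A numerical sketch (same made-up data) of the F statistic, explained variation divided by MSE on (1, n − 2) degrees of freedom, which also illustrates the simple-regression identity F = t2:

```python
import math

# Toy data (hypothetical, for illustration only).
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n
ss_xx = sum((xi - x_bar) ** 2 for xi in x)
b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / ss_xx
b0 = y_bar - b1 * x_bar

ss_yy = sum((yi - y_bar) ** 2 for yi in y)
sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
mse = sse / (n - 2)

# F = explained variation / MSE, on (1, n - 2) degrees of freedom
F = (ss_yy - sse) / mse
t = b1 / (math.sqrt(mse) / math.sqrt(ss_xx))  # slope t statistic
print(f"F = {F:.1f}, t^2 = {t ** 2:.1f}")     # in simple regression F equals t^2
```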

  14. LO 9: Use residual analysis to check the assumptions of simple linear regression (optional). 13.9 Residual Analysis (Optional) • Checks of the regression assumptions are performed by analyzing the regression residuals • Residuals (e) are defined as the difference between the observed value of y and the predicted value of y, e = y − ŷ • Note that e is the point estimate of ε • If the regression assumptions are valid, the population of potential error terms will be normally distributed with mean zero and variance σ2 • The different error terms will be statistically independent
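Residuals for the same hypothetical data can be sketched as follows; in practice one would plot them against x or ŷ and look for a horizontal band of points with no pattern:

```python
# Toy data (hypothetical, for illustration only).
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n
ss_xx = sum((xi - x_bar) ** 2 for xi in x)
b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / ss_xx
b0 = y_bar - b1 * x_bar

# Residuals e = y - y_hat, the point estimates of the error terms
residuals = [yi - (b0 + b1 * xi) for xi, yi in zip(x, y)]
# With an intercept in the model, least squares residuals always sum to zero,
# which is a useful sanity check before examining their pattern.
print([round(e, 2) for e in residuals])
```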

  15. 13.10 Some Shortcut Formulas (Optional) • SSxy = Σxiyi − (Σxi)(Σyi) / n • SSxx = Σxi2 − (Σxi)2 / n • SSyy = Σyi2 − (Σyi)2 / n • SSE = SSyy − b1SSxy
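The shortcut (computational) forms avoid computing deviations from the means; on the same made-up data they reproduce the values obtained earlier from the deviation-based definitions:

```python
# Toy data (hypothetical, for illustration only).
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)

# Shortcut forms: only raw sums and sums of squares/products are needed
ss_xx = sum(xi * xi for xi in x) - sum(x) ** 2 / n
ss_yy = sum(yi * yi for yi in y) - sum(y) ** 2 / n
ss_xy = sum(xi * yi for xi, yi in zip(x, y)) - sum(x) * sum(y) / n

b1 = ss_xy / ss_xx
sse = ss_yy - b1 * ss_xy   # shortcut for the sum of squared errors
print(f"SSxx = {ss_xx}, SSxy = {ss_xy:.4f}, SSE = {sse:.4f}")
```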
