
IS 310 Business Statistics CSU Long Beach



Presentation Transcript


  1. IS 310 Business Statistics CSU Long Beach

  2. Simple Linear Regression Often, two variables are related. Examples: Amount of advertising expenses and amount of sales. Daily temperature and daily water consumption. Undergraduate GPA and starting salary of graduates. Weight of automobiles and miles per gallon.

  3. Simple Linear Regression When two variables are related, one can be used to predict the value of the other. Examples: Knowing the amount of advertising expenses, one can predict the amount of sales. Knowing the daily temperature, one can predict the amount of water consumption.

  4. Simple Linear Regression • Managerial decisions often are based on the relationship between two or more variables. • Regression analysis can be used to develop an equation showing how the variables are related. • The variable being predicted is called the dependent variable and is denoted by y. • The variables being used to predict the value of the dependent variable are called the independent variables and are denoted by x.

  5. Simple Linear Regression • Simple linear regression involves one independent variable and one dependent variable. • The relationship between the two variables is approximated by a straight line. • Regression analysis involving two or more independent variables is called multiple regression.

  6. Simple Linear Regression Model • The equation that describes how y is related to x and an error term is called the regression model. • The simple linear regression model is: y = β0 + β1x + ε where: • β0 and β1 are called parameters of the model, • ε is a random variable called the error term.

  7. Simple Linear Regression Equation • The simple linear regression equation is: E(y) = β0 + β1x • Graph of the regression equation is a straight line. • β0 is the y intercept of the regression line. • β1 is the slope of the regression line. • E(y) is the expected value of y for a given x value.

  8. Simple Linear Regression Equation • Positive Linear Relationship [Graph of E(y) vs. x: the regression line slopes upward, with intercept β0 and positive slope β1]

  9. Simple Linear Regression Equation • Negative Linear Relationship [Graph of E(y) vs. x: the regression line slopes downward, with intercept β0 and negative slope β1]

  10. Simple Linear Regression Equation • No Relationship [Graph of E(y) vs. x: the regression line is horizontal, with intercept β0 and slope β1 = 0]

  11. Estimated Simple Linear Regression Equation • The estimated simple linear regression equation is: ŷ = b0 + b1x • ŷ is the estimated value of y for a given x value. • The graph is called the estimated regression line. • b0 is the y intercept of the line. • b1 is the slope of the line.

  12. Estimation Process • Regression model: y = β0 + β1x + ε • Regression equation: E(y) = β0 + β1x • Unknown parameters: β0, β1 • Sample data: (x1, y1), …, (xn, yn) • Estimated regression equation: ŷ = b0 + b1x • Sample statistics: b0, b1 • b0 and b1 provide estimates of β0 and β1.

  13. Least Squares Method • Least Squares Criterion: min Σ(yi − ŷi)² where: yi = observed value of the dependent variable for the ith observation, ŷi = estimated value of the dependent variable for the ith observation.

  14. Least Squares Method • Slope for the Estimated Regression Equation: b1 = Σ(xi − x̄)(yi − ȳ) / Σ(xi − x̄)² where: xi = value of the independent variable for the ith observation, yi = value of the dependent variable for the ith observation, x̄ = mean value of the independent variable, ȳ = mean value of the dependent variable.

  15. Least Squares Method • y-Intercept for the Estimated Regression Equation: b0 = ȳ − b1x̄
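The least squares formulas on slides 14–15 can be sketched in Python. This is a minimal illustration, not part of the original slides; the tiny data set is made up so the result is easy to check by hand.

```python
def least_squares(x, y):
    """Fit y = b0 + b1*x by the least squares method."""
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    # b1 = sum((xi - x_bar)(yi - y_bar)) / sum((xi - x_bar)^2)
    b1 = (sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
          / sum((xi - x_bar) ** 2 for xi in x))
    # b0 = y_bar - b1 * x_bar
    b0 = y_bar - b1 * x_bar
    return b0, b1

# Made-up data lying exactly on y = 2x, so b0 = 0 and b1 = 2
b0, b1 = least_squares([1, 2, 3], [2, 4, 6])
print(b0, b1)  # 0.0 2.0
```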

  16. Simple Linear Regression • Example: Reed Auto Sales Reed Auto periodically has a special week-long sale. As part of the advertising campaign, Reed runs one or more television commercials during the weekend preceding the sale. Data from a sample of 5 previous sales are shown on the next slide.

  17. Simple Linear Regression • Example: Reed Auto Sales

  Number of TV Ads (x):     1   3   2   1   3     Σx = 10
  Number of Cars Sold (y): 14  24  18  17  27     Σy = 100

  18. Estimated Regression Equation • Slope for the Estimated Regression Equation: b1 = Σ(xi − x̄)(yi − ȳ) / Σ(xi − x̄)² = 20/4 = 5 • y-Intercept for the Estimated Regression Equation: b0 = ȳ − b1x̄ = 20 − 5(2) = 10 • Estimated Regression Equation: ŷ = 10 + 5x
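The Reed Auto fit can be verified in a few lines of Python (a sketch added for illustration, not from the original slides; the data come from slide 17, and the result is consistent with slide 26, where SSR/SST = 100/114):

```python
x = [1, 3, 2, 1, 3]        # number of TV ads
y = [14, 24, 18, 17, 27]   # number of cars sold

x_bar = sum(x) / len(x)    # 2
y_bar = sum(y) / len(y)    # 20

# Slope: b1 = sum((xi - x_bar)(yi - y_bar)) / sum((xi - x_bar)^2) = 20/4
b1 = (sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
      / sum((xi - x_bar) ** 2 for xi in x))
# Intercept: b0 = y_bar - b1 * x_bar = 20 - 5(2)
b0 = y_bar - b1 * x_bar

print(b1, b0)  # 5.0 10.0  ->  y-hat = 10 + 5x
```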

  19. Scatter Diagram and Trend Line

  20. Sample Problem Problem # 4 (10th ed., page 553; 11th ed., page 570) Variables are height (independent variable, x) and weight (dependent variable, y) of women swimmers.

   x    y   x − x̄   y − ȳ   (x − x̄)(y − ȳ)   (x − x̄)²
  68  132     3      15           45             9
  64  108    −1      −9            9             1
  62  102    −3     −15           45             9
  65  115     0      −2            0             0
  66  128     1      11           11             1

  Σ(x − x̄)(y − ȳ) = 110,  Σ(x − x̄)² = 20

  b1 = 110/20 = 5.5
  b0 = ȳ − b1x̄ = 117 − 5.5(65) = −240.5

  The regression equation is: ŷ = −240.5 + 5.5x
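The hand computation in the table above can be checked in Python (an added sketch using the height/weight data from the slide, not part of the original deck):

```python
x = [68, 64, 62, 65, 66]        # heights of women swimmers
y = [132, 108, 102, 115, 128]   # weights

x_bar = sum(x) / len(x)         # 65
y_bar = sum(y) / len(y)         # 117

sxy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))  # 110
sxx = sum((xi - x_bar) ** 2 for xi in x)                        # 20

b1 = sxy / sxx                  # 110/20 = 5.5
b0 = y_bar - b1 * x_bar         # 117 - 5.5(65) = -240.5
print(b1, b0)  # 5.5 -240.5
```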

  21. Sample Problem Continued The regression equation is: ŷ = −10.16 + 0.184x For x = 120, ŷ = −10.16 + 0.184(120) = 11.92 The bonus of a vice president whose salary is $120,000 is predicted to be $11,920.

  22. Coefficient of Determination In regression analysis, a natural question arises: how well does the regression equation fit the actual data? This question is answered by a quantity called the “Coefficient of Determination.”

  23. Coefficient of Determination To understand the coefficient of determination, one has to understand the following three quantities:

  SSE = Sum of Squares due to Error      = Σ(y − ŷ)²
  SST = Sum of Squares Total             = Σ(y − ȳ)²
  SSR = Sum of Squares due to Regression = Σ(ŷ − ȳ)²

  24. Coefficient of Determination • Relationship Among SST, SSR, SSE SST = SSR + SSE where: SST = total sum of squares SSR = sum of squares due to regression SSE = sum of squares due to error

  25. Coefficient of Determination • The coefficient of determination is: r2 = SSR/SST where: SSR = sum of squares due to regression SST = total sum of squares

  26. Coefficient of Determination r² = SSR/SST = 100/114 = .8772 The regression relationship is very strong; 87.7% of the variability in the number of cars sold can be explained by the linear relationship between the number of TV ads and the number of cars sold.
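The decomposition SST = SSR + SSE and the r² above can be reproduced in Python for the Reed Auto data (an illustration added here, not from the slides; the fitted equation ŷ = 10 + 5x is the one derived on slide 18):

```python
x = [1, 3, 2, 1, 3]
y = [14, 24, 18, 17, 27]
b0, b1 = 10, 5                      # fitted equation from slide 18

y_hat = [b0 + b1 * xi for xi in x]  # predicted values
y_bar = sum(y) / len(y)             # 20

sse = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))  # error sum of squares
sst = sum((yi - y_bar) ** 2 for yi in y)               # total sum of squares
ssr = sum((yh - y_bar) ** 2 for yh in y_hat)           # regression sum of squares

r2 = ssr / sst
print(sse, ssr, sst, round(r2, 4))  # 14 100 114 0.8772
```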

  27. Sample Correlation Coefficient r_xy = (sign of b1) √(Coefficient of Determination) = (sign of b1) √r² where: b1 = the slope of the estimated regression equation ŷ = b0 + b1x

  28. Sample Correlation Coefficient The sign of b1 in the equation ŷ = 10 + 5x is “+”. r_xy = +√.8772 = +.9366
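The sample correlation coefficient for Reed Auto follows directly from r² and the sign of the slope (a minimal Python sketch added for illustration; the values come from slides 26 and 28):

```python
import math

r2 = 100 / 114       # coefficient of determination from slide 26
b1 = 5               # slope of the estimated equation, positive

# r_xy = (sign of b1) * sqrt(r^2)
r_xy = math.copysign(math.sqrt(r2), b1)
print(round(r_xy, 4))  # 0.9366
```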

  29. Testing for Significance To test for a significant regression relationship, we must conduct a hypothesis test to determine whether the value of β1 is zero. Two tests are commonly used: the t test and the F test. Both the t test and the F test require an estimate of σ², the variance of ε in the regression model.

  30. Testing for Significance • An Estimate of σ² • The mean square error (MSE) provides the estimate of σ², and the notation s² is also used. s² = MSE = SSE/(n − 2) where: SSE = Σ(yi − ŷi)² and n − 2 is the degrees of freedom (two parameters, b0 and b1, are estimated from the data).

  31. Testing for Significance • An Estimate of σ • To estimate σ we take the square root of s². • The resulting s is called the standard error of the estimate.
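For the Reed Auto example, MSE and the standard error of the estimate work out as follows (a sketch added for illustration; SSE = 14 and n = 5 come from the earlier slides):

```python
import math

sse = 14                 # error sum of squares for Reed Auto
n = 5                    # sample size

mse = sse / (n - 2)      # s^2 = MSE = SSE/(n - 2)
s = math.sqrt(mse)       # standard error of the estimate

print(round(mse, 4), round(s, 4))  # 4.6667 2.1602
```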

  32. Testing for Significance: t Test • Hypotheses: H0: β1 = 0, Ha: β1 ≠ 0 • Test Statistic: t = b1/s_b1 where s_b1 = s / √Σ(xi − x̄)² is the estimated standard deviation of b1.

  33. Testing for Significance: t Test • Rejection Rule Reject H0 if p-value < α, or t < −t_(α/2), or t > t_(α/2) where: t_(α/2) is based on a t distribution with n − 2 degrees of freedom

  34. Testing for Significance: t Test 1. Determine the hypotheses: H0: β1 = 0, Ha: β1 ≠ 0. 2. Specify the level of significance: α = .05. 3. Select the test statistic: t = b1/s_b1. 4. State the rejection rule: Reject H0 if p-value < .05 or |t| > 3.182 (with 3 degrees of freedom).

  35. Testing for Significance: t Test 5. Compute the value of the test statistic: t = 4.63. 6. Determine whether to reject H0: with 3 degrees of freedom, t = 4.541 corresponds to an upper-tail area of .01, so the two-tailed p-value for t = 4.63 is less than .02. (Also, t = 4.63 > 3.182.) We can reject H0.
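The full t test for the Reed Auto slope can be carried out in Python with SciPy (an added sketch, not from the slides; it assumes SciPy is available and reuses SSE = 14 and b1 = 5 from the earlier slides):

```python
import math
from scipy import stats

x = [1, 3, 2, 1, 3]
y = [14, 24, 18, 17, 27]
n = len(x)
b1 = 5                              # slope from slide 18
sse = 14                            # error sum of squares

s = math.sqrt(sse / (n - 2))        # standard error of the estimate
x_bar = sum(x) / n
s_b1 = s / math.sqrt(sum((xi - x_bar) ** 2 for xi in x))  # std. dev. of b1

t = b1 / s_b1                       # test statistic
p_value = 2 * stats.t.sf(t, df=n - 2)       # two-tailed p-value
t_crit = stats.t.ppf(0.975, df=n - 2)       # critical value at alpha = .05

print(round(t, 2), round(t_crit, 3))  # 4.63 3.182
```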

  36. Sample Problem Problem # 18 (10th ed., page 564; 11th ed., page 581)

    x      y      ŷ     (y − ŷ)²   (y − ȳ)²   (ŷ − ȳ)²
   2.6   3300   3301         1     122500     121801
   3.4   3600   3766     27556       2500      13456
   3.6   4000   3882     13924     122500      53824
   3.2   3500   3650     22500      22500          0
   3.5   3900   3824      5776      62500      30276
   2.9   3600   3476     15376       2500      30276
  Total                  85133     335000     249633

  37. Sample Problem Continued SSE = 85133 SST = 335000 SSR = 249633 Coefficient of Determination = r² = SSR/SST = 0.745 Correlation Coefficient = r = √r² = 0.86 There is a fairly good fit between the actual monthly salaries and the GPAs of students. There exists a strong relationship between these variables.
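The r² and r values above follow from the sums of squares in the table on slide 36 (a quick Python check added for illustration; note that SSR + SSE = 334766 differs slightly from SST = 335000 because the ŷ values in the table are rounded):

```python
import math

# Sums of squares from the table on slide 36 (GPA vs. monthly salary)
sse = 85133
sst = 335000
ssr = 249633

r2 = ssr / sst          # coefficient of determination
r = math.sqrt(r2)       # correlation coefficient (slope is positive)

print(round(r2, 3), round(r, 2))  # 0.745 0.86
```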

  38. End of Chapter 14
