
Introduction to Multiple Regression: Model with Two Explanatory Variables

This chapter provides an introduction to multiple regression analysis and focuses on the model with two explanatory variables. It discusses the assumptions, estimators, computational procedure, and the interpretation of the results.


Presentation Transcript


  1. Chapter 4 Multiple Regression

  2. 4.1 Introduction • The errors are again due to measurement errors in y and errors in the specification of the relationship between y and the x's. • We make the same assumptions about the error terms u that we made in Chapter 3. These are:

  3. 4.1 Introduction • E(ui) = 0 and var(ui) = σ² for all i. • ui and uj are independent for all i ≠ j. • ui and xj are independent for all i and j. • The ui are normally distributed for all i.

  4. 4.1 Introduction • There are no linear dependencies in the explanatory variables, i.e., none of the explanatory variables can be expressed as an exact linear function of the others. (This assumption will be relaxed in Chapter 7.) Also, it will be assumed that y is a continuous variable. (The case where it is observed as a dummy variable or as a truncated variable will be discussed in Chapter 8.)

  5. 4.2 A Model with Two Explanatory Variables • Consider the model y = α + β1x1 + β2x2 + u (4.1) • The assumptions we have made about the error term u imply that E(u) = 0, cov(x1, u) = 0, and cov(x2, u) = 0.

  6. 4.2 A Model with Two Explanatory Variables • Let α̂, β̂1, and β̂2 be the estimators of α, β1, and β2, respectively. • The sample counterpart of u is the residual û = y − α̂ − β̂1x1 − β̂2x2. • The three equations to determine α̂, β̂1, and β̂2 are obtained by replacing the population assumptions by their sample counterparts:

  7. 4.2 A Model with Two Explanatory Variables • Σûi = 0 (the sample counterpart of E(u) = 0). • Σx1iûi = 0 and Σx2iûi = 0 (the sample counterparts of cov(x1, u) = 0 and cov(x2, u) = 0).

  8. 4.2 A Model with Two Explanatory Variables The Least Squares Method • The least squares method says that we should choose the estimators α̂, β̂1, β̂2 of α, β1, β2 so as to minimize Q = Σ(y − α̂ − β̂1x1 − β̂2x2)². • Differentiate Q with respect to α̂, β̂1, and β̂2 and equate the derivatives to zero.

  9. 4.2 A Model with Two Explanatory Variables • We get Σ(y − α̂ − β̂1x1 − β̂2x2) = 0, Σx1(y − α̂ − β̂1x1 − β̂2x2) = 0, and Σx2(y − α̂ − β̂1x1 − β̂2x2) = 0. • The first of these equations gives α̂ = ȳ − β̂1x̄1 − β̂2x̄2 (4.5).

  10. 4.2 A Model with Two Explanatory Variables • We can simplify these equations by the use of the following notation. • Let us define S11 = Σ(x1 − x̄1)², S22 = Σ(x2 − x̄2)², S12 = Σ(x1 − x̄1)(x2 − x̄2), S1y = Σ(x1 − x̄1)(y − ȳ), S2y = Σ(x2 − x̄2)(y − ȳ), and Syy = Σ(y − ȳ)². • Substituting α̂ from equation (4.5) into the other two equations, we get β̂1S11 + β̂2S12 = S1y (4.7) and β̂1S12 + β̂2S22 = S2y (4.8).

  11. 4.2 A Model with Two Explanatory Variables • Now we can solve these two equations to get β̂1 and β̂2. We get β̂1 = (S22S1y − S12S2y)/Δ and β̂2 = (S11S2y − S12S1y)/Δ, where Δ = S11S22 − S12². • Once we obtain β̂1 and β̂2 we get α̂ from equation (4.5). We have α̂ = ȳ − β̂1x̄1 − β̂2x̄2.

  12. 4.2 A Model with Two Explanatory Variables Thus the computational procedure is as follows: • Obtain all the means: x̄1, x̄2, ȳ. • Obtain all the sums of squares and sums of products: Σx1², Σx1x2, Σx1y, and so on. • Obtain S11, S12, S22, S1y, S2y, and Syy. • Solve equations (4.7) and (4.8) to get β̂1 and β̂2. • Substitute these in equation (4.5) to get α̂.
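To make the computational procedure concrete, here is a minimal sketch in Python (NumPy) that follows the five steps above; the data values are made up purely for illustration.

```python
import numpy as np

# Illustrative data; the numbers are made up for this sketch
x1 = np.array([2.0, 4.0, 5.0, 7.0, 8.0, 10.0])
x2 = np.array([1.0, 3.0, 2.0, 5.0, 6.0, 8.0])
y  = np.array([5.0, 9.0, 10.0, 15.0, 17.0, 21.0])

# Step 1: the means
x1_bar, x2_bar, y_bar = x1.mean(), x2.mean(), y.mean()

# Steps 2-3: sums of squares and products about the means
S11 = np.sum((x1 - x1_bar) ** 2)
S22 = np.sum((x2 - x2_bar) ** 2)
S12 = np.sum((x1 - x1_bar) * (x2 - x2_bar))
S1y = np.sum((x1 - x1_bar) * (y - y_bar))
S2y = np.sum((x2 - x2_bar) * (y - y_bar))

# Step 4: solve equations (4.7) and (4.8) for the slope estimates
delta = S11 * S22 - S12 ** 2
beta1_hat = (S22 * S1y - S12 * S2y) / delta
beta2_hat = (S11 * S2y - S12 * S1y) / delta

# Step 5: equation (4.5) gives the intercept
alpha_hat = y_bar - beta1_hat * x1_bar - beta2_hat * x2_bar

print(alpha_hat, beta1_hat, beta2_hat)
```

As a check, the same estimates can be obtained from a library routine such as numpy.linalg.lstsq applied to the matrix with columns 1, x1, x2.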

  13. 4.2 A Model with Two Explanatory Variables • Result 1: The least squares estimators are unbiased, i.e., E(α̂) = α, E(β̂1) = β1, and E(β̂2) = β2.

  14. 4.2 A Model with Two Explanatory Variables • Result 2: The variances and covariances of the slope estimators are var(β̂1) = σ²S22/Δ = σ²/[S11(1 − r12²)], var(β̂2) = σ²S11/Δ = σ²/[S22(1 − r12²)], and cov(β̂1, β̂2) = −σ²S12/Δ, where r12² = S12²/(S11S22) is the squared correlation between x1 and x2.

  15. 4.2 A Model with Two Explanatory Variables • If σ̂² = Σû²/(n − 3), then σ̂² is an unbiased estimator for σ². • If we substitute σ̂² for σ² in the expressions in result 2, we get the estimated variances and covariances. • The square roots of the estimated variances are called the standard errors (denoted SE). • Then (α̂ − α)/SE(α̂), (β̂1 − β1)/SE(β̂1), and (β̂2 − β2)/SE(β̂2) each have a t-distribution with (n − 3) d.f. • An example.
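The standard errors and t-ratios can be added to the earlier sketch; σ² is replaced by σ̂² = Σû²/(n − 3) as described above. The function below is an illustrative sketch, not the chapter's own code.

```python
import numpy as np

def two_variable_regression(x1, x2, y):
    """OLS for y = alpha + beta1*x1 + beta2*x2 + u, with SEs and t-ratios."""
    n = len(y)
    x1c, x2c, yc = x1 - x1.mean(), x2 - x2.mean(), y - y.mean()
    S11, S22, S12 = np.sum(x1c**2), np.sum(x2c**2), np.sum(x1c * x2c)
    S1y, S2y = np.sum(x1c * yc), np.sum(x2c * yc)

    delta = S11 * S22 - S12**2
    b1 = (S22 * S1y - S12 * S2y) / delta
    b2 = (S11 * S2y - S12 * S1y) / delta
    a = y.mean() - b1 * x1.mean() - b2 * x2.mean()

    resid = y - a - b1 * x1 - b2 * x2
    sigma2_hat = np.sum(resid**2) / (n - 3)    # unbiased estimate of sigma^2

    se_b1 = np.sqrt(sigma2_hat * S22 / delta)  # estimated var(beta1_hat) = sigma2_hat*S22/delta
    se_b2 = np.sqrt(sigma2_hat * S11 / delta)  # estimated var(beta2_hat) = sigma2_hat*S11/delta
    return {"alpha": a,
            "beta1": (b1, se_b1, b1 / se_b1),
            "beta2": (b2, se_b2, b2 / se_b2)}
```

Each returned triple is (estimate, standard error, t-ratio), with the t-ratios referred to a t-distribution with n − 3 d.f.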

  16. 4.2 A Model with Two Explanatory Variables • Note that the higher the value of r12² (other things staying the same), the higher the variances of β̂1 and β̂2. • If r12² is very high, we cannot estimate β1 and β2 with much precision.

  17. 4.2 A Model with Two Explanatory Variables • In the case of simple regression we also defined the following: residual sum of squares = Σû² = Syy − β̂Sxy and explained sum of squares = β̂Sxy.

  18. 4.2 A Model with Two Explanatory Variables • The analogous expressions in multiple regression are residual sum of squares = Σû² = Syy − β̂1S1y − β̂2S2y and explained sum of squares = β̂1S1y + β̂2S2y, so that R²y·12 = (β̂1S1y + β̂2S2y)/Syy.

  19. 4.2 A Model with Two Explanatory Variables • R²y·12 is called the coefficient of multiple determination and its positive square root Ry·12 is called the multiple correlation coefficient. • The first subscript is the explained variable. • The subscripts after the dot are the explanatory variables. • To avoid cumbersome notation we have written 12 instead of x1x2. • Since it is only the x's that have subscripts, there is no confusion in this notation.
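A short sketch of R²y·12 computed from the same quantities, using the explained/total sum-of-squares decomposition given above (again an illustration rather than the chapter's own code).

```python
import numpy as np

def r_squared_y12(x1, x2, y):
    """Coefficient of multiple determination for y regressed on x1 and x2."""
    x1c, x2c, yc = x1 - x1.mean(), x2 - x2.mean(), y - y.mean()
    S11, S22, S12 = np.sum(x1c**2), np.sum(x2c**2), np.sum(x1c * x2c)
    S1y, S2y, Syy = np.sum(x1c * yc), np.sum(x2c * yc), np.sum(yc**2)
    delta = S11 * S22 - S12**2
    b1 = (S22 * S1y - S12 * S2y) / delta
    b2 = (S11 * S2y - S12 * S1y) / delta
    explained = b1 * S1y + b2 * S2y      # explained sum of squares
    return explained / Syy               # R^2 = explained SS / total SS
```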

  20. 4.5 Partial Correlations and Multiple Correlation • If we have the explained variable y and three explanatory variables x1, x2, x3, and r²y1, r²y2, r²y3 are the squares of the simple correlations between y and x1, x2, x3, respectively, then r²y1, r²y2, and r²y3 measure the proportion of the variance in y that x1 alone, x2 alone, or x3 alone explains. • On the other hand, R²y·123 measures the proportion of the variance of y that x1, x2, and x3 together explain. • What is the relationship between simple and multiple correlations?

  21. 4.5 Partial Correlations and Multiple Correlation • We would also like to measure something else. • For instance, how much does x2 explain after x1 is included in the regression equation? • How much does x3 explain after x1 and x2 are included? • These are measured by the partial coefficients of determination r²y2·1 and r²y3·12, respectively. • The variables after the dot are the variables already included.

  22. 4.5 Partial Correlations and Multiple Correlation • With three explanatory variables we have the following partial correlations: ry1·2, ry1·3, ry2·1, ry2·3, ry3·1, and ry3·2. These are called partial correlations of the first order. • We also have three partial correlation coefficients of the second order: ry1·23, ry2·13, and ry3·12. • The variables after the dot are always the variables already included in the regression equation.

  23. 4.5 Partial Correlations and Multiple Correlation • The order of a partial correlation coefficient depends on the number of variables after the dot. • The usual convention is to denote simple and partial correlations by a small r and multiple correlations by a capital R. • For instance, R²y·12, R²y·13, R²y·23, and R²y·123 are all coefficients of multiple determination (their positive square roots are multiple correlation coefficients).
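First-order partial correlations can be computed from the simple correlations with the standard formula sketched below; the second function gives the partial coefficient of determination r²y2·1 as the extra share of variance that x2 explains after x1 is included. Both functions and the example values are illustrative.

```python
import numpy as np

def partial_corr(r_y2, r_y1, r_12):
    """First-order partial correlation r_{y2.1} from the simple correlations."""
    return (r_y2 - r_y1 * r_12) / np.sqrt((1 - r_y1**2) * (1 - r_12**2))

def partial_determination(R2_y12, r2_y1):
    """r^2_{y2.1}: share of the variance left unexplained by x1 that x2 explains."""
    return (R2_y12 - r2_y1) / (1 - r2_y1)

# Illustrative simple correlations: x2 is strongly correlated with y,
# but adds little once x1 is already in the equation
r_y1, r_y2, r_12 = 0.95, 0.90, 0.93
r_y2_1 = partial_corr(r_y2, r_y1, r_12)
print(r_y2_1)                                   # small first-order partial correlation

# Equivalent route via 1 - R^2_{y.12} = (1 - r^2_{y1}) * (1 - r^2_{y2.1})
R2_y12 = 1 - (1 - r_y1**2) * (1 - r_y2_1**2)
print(partial_determination(R2_y12, r_y1**2))   # recovers r_y2_1**2
```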

  24. 4.5 Partial Correlations and Multiple Correlation • Partial correlations are very important in deciding whether or not to include more explanatory variables. • For instance, suppose that we have two explanatory variables x1 and x2, and r²y2 is very high, say 0.95, but r²y2·1 is very low, say 0.01. • What this means is that if x2 alone is used to explain y, it can do a good job.

  25. 4.5 Partial Correlations and Multiple Correlation • But after x1 is included, x2 does not help any more in explaining y; that is, x1 has done the job of x2. • In this case there is no use including x2. • In fact, we can have a situation where, for instance, r²y1 and r²y2 are both very high but r²y1·2 and r²y2·1 are both very low.

  26. 4.5 Partial Correlations and Multiple Correlation • In this case each variable is highly correlated with y but the partial correlations are both very low. • This is called multicollinearity and we will discuss this problem later in Chapter 7. • In this example we can use x1 only or x2 only or some combination of the two as an explanatory variable.

  27. 4.5 Partial Correlations and Multiple Correlation • For instance, suppose that x1 is the amount of skilled labor, x2 the amount of unskilled labor, and y the output. • What the partial correlation coefficients suggest is that the separation of total labor into two components -- skilled and unskilled -- does not help us much in explaining output. • So we might as well use x1 + x2, or total labor, as the explanatory variable.

  28. Assignment • Use the data from the teacher's web site. • Calculate the following three types of correlation: • Multiple correlation • Simple correlation • Partial correlation

  29. 4.9 Omission of Relevant Variables and Inclusion of Irrelevant Variables • Until now we have assumed that the multiple regression equation we are estimating includes all the relevant explanatory variables. • In practice, this is rarely the case. • Sometimes some relevant variables are not included due to oversight or lack of measurements. • At other times some irrelevant variables are included. • What we would like to know is how our inferences change when these problems are present.

  30. 4.9 Omission of Relevant Variables and Inclusion of Irrelevant Variables Omission of Relevant Variables • Let us first consider the omission of relevant variables. Suppose that the true equation is y = β1x1 + β2x2 + u (4.15) • Instead, we omit x2 and estimate the equation y = β1x1 + w. • This will be referred to as the "misspecified model." • The estimate of β1 we get is β̂1* = S1y/S11.

  31. 4.9 Omission of Relevant Variables and Inclusion of Irrelevant Variables • Substituting the expression for y from equation (4.15) in this, we get β̂1* = β1 + β2(S12/S11) + S1u/S11, where S1u = Σ(x1 − x̄1)u. Since E(S1u) = 0, we get E(β̂1*) = β1 + β2b21, where b21 = S12/S11 is the regression coefficient from a regression of x2 on x1.

  32. 4.9 Omission of Relevant Variables and Inclusion of Irrelevant Variables • Thus β̂1* is a biased estimator for β1 and the bias is given by bias = β2b21 = (coefficient of the excluded variable) × (regression coefficient in a regression of the excluded variable on the included variable). • If we denote the estimator for β1 from equation (4.15) by β̂1, the variance of β̂1 is given by var(β̂1) = σ²/[S11(1 − r12²)], where r12² = S12²/(S11S22) is the squared correlation between x1 and x2.
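A small Monte Carlo sketch of the bias formula, with hypothetical coefficients, data, and seed: the average of β̂1* from the regression that omits x2 settles near β1 + β2b21 rather than β1.

```python
import numpy as np

rng = np.random.default_rng(0)
beta1, beta2 = 1.0, 2.0                    # hypothetical true coefficients
n, reps = 200, 2000

# Fixed, correlated regressors (kept the same across replications)
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + rng.normal(scale=0.8, size=n)
x1, x2 = x1 - x1.mean(), x2 - x2.mean()
b21 = np.sum(x1 * x2) / np.sum(x1 ** 2)    # regression coefficient of x2 on x1

estimates = []
for _ in range(reps):
    u = rng.normal(size=n)
    y = beta1 * x1 + beta2 * x2 + u
    yc = y - y.mean()
    estimates.append(np.sum(x1 * yc) / np.sum(x1 ** 2))   # beta1* with x2 omitted

print("average of beta1*:", np.mean(estimates))           # close to beta1 + beta2*b21
print("beta1 + beta2*b21:", beta1 + beta2 * b21)
```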

  33. 4.9 Omission of Relevant Variables and Inclusion of Irrelevant Variables • On the other hand, var(β̂1*) = σ²/S11. • Thus β̂1* is a biased estimator but has a smaller variance than β̂1. • In fact, the variance would be considerably smaller if r12² is high. • However, the estimated standard error need not be smaller for β̂1* than for β̂1.

  34. 4.9 Omission of Relevant Variables and Inclusion of Irrelevant Variables • This is because σ̂², the estimated variance of the error, can be higher in the misspecified model. • It is given by the residual sum of squares divided by the degrees of freedom, and can be higher (or lower) for the misspecified model.

  35. 4.9 Omission of Relevant Variables and Inclusion of Irrelevant Variables Inclusion of Irrelevant Variables • Consider now the case of inclusion of irrelevant variables. Suppose that the true equation is y = β1x1 + u, but we estimate the equation y = β1x1 + β2x2 + v. • The least squares estimators β̂1 and β̂2 from the misspecified equation are given on the next slide.

  36. 4.9 Omission of Relevant Variables and Inclusion of Irrelevant Variables • The least squares estimators β̂1 and β̂2 from the misspecified equation are β̂1 = (S22S1y − S12S2y)/Δ and β̂2 = (S11S2y − S12S1y)/Δ, where Δ = S11S22 − S12², S1y = Σ(x1 − x̄1)(y − ȳ), and so on. • Since the true equation is y = β1x1 + u, we have E(S1y) = β1S11 and E(S2y) = β1S12. • Hence we get E(β̂1) = β1 and E(β̂2) = 0. • Thus we get unbiased estimates for both the parameters.

  37. 4.9 Omission of Relevant Variables and Inclusion of Irrelevant Variables • This result, coupled with the earlier results regarding the bias introduced by the omission of relevant variables, might lead us to believe that it is better to include variables (when in doubt) rather than exclude them. • However, this is not so, because though the inclusion of irrelevant variables has no effect on the bias of the estimators, it does affect the variances.

  38. 4.9 Omission of Relevant Variables and Inclusion of Irrelevant Variables • The variance of β̂1*, the estimator of β1 from the correct equation, is given by var(β̂1*) = σ²/S11. • On the other hand, from the misspecified equation we have var(β̂1) = σ²/[S11(1 − r12²)], where r12 is the correlation between x1 and x2.

  39. 4.9 Omission of Relevant Variables and Inclusion of Irrelevant Variables • Thus var(β̂1) ≥ var(β̂1*) unless r12 = 0. • Hence we will be getting unbiased but inefficient estimates by including the irrelevant variable. • An example: omit or include variables.
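A companion sketch for the irrelevant-variable case (again with made-up data and seed): when the true equation contains only x1, both fits are centered on β1, but the sampling variance from the two-variable fit is larger by roughly the factor 1/(1 − r12²).

```python
import numpy as np

rng = np.random.default_rng(1)
beta1 = 1.0                                 # true model: y = beta1*x1 + u; x2 is irrelevant
n, reps = 200, 2000

x1 = rng.normal(size=n)
x2 = 0.6 * x1 + rng.normal(scale=0.8, size=n)   # correlated with x1
x1, x2 = x1 - x1.mean(), x2 - x2.mean()
S11, S22, S12 = np.sum(x1**2), np.sum(x2**2), np.sum(x1 * x2)
delta = S11 * S22 - S12**2
r12_sq = S12**2 / (S11 * S22)

correct, misspecified = [], []
for _ in range(reps):
    y = beta1 * x1 + rng.normal(size=n)
    yc = y - y.mean()
    S1y, S2y = np.sum(x1 * yc), np.sum(x2 * yc)
    correct.append(S1y / S11)                               # y on x1 only (correct)
    misspecified.append((S22 * S1y - S12 * S2y) / delta)    # y on x1 and x2

print("means (both near beta1):", np.mean(correct), np.mean(misspecified))
print("variance ratio:", np.var(misspecified) / np.var(correct))
print("1 / (1 - r12^2):", 1.0 / (1.0 - r12_sq))
```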
