
Simple and multiple regression analysis in matrix form



  1. Simple and multiple regression analysis in matrix form • Least square • Beta estimation • Simple linear regression • Multiple regression with two predictors • Multiple regression with three predictors • Sum of squares • R2 • Test on b parameters • Covariance matrix of the b • Standard error of the b

  2. Simple and multiple regression analysis in matrix form • Tests on individual predictors • Variance of individual predictors • Correlation between predictors • Standardized matrices • Correlation matrices • Sum of squares in Z • R2 in Z • R2 between independent variables • Standard error of b in Z

  3. Least square Starting from the general model: the method of least squares estimates the beta parameters by minimizing the sum of squares due to error. In fact, if:

  4. Least square You can estimate:
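
The estimation formulas on slides 3-4 were embedded as images in the original deck. A standard reconstruction of the least-squares derivation, in the deck's notation (y = Xb + e), is the following sketch:

    \min_b \; e'e = (y - Xb)'(y - Xb) = y'y - 2b'X'y + b'X'Xb

    \frac{\partial (e'e)}{\partial b} = -2X'y + 2X'Xb = 0
    \;\Rightarrow\; X'Xb = X'y
    \;\Rightarrow\; b = (X'X)^{-1}X'y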

  5. Simple linear regression

  6. Simple linear regression

  7. Simple linear regression

  8. Simple linear regression: intercept and slope
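
The worked example on slides 5-8 was rendered as images. A minimal NumPy sketch of the same computation, with invented x and y data (not from the original slides), might look like this:

    import numpy as np

    # Illustrative data, not taken from the original deck
    x = np.array([1., 2., 3., 4., 5.])
    y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

    # Design matrix: a column of ones (X0) plus the predictor
    X = np.column_stack([np.ones_like(x), x])

    # b = (X'X)^-1 X'y : b[0] is the intercept, b[1] the slope
    b = np.linalg.inv(X.T @ X) @ X.T @ y
    print(b)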

  9. Multiple regression • Similar to the simple case • A single dependent variable (Y) • Two or more independent variables (X) • Multiple correlation (rather than simple) • Estimation by least squares

  10. Multiple regression Simple linear regression (var.: 1 dep., 1 indep.): y = b0 + b1x + e, with intercept b0, slope b1 and error e. Multiple linear regression (var.: 1 dep., 2 indep.): y = b0 + b1x1 + b2x2 + e

  11. Multiple regression matrix form

  12. Multiple regression matrix form

  13. Multiple regression matrix form

  14. Multiple regression matrix form X’X inverse

  15. Multiple regression matrix form
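
The matrices on slides 11-15 were images. A sketch of the two-predictor computation in NumPy, again with invented data, could be:

    import numpy as np

    # Hypothetical data: 5 observations, 2 predictors
    y = np.array([4., 7., 8., 11., 13.])
    X = np.column_stack([np.ones(5),             # X0: intercept column
                         [1., 2., 3., 4., 5.],   # X1
                         [2., 1., 4., 3., 5.]])  # X2

    XtX_inv = np.linalg.inv(X.T @ X)   # the "X'X inverse" of slide 14
    b = XtX_inv @ X.T @ y              # b = (X'X)^-1 X'y
    print(b)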

  16. Multiple regression with three predictors In matrix notation this is expressed briefly as:
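
The equation on this slide was an image; in standard notation, the three-predictor model it presumably showed is:

    y_i = b_0 + b_1 x_{i1} + b_2 x_{i2} + b_3 x_{i3} + e_i, \qquad i = 1, \dots, n

    \mathbf{y} = \mathbf{X}\mathbf{b} + \mathbf{e}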

  17. Multiple regression with three predictors

  18. Matrix form

  19. Matrix form

  20. Matrix form

  21. Matrix form

  22. General scheme

  23. General scheme

  24. Sum squares The least squares method allows one to verify the following equality:

  25. Sum squares Since, in general: it is possible to show that the sum of the squared distances of y from its mean can be decomposed into the sum of squares due to regression and the sum of squares due to error, according to:

  26. Sum squares Note the equivalence of:

  27. Sum squares

  28. Sum squares

  29. Sum squares In summary:
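
A sketch verifying the decomposition numerically, repeating the invented two-predictor data from the earlier sketch so that it runs on its own:

    import numpy as np

    y = np.array([4., 7., 8., 11., 13.])
    X = np.column_stack([np.ones(5), [1., 2., 3., 4., 5.], [2., 1., 4., 3., 5.]])
    b = np.linalg.inv(X.T @ X) @ X.T @ y

    y_hat = X @ b                          # predicted values
    ss_tot = np.sum((y - y.mean()) ** 2)   # total sum of squares
    ss_reg = np.sum((y_hat - y.mean()) ** 2)
    ss_res = np.sum((y - y_hat) ** 2)

    # SStot = SSreg + SSres, up to floating-point rounding
    assert np.isclose(ss_tot, ss_reg + ss_res)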

  30. R2

  31. Adjusted R2YY’ Because the coefficient of determination depends on both the number of observations (n) and the number of independent variables (k), it is convenient to correct it by the degrees of freedom. Adjusted R2YY’ In our example:
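
A numeric sketch of both quantities (same invented data as before; n is the number of observations and k the number of predictors excluding X0):

    import numpy as np

    y = np.array([4., 7., 8., 11., 13.])
    X = np.column_stack([np.ones(5), [1., 2., 3., 4., 5.], [2., 1., 4., 3., 5.]])
    b = np.linalg.inv(X.T @ X) @ X.T @ y

    n, k = X.shape[0], X.shape[1] - 1
    ss_tot = np.sum((y - y.mean()) ** 2)
    ss_res = np.sum((y - X @ b) ** 2)

    r2 = 1 - ss_res / ss_tot                        # coefficient of determination
    r2_adj = 1 - (1 - r2) * (n - 1) / (n - k - 1)   # corrected for degrees of freedom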

  32. Test on b parameters • Once a regression model has been constructed, it may be important to confirm the goodness of fit (R-squared) of the model and the statistical significance of the estimated parameters. Statistical significance can be checked by an F-test of the overall fit, followed by t-tests of individual parameters.

  33. Test on b parameters • You can test the hypothesis that the parameters bi, taken together, differ from 0:

  34. Test on b parameters k = number of columns of the matrix X excluding X0; n = number of observations in y

  35. Test on b parameters k = number of columns of the matrix X excluding X0; n = number of observations in y
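
A sketch of the overall F-test with SciPy, whose stats.f.sf gives the upper-tail probability of the F distribution (data as in the earlier hypothetical example):

    import numpy as np
    from scipy import stats

    y = np.array([4., 7., 8., 11., 13.])
    X = np.column_stack([np.ones(5), [1., 2., 3., 4., 5.], [2., 1., 4., 3., 5.]])
    b = np.linalg.inv(X.T @ X) @ X.T @ y

    n, k = X.shape[0], X.shape[1] - 1
    ss_reg = np.sum((X @ b - y.mean()) ** 2)
    ss_res = np.sum((y - X @ b) ** 2)

    # H0: b1 = ... = bk = 0, F with (k, n-k-1) degrees of freedom
    F = (ss_reg / k) / (ss_res / (n - k - 1))
    p_value = stats.f.sf(F, k, n - k - 1)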

  36. Covariance matrix of the b We denote: An estimate of the covariance matrix of the beta values is given by:

  37. Covariance matrix of the b where the diagonal elements are estimates of the variance of each individual bi

  38. Standard error of the b The standard error of the parameters can be calculated with the following formula: where cii is the diagonal element of the matrix (X’X)-1 corresponding to the parameter bi.
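
A sketch of both formulas with the same invented data:

    import numpy as np

    y = np.array([4., 7., 8., 11., 13.])
    X = np.column_stack([np.ones(5), [1., 2., 3., 4., 5.], [2., 1., 4., 3., 5.]])
    XtX_inv = np.linalg.inv(X.T @ X)
    b = XtX_inv @ X.T @ y

    n, k = X.shape[0], X.shape[1] - 1
    s2 = np.sum((y - X @ b) ** 2) / (n - k - 1)   # error variance estimate

    cov_b = s2 * XtX_inv                  # covariance matrix of the b
    se_b = np.sqrt(np.diag(cov_b))        # se_bi = sqrt(s^2 * cii)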

  39. Standard error of the b Note: when the value of cii is large, the value of sebi grows, indicating that the variable Xi has a high multiple correlation coefficient with the other X variables.

  40. Standard error of the b The standard error of the bi can also be calculated in the following way: where an increase in R2i decreases the denominator of the ratio and, consequently, increases the value of the standard error of the parameter bi.

  41. Tests on individual predictors • With the standard error of measurement associated with each bi you can perform a t-test to verify:

  42. Tests on individual predictors With the standard error of measurement associated with each bi it is also possible to estimate the confidence interval for each parameter:
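
A sketch of the t statistics and 95% confidence intervals, continuing the same invented data (scipy.stats.t supplies tail probabilities and quantiles):

    import numpy as np
    from scipy import stats

    y = np.array([4., 7., 8., 11., 13.])
    X = np.column_stack([np.ones(5), [1., 2., 3., 4., 5.], [2., 1., 4., 3., 5.]])
    XtX_inv = np.linalg.inv(X.T @ X)
    b = XtX_inv @ X.T @ y

    n, k = X.shape[0], X.shape[1] - 1
    s2 = np.sum((y - X @ b) ** 2) / (n - k - 1)
    se_b = np.sqrt(s2 * np.diag(XtX_inv))

    t = b / se_b                                   # H0: bi = 0
    p = 2 * stats.t.sf(np.abs(t), n - k - 1)

    t_crit = stats.t.ppf(0.975, n - k - 1)         # 95% interval
    ci = np.column_stack([b - t_crit * se_b, b + t_crit * se_b])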

  43. Tests on individual predictors In order to conduct a statistical test on the regression coefficients it is necessary to: • Calculate the SSreg for the model containing all the independent variables. • Calculate the SSreg for the model excluding the variable whose significance you want to test (SS-i). • Perform an F-test with numerator equal to the difference SSreg − SS-i, weighted by the difference between the degrees of freedom of the two models, and with denominator SSres / (n − k − 1).

  44. Tests on individual predictors To test, for example, only the weight of the first predictor against the total model, it is necessary to compute a new vector b-i from the matrix X-i, obtained by removing the column belonging to the first predictor. From this the calculation of SS-i follows immediately.

  45. Tests on individual predictors

  46. Tests on individual predictors Similarly we have: The same procedure is followed to test any subset of predictors.

  47. Tests on individual predictors It is interesting to note that this test on a single predictor is equivalent to the t-test of b1 = 0. When the numerator has only one degree of freedom, the following equivalence in fact holds:
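
A sketch of the procedure for the first predictor, which also checks the F = t² equivalence numerically (same invented data as above):

    import numpy as np

    y = np.array([4., 7., 8., 11., 13.])
    X = np.column_stack([np.ones(5), [1., 2., 3., 4., 5.], [2., 1., 4., 3., 5.]])
    XtX_inv = np.linalg.inv(X.T @ X)
    b = XtX_inv @ X.T @ y

    n, k = X.shape[0], X.shape[1] - 1
    ss_reg = np.sum((X @ b - y.mean()) ** 2)
    ss_res = np.sum((y - X @ b) ** 2)

    # Refit without the X1 column and recompute SSreg for the reduced model
    X_red = np.delete(X, 1, axis=1)
    b_red = np.linalg.inv(X_red.T @ X_red) @ X_red.T @ y
    ss_red = np.sum((X_red @ b_red - y.mean()) ** 2)

    F1 = (ss_reg - ss_red) / (ss_res / (n - k - 1))     # one numerator df
    t1 = b[1] / np.sqrt(ss_res / (n - k - 1) * XtX_inv[1, 1])
    assert np.isclose(F1, t1 ** 2)                      # F equals t squared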

  48. Summary table On this occasion, none of the estimated parameters reached statistical significance for the hypothesis bi ≠ 0

  49. Variance of individual predictors Xi Using the matrix X'X we can calculate the variance of each variable Xi.
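
A sketch of how the entries of X'X yield each predictor's variance: with an all-ones X0 column, the first row of X'X holds the sums of each Xi and the diagonal holds the sums of squares (invented data as before):

    import numpy as np

    X = np.column_stack([np.ones(5), [1., 2., 3., 4., 5.], [2., 1., 4., 3., 5.]])
    n = X.shape[0]
    XtX = X.T @ X

    # var(Xi) = (sum(Xi^2) - (sum Xi)^2 / n) / (n - 1)
    var_x = (np.diag(XtX)[1:] - XtX[0, 1:] ** 2 / n) / (n - 1)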
