Statistical Inference on Correlation and Regression • The shortest distance from a point to a vector u is the one that crosses u at 90°.
Statistical Inference on Correlation • Angle between two variables • Relationship between two variables
Statistical Inference on Correlation • The null hypothesis is that there is no correlation between the two variables in the population; in other words, we test whether the two variables are linearly independent. If the null hypothesis is rejected, the two variables are not independent: there is a linear relationship between them.
Statistical Inference on Correlation • Example: In this case, we cannot use the standard normal (Z) distribution. We will use the F-ratio distribution instead.
Statistical Inference on Correlation • Example: The degrees of freedom of the F ratio are df1 = number of variables − 1 and df2 = number of participants − 2.
Statistical Inference on Correlation • Example: Because Fxy > F(0.05, 1, 3) (10.3 > 10.13), we reject H0 and therefore accept H1: there is a linear dependency between the two variables.
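The example above can be sketched in code. This is an illustrative sketch, not the slides' own computation: the sample size N = 5 follows from df2 = N − 2 = 3, but the correlation value r = 0.88 is an assumed number chosen so that F comes out close to the 10.3 in the example; the critical value F(0.05, 1, 3) = 10.13 is taken from the text.

```python
# F test of a correlation coefficient: F = r^2 * (N - 2) / (1 - r^2),
# with df1 = 1 and df2 = N - 2. N and r are assumed example values.

def f_from_r(r, n):
    """F ratio for testing H0: rho = 0 (df1 = 1, df2 = n - 2)."""
    return (r ** 2) * (n - 2) / (1 - r ** 2)

n = 5          # participants (assumed: df2 = n - 2 = 3, as in the example)
r = 0.88       # correlation (assumed; gives F close to the example's 10.3)
f_crit = 10.13 # F(0.05, 1, 3), from the slide

f_xy = f_from_r(r, n)
print(round(f_xy, 2))   # 10.3
print(f_xy > f_crit)    # True -> reject H0
```

Since Fxy exceeds the critical value, H0 is rejected, matching the slide's conclusion.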
Linear regression • We want a functional relationship between two variables, not only a strength of association. • In other words, we want to be able to predict the outcome given a predictor. • Recall: finding the slope and the constant of a line. [Figure: a line in the (x, y) plane through the point (x1, y1); the shortest distance from a point to the vector u is the one that crosses u at 90°.]
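The recalled computation of the slope and the constant of a line can be sketched as follows; the two points used here are made-up values for illustration.

```python
# Minimal sketch: slope (b1) and constant (b0) of the line through
# two points (x1, y1) and (x2, y2). The points are assumed values.

def line_through(x1, y1, x2, y2):
    b1 = (y2 - y1) / (x2 - x1)  # slope: rise over run
    b0 = y1 - b1 * x1           # constant: value of y at x = 0
    return b0, b1

b0, b1 = line_through(1.0, 3.0, 3.0, 7.0)
print(b0, b1)  # 1.0 2.0  -> the line y = 1 + 2x
```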
Linear regression • Regression: y = b0 + b1x + e, where b1 is the slope, b0 the constant, and e the prediction error (residual).
Linear regression • Regression: The formula for the regression coefficients can be derived directly from geometry (true for two variables only). • By substitution, we can isolate the b1 coefficient. • If we generalize to any situation (multiple, multivariate), the coefficients are obtained from the normal equations, b = (X'X)^(-1) X'y.
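As a sketch of the two-variable case, the coefficients can be computed from the usual least-squares formulas b1 = cov(x, y) / var(x) and b0 = mean(y) − b1 · mean(x); the data below are made-up values for illustration.

```python
# Minimal sketch of least-squares coefficients for simple regression,
# using made-up data.

def regression_coefficients(x, y):
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov_xy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / n
    var_x = sum((xi - mx) ** 2 for xi in x) / n
    b1 = cov_xy / var_x   # slope: covariance over predictor variance
    b0 = my - b1 * mx     # constant: forces the line through the means
    return b0, b1

x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]
b0, b1 = regression_coefficients(x, y)
print(round(b0, 2), round(b1, 2))  # 2.2 0.6
```

Note that the line always passes through the point of means (x̄, ȳ), which is exactly what the substitution b0 = ȳ − b1·x̄ expresses.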
Parameters of the linear regression • Equation of prediction: ŷ = b0 + b1x. • If we replace b0 by ȳ − b1·x̄, we obtain ŷ = ȳ + b1(x − x̄).
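A quick numeric check (on made-up data) that the two forms of the prediction equation give the same prediction:

```python
# Sketch checking that y_hat = b0 + b1*x equals
# y_hat = mean(y) + b1*(x - mean(x)) once b0 = mean(y) - b1*mean(x).
# Data and the prediction point are assumed values.

x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]
n = len(x)
mx, my = sum(x) / n, sum(y) / n
b1 = (sum((a - mx) * (b - my) for a, b in zip(x, y))
      / sum((a - mx) ** 2 for a in x))
b0 = my - b1 * mx

x_new = 6.0
pred_1 = b0 + b1 * x_new         # intercept form
pred_2 = my + b1 * (x_new - mx)  # mean-centered form
print(round(pred_1, 6), round(pred_2, 6))  # 5.8 5.8
```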
Note • We know that b1 = cov(x, y) / var(x) and that cov(x, y) = r·sx·sy. • If we replace the covariance, we then obtain b1 = r·(sy/sx).
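The substitution above can be verified numerically; this sketch uses made-up data, with sx and sy denoting the standard deviations and r the correlation.

```python
# Sketch checking that b1 = cov(x, y) / var(x) equals r * (sy / sx)
# on made-up data.
import math

x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]
n = len(x)
mx, my = sum(x) / n, sum(y) / n

cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
var_x = sum((a - mx) ** 2 for a in x) / n
sx = math.sqrt(var_x)
sy = math.sqrt(sum((b - my) ** 2 for b in y) / n)
r = cov / (sx * sy)            # correlation from its definition

b1_cov = cov / var_x           # covariance form of the slope
b1_r = r * (sy / sx)           # correlation form of the slope
print(round(b1_cov, 6) == round(b1_r, 6))  # True
```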