Inference for Regression
Chapter 14: The Regression Model

The LSRL equation is
ŷ = a + bx
a and b are statistics; they are computed from sample data. Therefore, we use them to estimate the true y-intercept, α, and the true slope, β:
μy = α + βx
a and b from the LSRL are unbiased estimators of the parameters α and β
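As a quick sketch of where a and b come from, the least-squares formulas can be computed directly (the data here are hypothetical, purely for illustration):

```python
import math

# Hypothetical sample data (x = explanatory variable, y = response)
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]
n = len(x)

x_bar = sum(x) / n
y_bar = sum(y) / n

# Least-squares slope and intercept:
#   b = Σ(x − x̄)(y − ȳ) / Σ(x − x̄)²,   a = ȳ − b·x̄
b = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / \
    sum((xi - x_bar) ** 2 for xi in x)
a = y_bar - b * x_bar
print(f"ŷ = {a:.2f} + {b:.2f}x")
```

Here a and b estimate α and β; with a different sample, a and b would come out slightly different, which is why inference about β is needed.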
Testing Hypotheses of No Linear Relationship
The null hypothesis
H0: β = 0. A slope of 0 (a horizontal line) means there is no linear relationship between x and y: the mean of y does not change at all when x changes.
The alternative hypothesis
Ha: β ≠ 0, or Ha: β < 0, or Ha: β > 0
[Figures: a negative-slope line and a positive-slope line]
When testing the hypothesis of no linear relationship, a t statistic is calculated. In fact, the t statistic is just the standardized version of the least-squares regression slope b, so we use Table C to look up t and find the P-value. The P-value is still interpreted the same way.
b is the slope from the least-squares regression line, and SEb is the standard error of the least-squares slope b:

t = b / SEb

σ, the standard deviation of y about the LSRL (the typical size of the scatter about the line), is estimated by

s = √( Σ(y − ŷ)² / (n − 2) )
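The t statistic and s can be sketched from these definitions in plain Python (hypothetical data, for illustration only; the P-value would then come from Table C with n − 2 degrees of freedom):

```python
import math

# Hypothetical sample data
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]
n = len(x)

x_bar = sum(x) / n
y_bar = sum(y) / n
sxx = sum((xi - x_bar) ** 2 for xi in x)

# LSRL slope and intercept
b = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / sxx
a = y_bar - b * x_bar

# s estimates σ, the standard deviation of y about the LSRL
residuals = [yi - (a + b * xi) for xi, yi in zip(x, y)]
s = math.sqrt(sum(r ** 2 for r in residuals) / (n - 2))

se_b = s / math.sqrt(sxx)   # standard error of the slope b
t = b / se_b                # compare to Table C with n - 2 = 3 df
print(f"t = {t:.2f} with {n - 2} degrees of freedom")
```

A t this large (far out in the tail of the t distribution with 3 df) would give a tiny P-value, strong evidence against H0: β = 0.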
Minitab output always gives the two-sided P-value for Ha: β ≠ 0.
If you want the P-value for a one-sided alternative,
Ha: β > 0 or Ha: β < 0, just divide the P-value from Minitab by 2.
The calculator gives you your choice of alternative.
β is the most important parameter in a regression problem because it is the rate of change of the mean response as the explanatory variable, x, increases.
CI for β:  b ± t*·SEb   (estimate ± t*·SEb)

SEb = s / √( Σ(x − x̄)² )

t* is looked up on Table C with n − 2 degrees of freedom.
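Putting the CI formula together, a minimal sketch (hypothetical data; t* = 3.182 is the 95% critical value from Table C with n − 2 = 3 degrees of freedom):

```python
import math

# Hypothetical sample data
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]
n = len(x)

x_bar = sum(x) / n
y_bar = sum(y) / n
sxx = sum((xi - x_bar) ** 2 for xi in x)
b = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / sxx
a = y_bar - b * x_bar

residuals = [yi - (a + b * xi) for xi, yi in zip(x, y)]
s = math.sqrt(sum(r ** 2 for r in residuals) / (n - 2))
se_b = s / math.sqrt(sxx)

t_star = 3.182  # Table C: 95% confidence, n - 2 = 3 df
lo, hi = b - t_star * se_b, b + t_star * se_b
print(f"95% CI for β: ({lo:.3f}, {hi:.3f})")
```

Because the interval does not contain 0, it agrees with rejecting H0: β = 0 at the 5% level.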
You can also find a CI for α the same way, using SEa:
a ± t*·SEa   (not commonly used)