1. Applied Econometrics William Greene
Department of Economics
Stern School of Business
2. Applied Econometrics 9. Hypothesis Tests: Analytics
and an Application
3. General Linear Hypothesis Hypothesis Testing
Analytical framework: y = Xβ + ε
Hypothesis: Rβ - q = 0,
J linear restrictions
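As a concrete sketch (with purely illustrative numbers, not from the application later in the deck), a pair of linear restrictions can be stacked into the matrix R and vector q with numpy:

```python
import numpy as np

# Hypothetical example with K = 4 coefficients and J = 2 restrictions:
#   beta2 + beta3 = 1   and   beta4 = 0
R = np.array([[0.0, 1.0, 1.0, 0.0],
              [0.0, 0.0, 0.0, 1.0]])
q = np.array([1.0, 0.0])

b = np.array([0.2, 0.6, 0.4, 0.0])  # illustrative estimates
m = R @ b - q                       # discrepancy vector: zero iff b satisfies R b = q
```

Here b happens to satisfy both restrictions exactly, so m is the zero vector; in an actual sample the unrestricted estimates would not.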
4. Procedures Classical procedures based on two algebraically equivalent frameworks
Distance measure: Is Rb - q = m 'far' from zero? (In a finite sample it will not be identically zero.)
Fit measure: Imposing Rβ - q = 0 on the regression must degrade the fit (raise e'e or lower R²). Some of that degradation is purely algebraic.
Is the loss of fit 'large'?
In both cases, if the hypothesis is true, the answer should be no.
5. Test Statistics Forming test statistics:
For distance measures, use a Wald-type distance measure, W = m′[Est.Var(m)]⁻¹m with m = Rb - q; dividing by J gives the F form, F = W/J.
For the fit measures, use a normalized measure of the loss of fit:
        [(R² - R*²)/J]
F = -----------------------
     [(1 - R²)/(n - K)]
where R*² is the R² of the restricted regression.
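The fit-based F statistic translates directly into code; `f_from_r2` is a hypothetical helper name and the numbers are illustrative:

```python
def f_from_r2(r2_u, r2_r, J, n, K):
    """F statistic for J restrictions, comparing unrestricted R-squared r2_u
    with restricted R-squared r2_r, in a regression with n observations and
    K unrestricted coefficients."""
    return ((r2_u - r2_r) / J) / ((1.0 - r2_u) / (n - K))

# Example: R-squared drops from 0.90 to 0.80 when J = 2 restrictions are
# imposed, with n = 100 and K = 5:
F = f_from_r2(0.90, 0.80, J=2, n=100, K=5)   # (0.10/2) / (0.10/95) = 47.5
```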
6. Testing Procedures How to determine if the statistic is 'large.'
Need a 'null distribution.' Logic of the Neyman-Pearson methodology.
If the hypothesis is true, then the statistic will have a certain distribution. This tells you how likely certain values are, and in particular, if the hypothesis is true, then 'large values' will be unlikely.
If the observed value is too large, conclude that the assumed distribution must be incorrect and the hypothesis should be rejected.
For the linear regression model, the distribution of the statistic is F with J and n-K degrees of freedom.
7. Distribution Under the Null
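The slide's figure is not reproduced here, but the null distribution F[J, n-K] can be sketched by simulation. This is only a Monte Carlo approximation; a table or `scipy.stats.f.ppf` gives the exact critical value.

```python
import numpy as np

# Approximate the 5% critical value of F[2, 95] under the null by simulation.
rng = np.random.default_rng(42)
J, df2 = 2, 95
draws = rng.f(J, df2, size=200_000)   # draws from F[J, 95]
crit_95 = np.quantile(draws, 0.95)    # approximate 5% critical value (~3.09)

# The fit-loss example statistic of 47.5 is far beyond the critical value,
# so the restrictions would be rejected.
reject = 47.5 > crit_95
```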
8. Particular Cases Some particular cases:
One coefficient equals a particular value:
F = [(b - value) / standard error of b]² = square of the familiar t ratio.
Relationship: F[1, d.f.] = t²[d.f.]
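Numerically, the t ratio and the one-restriction F statistic are the same test (illustrative numbers):

```python
# Illustrative values: estimate, hypothesized value, and standard error.
b, hypothesized, se = 1.30, 1.00, 0.15

t = (b - hypothesized) / se   # familiar t ratio, here 2.0
F = t ** 2                    # the F[1, d.f.] statistic, here 4.0
```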
A linear function of coefficients equals a particular value
        (linear function of coefficients - value)²
F = ----------------------------------------------------
             Variance of the linear function
Note the squared distance in the numerator.
Suppose the linear function is Σk wk bk.
Its variance is Σk Σl wk wl Cov[bk, bl].
This is the Wald statistic, and also the square of the somewhat familiar t statistic.
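A minimal two-coefficient sketch of this statistic, with illustrative numbers for the estimates and their covariance matrix:

```python
import numpy as np

# Test the linear function w'b = b1 - b2 against a hypothesized value of 0.
w = np.array([1.0, -1.0])       # weights of the linear function
b = np.array([0.8, 0.5])        # illustrative coefficient estimates
V = np.array([[0.04, 0.01],     # illustrative estimated Cov[b]
              [0.01, 0.09]])
value = 0.0

numerator = (w @ b - value) ** 2   # squared distance from the hypothesized value
denominator = w @ V @ w            # variance: sum_k sum_l w_k w_l Cov[b_k, b_l]
F = numerator / denominator
```

The quadratic form `w @ V @ w` is exactly the double sum over Cov[bk, bl] on the slide.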
Several linear functions: use the Wald or F statistic. Loss-of-fit measures may be easier to compute.
9. Application: Cost Function
10. Regression Results
11. Price Homogeneity: Only Price Ratios Matter β2 + β3 + β4 = 1; β7 + β8 + β9 = 0.
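The two homogeneity restrictions fit the Rβ = q framework directly. The ordering of the coefficients as β1 through β9 below is an assumption about how the cost-function regressors are arranged, made only for illustration:

```python
import numpy as np

# Rows of R encode:  beta2 + beta3 + beta4 = 1   and   beta7 + beta8 + beta9 = 0
# (0-based columns 1,2,3 and 6,7,8 under the assumed ordering beta1..beta9).
R = np.zeros((2, 9))
R[0, [1, 2, 3]] = 1.0
R[1, [6, 7, 8]] = 1.0
q = np.array([1.0, 0.0])
```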
12. Imposing the Restrictions
13. Wald Test of the Restrictions: Chi squared = J*F
14. Test of Homotheticity: Cross-Product Terms = 0
15. Testing Fundamentals - I SIZE of a test = Probability it will incorrectly reject a “true” null hypothesis.
This is the probability of a Type I error.
16. A Simulation Experiment
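The deck's simulation results are not reproduced here, but a minimal version of such an experiment can be sketched, assuming a simple two-sided t test of a zero mean (the actual experiment in the slides may differ): generate data with the null true, count rejections, and check that the rate is near the nominal 5% size.

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 50, 5000
crit = 1.96                        # approximate 5% two-sided critical value
                                   # (normal approximation to the t distribution)
rejections = 0
for _ in range(reps):
    y = rng.normal(0.0, 1.0, n)    # H0: mean = 0 is TRUE in every replication
    t = y.mean() / (y.std(ddof=1) / np.sqrt(n))
    rejections += abs(t) > crit

size = rejections / reps           # empirical size: should be close to 0.05
```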
17. Simulation Results
18. Testing Fundamentals - II POWER of a test = the probability that it will correctly reject a "false" null hypothesis.
This is 1 - the probability of a Type II error.
The power of a test depends on the specific alternative.
19. Power of a Test
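Power can also be illustrated by simulation. Under the same hypothetical two-sided t test of a zero mean, the rejection rate when the null is true is the size, and the rejection rate when it is false is the power at that alternative; the dependence on the specific alternative is visible by varying the true mean.

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps, crit = 50, 4000, 1.96

def rejection_rate(true_mean):
    """Fraction of replications in which H0: mean = 0 is rejected."""
    rej = 0
    for _ in range(reps):
        y = rng.normal(true_mean, 1.0, n)
        t = y.mean() / (y.std(ddof=1) / np.sqrt(n))
        rej += abs(t) > crit
    return rej / reps

size = rejection_rate(0.0)    # null true: rate near 0.05 (the size)
power = rejection_rate(0.5)   # null false by 0.5: rate is the power there
```

Moving the true mean further from zero raises the power toward 1; at alternatives very close to zero, the power is barely above the size.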