This lecture focuses on the analysis of curvilinear relationships through various polynomial functions such as quadratic and cubic. It discusses the importance of centering and orthogonality, which involves adjusting scores by subtracting the mean. Additionally, it addresses nonlinear transformations including logarithmic, exponential, and square root transformations to manage issues like heteroscedasticity and skewed data distributions. Techniques such as the arcsin transformation and logit transformation for binary outcomes are also explored, providing a comprehensive overview for effective data analysis.
LECTURE 6 CURVILINEAR RELATIONSHIPS AND TRANSFORMATIONS
POWER POLYNOMIALS • Types of curve functions of interest: • Quadratic: y = b0 + b1x + b2x² • Cubic: y = b0 + b1x + b2x² + b3x³
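As a rough illustration of fitting these power polynomials, here is a minimal NumPy sketch; the simulated data and the variable names (x, y) are assumptions for the example, not from the lecture.

```python
# A minimal sketch of fitting quadratic and cubic power polynomials with NumPy.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 100)
y = 2 + 1.5 * x - 0.3 * x**2 + rng.normal(scale=1.0, size=x.size)  # true curve is quadratic

# np.polyfit returns coefficients from highest to lowest power
quad_coefs = np.polyfit(x, y, deg=2)   # quadratic: b2, b1, b0
cubic_coefs = np.polyfit(x, y, deg=3)  # cubic:     b3, b2, b1, b0

quad_fit = np.polyval(quad_coefs, x)   # fitted values on the quadratic curve
cubic_fit = np.polyval(cubic_coefs, x) # fitted values on the cubic curve
print("quadratic coefficients (b2, b1, b0):", np.round(quad_coefs, 3))
print("cubic coefficients (b3, b2, b1, b0):", np.round(cubic_coefs, 3))
```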
Fig. 9.2: Graphs of planned orthogonal contrasts (C1, C2, C3) for four interval treatments [figure not reproduced]
Centering and Orthogonality • Grand mean centering: subtract the mean from each score, x1 = x − mean(x) • Create the quadratic variable by squaring the centered variable: x2 = (x1)² • This makes r(x1, x2) << r(x, x²), reducing multicollinearity among the predictors
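A minimal sketch of grand mean centering and its effect on the correlation between the linear and quadratic terms; the simulated scores are illustrative only.

```python
# Show how grand-mean centering reduces the correlation between x and its square.
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(loc=50, scale=10, size=500)   # raw scores with a mean well above zero

x_sq_raw = x**2                  # square of the raw score
x_centered = x - x.mean()        # grand-mean centering: x1 = x - mean(x)
x_sq_centered = x_centered**2    # quadratic term from the centered score: x2 = x1**2

r_raw = np.corrcoef(x, x_sq_raw)[0, 1]
r_centered = np.corrcoef(x_centered, x_sq_centered)[0, 1]
print(f"r(x, x^2)  = {r_raw:.3f}")       # typically close to 1
print(f"r(x1, x2)  = {r_centered:.3f}")  # much smaller after centering
```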
Nonlinear Transformations • Lowess (locally weighted scatterplot smoothing) lines help diagnose conditions such as heteroscedasticity, nonnormal distribution of errors, and nonlinear relationships between variables
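A small diagnostic sketch, assuming the statsmodels package is available: a lowess line through the residuals from a straight-line fit can reveal the nonlinearity that the fit itself hides. The data are simulated for illustration.

```python
# Lowess smoothing of residuals vs. fitted values as a nonlinearity diagnostic.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
x = np.linspace(1, 10, 200)
y = np.log(x) + rng.normal(scale=0.1, size=x.size)   # truly nonlinear relation

# Fit a straight line, then smooth the residuals with lowess
X = sm.add_constant(x)
fit = sm.OLS(y, X).fit()
smoothed = sm.nonparametric.lowess(fit.resid, fit.fittedvalues, frac=0.5)

# A clearly curved lowess trend in the residuals signals a misspecified (linear) model
print(smoothed[:5])   # columns: fitted value, smoothed residual
```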
Nonlinear Transformations • Exponential relations: y = e^(−x) • Use a logarithmic transformation: log(y) = −x, which is linear in x [plots: y against x, and log(y) against x]
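A short sketch of the logarithmic transformation applied to a simulated exponential relation; the noise model and variable names are assumptions for the example.

```python
# Log transformation linearizes an exponential relation of the form y = exp(-x).
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(0.1, 5, 200)
y = np.exp(-x) * rng.lognormal(mean=0.0, sigma=0.05, size=x.size)  # multiplicative noise

log_y = np.log(y)   # after the transform, log(y) is approximately -x (linear)

# The correlation with x is much closer to -1 on the log scale
print(f"corr(x, y)     = {np.corrcoef(x, y)[0, 1]:.3f}")
print(f"corr(x, log y) = {np.corrcoef(x, log_y)[0, 1]:.3f}")
```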
Heteroscedastic, Skewed Data • Linear model y = b1x + b0 with heteroscedastic, positively skewed scores • Square root transformation: x′ = √x [plots: distributions pr(x) and pr(√x)]
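A short sketch of the square root transformation applied to simulated Poisson-like counts, whose variance grows with the mean; the data setup is illustrative, not from the lecture.

```python
# Square root transformation of positively skewed, heteroscedastic counts.
import numpy as np

rng = np.random.default_rng(4)
x = np.linspace(1, 20, 500)
counts = rng.poisson(lam=x)      # variance grows with the mean: heteroscedastic, skewed

counts_sqrt = np.sqrt(counts)    # x' = sqrt(x) roughly stabilizes the variance

# Compare the spread of raw and transformed values in the low vs. high range of x
low, high = x < 5, x > 15
print(f"raw  sd (low, high): {counts[low].std():.2f}, {counts[high].std():.2f}")
print(f"sqrt sd (low, high): {counts_sqrt[low].std():.2f}, {counts_sqrt[high].std():.2f}")
```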
Arcsin transformation • Linearizes proportion variables (e.g., the proportion answering a particular item correctly, predicted by total test score) • p′ = 2·arcsin(√p) [plots: P(item i = 1) against test score x, and 2·arcsin(√p) against test score x]
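A short sketch of the arcsine transformation of proportions, written here as 2·arcsin(√p); the item-response setup and names are illustrative assumptions.

```python
# Arcsine transformation of a proportion predicted by a test score.
import numpy as np

rng = np.random.default_rng(5)
test_score = np.linspace(-3, 3, 300)
p_correct = 1 / (1 + np.exp(-test_score))      # probability of answering the item correctly

p_arcsine = 2 * np.arcsin(np.sqrt(p_correct))  # transformed proportion, range 0 to pi

# The transformed values typically relate more nearly linearly to the test score
print(f"corr(score, p)       = {np.corrcoef(test_score, p_correct)[0, 1]:.3f}")
print(f"corr(score, arcsine) = {np.corrcoef(test_score, p_arcsine)[0, 1]:.3f}")
```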
Logit transformation • Use when the outcome is the probability or proportion p of a binary result (e.g., pass-fail) • L = ½ ln(p / (1 − p))
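A short sketch of the logit transformation as defined on the slide; the helper name logit_half and the example proportions are illustrative.

```python
# Logit transformation L = 0.5 * ln(p / (1 - p)) applied to sample proportions.
import numpy as np

def logit_half(p):
    """Half-logit of a proportion p; undefined at exactly 0 or 1."""
    p = np.asarray(p, dtype=float)
    return 0.5 * np.log(p / (1 - p))

pass_rates = np.array([0.05, 0.25, 0.50, 0.75, 0.95])
print(logit_half(pass_rates))   # symmetric around 0; stretches values near 0 and 1
```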