
Cross-lagged Panel Regression, ANOVA in Latent Framework, and Specification of Unconditional LGC

Presentation Transcript


    1. Cross-lagged Panel Regression, ANOVA in Latent Framework, and Specification of Unconditional LGC

    2. The simplest model for examining latent change is a confirmatory factor analysis. In this model, we have measured one construct on four occasions, and at each occasion the construct is measured with the same three indicators. To ensure that we have measured the construct identically across time, we constrain the lambda paths for each indicator to be invariant across time. In addition, the residual variances of each indicator (the same ones) are correlated across time, and because the measure-specific variances will be the same for each indicator across time, we can constrain these residuals to have the same value. If there is no latent change, each of the correlations among the occasion factors should be 1.0; any estimate less than this reflects latent change.
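
    In equation form, this measurement model can be sketched as follows (the symbols are my own shorthand, not taken from the slides): indicator j at occasion t for person i loads on that occasion's factor with a loading and a residual variance that are held equal across time, and the same indicator's residuals are allowed to correlate across occasions.

        \begin{align*}
        y_{jti} &= \tau_j + \lambda_j\,\eta_{ti} + \epsilon_{jti}, \qquad j = 1, 2, 3;\ t = 1, \dots, 4,\\
        &\text{with } \lambda_j \text{ and } \operatorname{Var}(\epsilon_{jti}) \text{ constrained equal across } t,\\
        &\operatorname{Cov}(\epsilon_{jti},\, \epsilon_{jt'i}) \neq 0 \text{ for the same indicator at different occasions, and}\\
        &\operatorname{Corr}(\eta_{ti},\, \eta_{t'i}) = 1 \text{ implied when there is no latent change.}
        \end{align*}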

    3. A Longitudinal Simplex Structure: Assume that latent change is .8 from one occasion to the next. If this is true, change separated by two time points will be greater than change between adjacent occasions, because the construct has changed twice. Using Wright's tracing rules, we know that the implied association is multiplicative: if occasions one time point apart correlate .8, occasions two time points apart correlate .64, and so on. If change is simple and linear, we should expect this relationship. This is known as a simplex structure.
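
    Worked through with Wright's tracing rules, using the slide's .8 example (the arithmetic is mine):

        \begin{align*}
        \operatorname{Corr}(\eta_1, \eta_2) &= .8\\
        \operatorname{Corr}(\eta_1, \eta_3) &= .8 \times .8 = .64\\
        \operatorname{Corr}(\eta_1, \eta_4) &= .8 \times .8 \times .8 = .512
        \end{align*}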

    4. A simplex solution is represented like this.

    5. When modeling latent change, we often want to model the latent means as well as the latent variances. Assuming a simplex structure with a 7-point increase in the latent means at each occasion, we should expect something like this.
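
    Under a constant 7-point increase, the implied latent mean structure is simply (my notation):

        \begin{align*}
        \mu_{\eta_1} = \mu, \qquad \mu_{\eta_2} = \mu + 7, \qquad \mu_{\eta_3} = \mu + 14, \qquad \mu_{\eta_4} = \mu + 21.
        \end{align*}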

    6. Motivation for Multilevel Modeling: A way to describe how developmental processes unfold over time (unconditional models) and what variables influence the course of development (conditional models). The assumption is that change for each individual on the phenomenon under study is systematically related to the passage of time.

    7. Planning: You need to know something about the phenomenon under study. Because the focus is on the form (shape) of the change, more time points are needed; a rule of thumb is that 4-5 time points are adequate for a linear trajectory. "Good planning prior to data collection can save a lot of work and data analysis headaches when using LGC methodology."

    8. Features of the Model: Latent variables (circles) are estimated from observed variables (squares); three kinds of statistics are produced -- "loadings" or regression weights from indicators to factors, residuals of the indicators, and residuals of the latent factors. The parameters of interest are the means of the latent factors (intercept and slope). The slope and intercept latent factors are allowed to covary. A fixed-in-time covariate can be included to adjust the initial levels.

    9. The models just presented use the structural equation framework that we have already worked through. There is another way of representing latent change that uses a somewhat different framework which, conceptually, is very much like repeated-measures ANOVA. In such models, we represent change as a latent variable that can be modeled. In the figure here, the change latent variable 'causes' change in each of the first-order latent variables.

    10. Two pieces of information are necessary to model change: a latent intercept from which each time point can vary, and a latent slope that carries the information about change. The beta weights from the intercept to each of the occasions represent the amount of variance in each time point that is associated with a constant, and the beta weights from the slope provide information about the change over time. This is a conceptual model, and it is obviously not identified: how can you determine what is constant and what is change in this model?
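
    The usual answer is to fix the loadings rather than estimate them. A sketch of the standard unconditional linear LGC specification for four occasions follows; the basis coefficients 0, 1, 2, 3 are one common choice for a linear trend, and the notation is mine rather than the slide's.

        \begin{align*}
        y_{ti} &= 1 \cdot \alpha_i + \lambda_t\,\beta_i + \epsilon_{ti}, \qquad \lambda_t = 0, 1, 2, 3 \text{ for } t = 1, \dots, 4,\\
        \alpha_i &= \mu_\alpha + \zeta_{\alpha i}, \qquad \beta_i = \mu_\beta + \zeta_{\beta i}, \qquad \operatorname{Cov}(\zeta_{\alpha i},\, \zeta_{\beta i}) \text{ freely estimated.}
        \end{align*}

    Fixing the intercept loadings to 1 and the slope loadings to known basis values is what separates the constant part from the change part, and the latent means and variances of the intercept and slope are the parameters of interest described on slide 8.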

    17. Multilevel Scientific Questions: What is a person's rate of change in (Y) school well-being? Does this rate differ over time among persons? Can we explain these differences in rates (and/or intercepts)? ["Growth Models"]

    18. Importance of Multilevel Models: They take into account the dependence among observations that arises from the hierarchical structure of the data. Implications of ignoring the hierarchical structure: upwardly biased parameter estimates, inappropriate standard errors, and inflated test statistics.

    19. Why not OLS? OLS can incorporate only one random source at a time. In a within-group OLS approach, the two stages are: within-group, y = intercept + b·x + error; between-group, b = intercept + B·X + error. The error from the first equation does not carry into the second equation.
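
    The same two stages, and the mixed model they imply, can be written compactly as follows (my notation, not the slide's); the point is that the combined multilevel model retains both random sources, whereas the two-stage OLS approach discards the first-stage error:

        \begin{align*}
        \text{Level 1 (within group } j\text{):} \quad & y_{ij} = \beta_{0j} + \beta_{1j} x_{ij} + r_{ij}\\
        \text{Level 2 (between groups):} \quad & \beta_{0j} = \gamma_{00} + u_{0j}, \qquad \beta_{1j} = \gamma_{10} + u_{1j}\\
        \text{Combined:} \quad & y_{ij} = \gamma_{00} + \gamma_{10} x_{ij} + u_{0j} + u_{1j} x_{ij} + r_{ij}
        \end{align*}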

    20. Baseline Considerations: Scores should be at least interval level. The trend we are interested in cannot be observed directly; therefore, in the SEM sense, each time point is an "indicator" of the "latent trend." Keep scores in their original metric (don't standardize).

    21. Standardization: Desirable at level 2 (the highest level), where it eliminates the influence of differences in the variances of measures on parameter estimates. Undesirable at level 1, because standardization eliminates important information (e.g., means), and if you standardize in terms of the grand mean, the variances are confounded. For scaled scores, if you have highly divergent question response scales, you may need to standardize items first; the risk is artificially stabilizing scale-score variances. Better: 4+ item interval scales, all with the same type of response scale.

    22. Centering Options: Centering can be very important at level 1; the options are zero (raw scores), group-mean, and grand-mean centering. Centering at level 2 (or the highest level) usually does not make much difference. Keep in mind how changes in centering change the meaning of the coefficients.
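
    A small illustration of the three options in Python/pandas; the data and the column names ("school", "x") are hypothetical, just to show how each choice shifts the predictor and, with it, the meaning of the intercept:

        import pandas as pd

        # Toy level-1 data: pupils (rows) nested in schools, with a predictor x.
        df = pd.DataFrame({
            "school": ["A", "A", "B", "B", "B"],
            "x":      [2.0, 4.0, 1.0, 3.0, 5.0],
        })

        # Zero (raw) centering: the intercept is the expected outcome when x == 0.
        df["x_raw"] = df["x"]

        # Grand-mean centering: the intercept is the expected outcome at the overall mean of x.
        df["x_grand"] = df["x"] - df["x"].mean()

        # Group-mean centering: the intercept is the expected outcome at each school's
        # own mean of x; between-school differences in x are removed from the predictor.
        df["x_group"] = df["x"] - df.groupby("school")["x"].transform("mean")

        print(df)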

    23. Process - I: First, run a totally unconditional model. This estimates the variance components and may indicate where 'the action is' (review the covariance matrix). Build models up, rather than building them down.
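
    A minimal sketch of the totally unconditional (unconditional means) model in Python with statsmodels; the simulated data frame and the column names "wellbeing" and "pupil" are hypothetical stand-ins for the actual outcome and grouping variable:

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        # Toy long-format data: 30 pupils, each measured on 4 occasions
        # (a hypothetical stand-in for the school well-being data).
        rng = np.random.default_rng(0)
        pupils = np.repeat(np.arange(30), 4)
        wellbeing = 50 + rng.normal(0, 3, 30).repeat(4) + rng.normal(0, 2, 120)
        df = pd.DataFrame({"pupil": pupils, "wellbeing": wellbeing})

        # Totally unconditional model: intercept only, with a random intercept per
        # pupil; it partitions the outcome variance into between- and within-pupil parts.
        result = smf.mixedlm("wellbeing ~ 1", data=df, groups=df["pupil"]).fit()
        print(result.summary())

        # "Where the action is": intraclass correlation from the variance components.
        between = result.cov_re.iloc[0, 0]   # random-intercept (level-2) variance
        within = result.scale                # residual (level-1) variance
        print("ICC =", between / (between + within))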

    24. Process - II: The level-1 variance accounted for may be unstable (unreliable) when there are multiple variables at level 1. Keep in mind the possibility of comparing fixed effects when designing models. Use fit statistics to trim the model as needed.

    25. Stoolmiller Trajectory Equation

    26. Curran & Bollen

    27. Singer

    28. Duncan et al.

    29. Walls, Jung and Schwartz

    30. General Linear Mixed Model
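
    For reference, the standard matrix form of the general linear mixed model is (my notation; the slide's own formulation is not reproduced in this transcript):

        \begin{align*}
        \mathbf{y} = \mathbf{X}\boldsymbol{\beta} + \mathbf{Z}\mathbf{u} + \boldsymbol{\varepsilon},
        \qquad \mathbf{u} \sim N(\mathbf{0}, \mathbf{G}),
        \qquad \boldsymbol{\varepsilon} \sim N(\mathbf{0}, \mathbf{R}),
        \end{align*}

    where X is the fixed-effects design matrix, Z the random-effects design matrix, G the covariance matrix of the random effects, and R the covariance matrix of the residuals.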

    31. School Well-being

    32. SAS PROGRAMS

    33. Unconditional Means Output
