
EVALUATING YOUR RESEARCH DESIGN



Presentation Transcript


  1. EVALUATING YOUR RESEARCH DESIGN EDRS 5305 EDUCATIONAL RESEARCH & STATISTICS

  2. Campbell and Stanley (1966) Two general criteria of research designs: Internal validity External validity

  3. INTERNAL VALIDITY • Definition: refers to the extent to which the changes observed in the DV are caused by the IV.

  4. Internal Validity • Questions of internal validity cannot be answered positively unless the design provides adequate control of extraneous variables. • Internal validity is essentially a problem of control. • Anything that contributes to the control of a research design contributes to its internal validity.

  5. Internal Validity • History: specific events or conditions, other than the treatment, may occur between the first and second measurements of the participants and produce changes in the DV. • Maturation: processes that operate within the participants simply as a function of the passage of time.

  6. Internal Validity • Pretesting: exposure to a pretest may affect participants’ performance on a 2nd test, regardless of the IV. • Measuring instruments: changes in the measuring instruments, in the scorers, or in the observers used may produce changes in the obtained measures.

  7. Internal Validity • Statistical regression: If groups are selected on the basis of extreme scores, statistical regression may operate to produce an effect that could be mistakenly interpreted as an experimental effect.
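Regression to the mean can be made concrete with a short simulation (a hypothetical sketch; the population parameters, seed, and 5% cutoff are illustrative, not from the slides). Selecting a group on extreme pretest scores produces an apparent "improvement" at posttest even though no treatment is ever applied:

```python
import random

random.seed(42)

# Simulate a population where each observed score = stable true score + noise.
# No treatment is applied at any point.
N = 10_000
true_scores = [random.gauss(100, 10) for _ in range(N)]
pretest = [t + random.gauss(0, 10) for t in true_scores]
posttest = [t + random.gauss(0, 10) for t in true_scores]

# Select an "extreme" group: the lowest-scoring 5% on the pretest.
cutoff = sorted(pretest)[int(0.05 * N)]
extreme = [i for i in range(N) if pretest[i] <= cutoff]

pre_mean = sum(pretest[i] for i in extreme) / len(extreme)
post_mean = sum(posttest[i] for i in extreme) / len(extreme)

print(f"extreme group pretest mean:  {pre_mean:.1f}")
print(f"extreme group posttest mean: {post_mean:.1f}")
# The posttest mean drifts back toward the population mean (100) with no
# treatment at all -- a researcher could mistake this for a treatment effect.
```

Because the group was chosen partly for having unusually bad luck (noise) on the pretest, its posttest scores rise on average regardless of any intervention.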

  8. Internal Validity • Differential selection of participants: important differences may exist between the groups before the IV is applied. • Experimental mortality: occurs when there is a differential loss of respondents from the comparison groups.

  9. Internal Validity • Selection-maturation interaction: Some of these internal validity threats may interact. Frequently arises when volunteers are compared with nonvolunteers.

  10. Internal Validity • Implementation: sometimes the way the IV is implemented threatens internal validity (e.g., the experimenter bias effect). • Participants’ attitudes: the Hawthorne effect (participants respond to the attention they receive); the John Henry effect (control-group participants exert extra effort to compete).

  11. Controlling for Threats to Internal Validity • Random assignment • Randomized matching: match participants on as many variables as possible, then randomly assign one member of each pair to the IV (treatment) condition; the other goes to the control group.
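Both procedures can be sketched in a few lines (a hypothetical example; the pool size, seed, and use of a pretest score as the matching variable are illustrative):

```python
import random

random.seed(7)

# Hypothetical participant pool with a pretest score we want balanced.
participants = [{"id": i, "pretest": random.gauss(100, 15)} for i in range(20)]

# --- Random assignment: every participant has an equal chance of either group.
pool = participants[:]
random.shuffle(pool)
treatment, control = pool[:10], pool[10:]

# --- Randomized matching: rank on the matching variable, pair adjacent
# participants, then randomly assign one member of each pair to treatment.
ranked = sorted(participants, key=lambda p: p["pretest"])
m_treatment, m_control = [], []
for a, b in zip(ranked[0::2], ranked[1::2]):
    first, second = random.sample([a, b], 2)  # random order within the pair
    m_treatment.append(first)
    m_control.append(second)

def group_mean(grp):
    return sum(p["pretest"] for p in grp) / len(grp)

print(f"matched treatment mean: {group_mean(m_treatment):.1f}")
print(f"matched control mean:   {group_mean(m_control):.1f}")
```

Matching before randomizing guarantees the two groups are nearly identical on the matching variable, while the coin flip within each pair preserves the protection that random assignment gives against unmeasured differences.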

  12. • Homogeneous selection: select samples that are as similar as possible on some extraneous variable (e.g., IQ, age). • Building variables into the design: include the extraneous variable as one of the IVs examined (e.g., gender). • Analysis of covariance: remove the portion of performance that is systematically related to an extraneous variable. • Using participants as their own controls: participants serve in each of the experimental conditions, one at a time.
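The analysis-of-covariance idea, removing the part of the DV that is predictable from an extraneous variable, can be illustrated with simulated data (a hypothetical sketch: the groups, the IQ covariate, and the built-in true effect of +5 are all made up so the adjustment can be checked):

```python
import random
import statistics

random.seed(1)

# Hypothetical data: two intact groups that differ on a covariate (IQ).
# Score depends on IQ plus a genuine treatment effect of +5 for the
# treated group, plus noise.
def make_group(n, iq_mean, effect):
    rows = []
    for _ in range(n):
        iq = random.gauss(iq_mean, 10)
        score = 20 + 0.5 * iq + effect + random.gauss(0, 3)
        rows.append((iq, score))
    return rows

treated = make_group(200, 110, effect=5.0)   # brighter group got the treatment
control = make_group(200, 100, effect=0.0)

# Pooled within-group slope of score on IQ (the ANCOVA adjustment factor).
def demeaned(rows):
    mi = statistics.mean(r[0] for r in rows)
    ms = statistics.mean(r[1] for r in rows)
    return [(iq - mi, s - ms) for iq, s in rows]

pooled = demeaned(treated) + demeaned(control)
slope = sum(x * y for x, y in pooled) / sum(x * x for x, _ in pooled)

grand_iq = statistics.mean(r[0] for r in treated + control)

def adjusted_mean(rows):
    mi = statistics.mean(r[0] for r in rows)
    ms = statistics.mean(r[1] for r in rows)
    return ms - slope * (mi - grand_iq)  # shift to a common covariate level

raw_diff = (statistics.mean(s for _, s in treated)
            - statistics.mean(s for _, s in control))
adj_diff = adjusted_mean(treated) - adjusted_mean(control)
print(f"raw difference:      {raw_diff:.1f}")   # inflated by the IQ gap
print(f"adjusted difference: {adj_diff:.1f}")   # near the true effect of 5
```

The raw group difference mixes the treatment effect with the preexisting IQ gap; after adjusting both groups to the same covariate level, the remaining difference is close to the true effect that was built into the simulation.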

  13. External Validity of Research Designs • Refers to the generalizability or representativeness of the findings. • The question addressed is: To what groups, settings, experimental variables, and measurement variables can these findings be generalized?

  14. Types of External Validity • Population external validity: identifying the population to which results may be generalizable. • Ecological external validity: concerned with generalizing experimental effects to other environmental conditions (i.e., settings).

  15. Types of External Validity • External validity of operations: concerned with how well the operational definitions and the experimental procedures represent the constructs of interest. Would the same relationships be found if a different researcher used different operations (i.e., measures) in investigating the same question?
