
Fundamentals of Quantitative Research Design



Presentation Transcript


  1. Fundamentals of Quantitative Research Design Research Process and Design Spring 2006 Class #3

  2. Today’s objectives • Introduce issues fundamental to research design • Variables and variability • Design validity • Learn how to critique research articles - using Antonio et al. as an example (pull out handout and article) • Apply knowledge of internal and external validity • Work on purpose statements and research questions/hypotheses for research project Research Process and Design (Umbach)

  3. Research Process (M & S, p. 11) • Select a general problem • Conduct literature review (preliminary search, later expanded; exhaustive review) • Select specific problem, research question, or hypothesis • Decide design and methodology • Collect data • Analyze and present data (statistical tables; integrative diagrams) • Interpret findings • State conclusion/generalization about problem Research Process and Design (Umbach)

  4. Problem Formulation in Quantitative Research • Phrased as statements, questions or hypotheses • Provides identification of population, variables, and logic of problem • Presents logic of constructs, variables, and operational definitions Research Process and Design (Umbach)

  5. Construct • Complex abstraction not directly observable • e.g., motivation, meta-cognition, self-concept, aptitude, etc. • Derived from theory • Expresses idea behind a set of particulars • Can combine several variables into meaningful patterns Research Process and Design (Umbach)

  6. Critique of Antonio et al. • Identify the hypotheses tested (if any) and discuss how the author(s) derived the hypotheses from theory. (Is the deduction logical?) • Describe briefly the theory (if any) on which the author(s) have based the study. Research Process and Design (Umbach)

  7. Variable • An event, category, behavior, or attribute • Composed of attributes or levels that express a construct • Each variable is a separate and distinct phenomenon • Two types based on what is measured • Categorical variable—groups observations into attributes (categories) • Continuous (measured) variable—can assume an infinite number of values within a range Research Process and Design (Umbach)
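
A minimal Python sketch (with made-up data) of the two variable types above: a categorical variable summarized by counting its attributes, and a continuous variable summarized over its range of values.

```python
import pandas as pd

# Hypothetical student data: one categorical and one continuous variable
df = pd.DataFrame({
    "gender": ["M", "F", "F", "M", "F"],           # categorical: a small set of attributes
    "test_score": [72.5, 88.0, 91.3, 65.8, 79.1],  # continuous: any value within a range
})

# Categorical variables group observations into attributes (categories)
print(df["gender"].value_counts())

# Continuous variables can take many values, so they are summarized numerically
print(df["test_score"].describe())
```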

  8. Research Variable Types • Independent—comes first—influences or predicts • Also called manipulated or experimental variable • Can be an attribute • Antecedent • Dependent—comes second—is affected or predicted by independent variable • Consequence Research Process and Design (Umbach)

  9. Other Research Variable Types • Control—type of IV that is measured because it may influence the dependent variable. Its effect is accounted for by including it in the statistical analysis. • Confounding (extraneous)—not measured but may have an influence on DV Research Process and Design (Umbach)

  10. [Diagram: Treatment A/B (independent variable), Achievement (dependent variable), and Intelligence/ability (control or confounding variable)] Research Process and Design (Umbach)

  11. Critique of Antonio et al. • Identify the independent and dependent variables. Research Process and Design (Umbach)

  12. Sources of Variability • Variability: the extent to which observations of something take on different values • Gender: males and females • Achievement: test scores ranging from 0 – 100 • Attitudes toward school: negative to positive Research Process and Design (Umbach)
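
A short Python sketch (illustrative scores, not from any study) of how variability in a continuous variable such as achievement can be quantified.

```python
import numpy as np

# Hypothetical achievement scores on a 0-100 test
scores = np.array([55, 62, 70, 71, 75, 78, 80, 85, 90, 95])

# Variability: the extent to which observations take on different values
print("Range:             ", scores.max() - scores.min())
print("Sample variance:   ", scores.var(ddof=1))
print("Standard deviation:", scores.std(ddof=1))
```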

  13. Controlling Variability through Design • MAXMINCON Principle • Maximize experimental variation • Design measures that provide sufficient variability • Sample to provide sufficient variability • Design interventions to be very different Research Process and Design (Umbach)

  14. Minimize Error Variation • Error--sampling and measurement error and other kinds of random events that make it difficult to show relationships • Use measures with high reliability • Aggregate individual scores into group scores • Use large samples • Assure standardization in implementing the intervention in an experiment Research Process and Design (Umbach)
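
A simulation sketch (assumed normal scores, arbitrary parameters) of why large samples minimize error variation: the spread of sample means shrinks as sample size grows, making true relationships easier to detect.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated population of test scores (mean 75, SD 10)
population = rng.normal(loc=75, scale=10, size=100_000)

# The sample mean varies less from sample to sample as n increases,
# which is one reason larger samples reduce error variation
for n in (10, 100, 1000):
    sample_means = [rng.choice(population, size=n).mean() for _ in range(500)]
    print(f"n = {n:4d}   SD of sample means: {np.std(sample_means):.2f}")
```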

  15. Control Extraneous (Confounding) Variation • Factors that affect relationships directly rather than in a random way--caused by variables that are outside the experiment • Make potential confounding variables constant • Use random assignment; matching if random assignment is not possible • Build possible confounding variable into the design as another independent variable • Use statistical adjustment procedures to help control the effect of confounding variables Research Process and Design (Umbach)
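
A hedged sketch of two of these strategies, using simulated data and assuming the statsmodels package is available: random assignment balances a potential confound (ability) across groups in expectation, and statistical adjustment includes that confound as another predictor so the treatment effect is estimated with it held constant.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf  # assumed available for the regression sketch

rng = np.random.default_rng(1)
n = 200

ability = rng.normal(100, 15, n)        # potential confounding variable
treatment = rng.integers(0, 2, n)       # random assignment: 0 = control, 1 = experimental
achievement = 50 + 5 * treatment + 0.3 * ability + rng.normal(0, 5, n)

df = pd.DataFrame({"achievement": achievement,
                   "treatment": treatment,
                   "ability": ability})

# Statistical adjustment: build the confound into the model as another predictor
model = smf.ols("achievement ~ treatment + ability", data=df).fit()
print(model.params)  # treatment coefficient is estimated with ability held constant
```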

  16. Design Validity • Extent to which the results of an experiment match the reality of the world • Types • Construct validity • Statistical conclusion validity • Internal validity • External validity Research Process and Design (Umbach)

  17. Construct Validity • Judgment about the extent to which interventions and measured variables actually represent theoretical constructs • Closely related to generalizability • Threats (Shadish, Cook and Campbell, 2002) • Inadequate explication of the constructs • Mono-operation bias • Mono-method bias Research Process and Design (Umbach)

  18. Statistical Conclusion Validity • Appropriate use of statistical tests to determine if relationships are actual relationships • Threats (Shadish, Cook and Campbell, 2002) • Low statistical power • Violated assumptions of statistical tests • Fishing and error rate problem • Unreliability of measures • Restriction of range • Unreliability of treatment implementation • Extraneous variance in the experimental setting Research Process and Design (Umbach)
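
A rough illustration of the low-power threat, using a normal-approximation power calculation for a two-group comparison and a hypothetical effect size of 0.4 standard deviations.

```python
from scipy.stats import norm

# Approximate power of a two-sample test via the normal approximation,
# given a standardized effect size d and n subjects per group
def approx_power(d, n_per_group, alpha=0.05):
    z_crit = norm.ppf(1 - alpha / 2)
    noncentrality = abs(d) * (n_per_group / 2) ** 0.5
    return norm.cdf(noncentrality - z_crit)

# Low statistical power: small samples can easily miss a real, modest effect
for n in (20, 50, 100, 200):
    print(f"n per group = {n:3d}   power ~ {approx_power(0.4, n):.2f}")
```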

  19. Internal Validity • Achieved when the study’s design effectively controls possible sources of error, so that those sources are not related to the study’s results Research Process and Design (Umbach)

  20. Threats to Internal Validity (Shadish, Cook and Campbell, 2002) • History: extraneous incidents or events that affect the results • Selection • A difference between or among groups usually as a result of non-random assignment to groups • Relates to the manner by which subjects were assigned to groups, not how they were selected from the population Research Process and Design (Umbach)

  21. Threats to Internal Validity • Statistical regression • Movement of extremely unusual scores (high or low) to the average • Pretesting • Having taken a pretest influences the results • Instrumentation • Data collection procedures and/or instruments affect the results through low validity or reliability, floor or ceiling effects, etc. • Attrition • Differential loss of subjects from each group Research Process and Design (Umbach)
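
A small simulation (arbitrary means and error variances) of the statistical regression threat: subjects selected for extreme pretest scores tend to score closer to the mean on a posttest even when nothing about them has changed.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10_000

# Pretest and posttest share a true-ability component plus independent measurement error
true_ability = rng.normal(75, 8, n)
pretest = true_ability + rng.normal(0, 6, n)
posttest = true_ability + rng.normal(0, 6, n)

# Select the most extreme pretest scorers (top 5%)
extreme = pretest > np.percentile(pretest, 95)

print("Extreme group, pretest mean: ", round(pretest[extreme].mean(), 1))
print("Extreme group, posttest mean:", round(posttest[extreme].mean(), 1))  # drifts toward the mean
print("Overall mean:                ", round(pretest.mean(), 1))
```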

  22. Threats to Internal Validity • Maturation • Changes in the subjects that affect performance on the dependent variable • e.g., preschool children’s academic development will be affected by their maturation over the course of a year • Diffusion of treatment • Subjects in both the control and experimental groups are exposed to the experimental treatment Research Process and Design (Umbach)

  23. Threats to Internal Validity • Experimenter effect • Deliberate and unintentional influences (positive and negative) that the researcher has on the subjects • Treatment replications • Independence of observations • A particular concern in education where “classes” rather than individuals receive treatments Research Process and Design (Umbach)

  24. Threats to Internal Validity • Subject effects • Changes in the subjects’ behavior (positive or negative) in response to the research situation • Initiated by the subjects themselves • e.g., John Henry effect – control group tries to outperform experimental group Research Process and Design (Umbach)

  25. External Validity • Generalizability of results to other people, settings, and times • Types • Population • Ecological Research Process and Design (Umbach)

  26. Population External Validity • The extent to which the results are generalizable to and across populations • To means from one population to another population (e.g., fifth graders to sixth graders) • Across means from one subgroup in a population to another subgroup in that population (e.g., across males and females within the population) Research Process and Design (Umbach)

  27. Ecological External Validity • The extent to which the results are generalizable to similar conditions • Settings, environments, times • Factors • Multiple treatment interference • Physical surroundings • Time of day or year • Pretest or posttest sensitization • Effects due to experimenter or treatment • Hawthorne effect Research Process and Design (Umbach)

  28. EXERCISE: Identify threats to validity in the studies provided • Threats to Internal Validity: History • Selection • Statistical regression • Pretesting • Instrumentation • Subject attrition • Maturation • Diffusion of treatment • Experimenter effects • Subject effects • Treatment replications • Threats to External Validity: Population external validity • Ecological external validity Research Process and Design (Umbach)

  29. Critique of Antonio et al. • Describe the treatment (if any) and indicate whether the author(s) provide sufficient information to replicate the study. If there is insufficient information to replicate the study, what additional information do you think the authors should supply? • Identify the study design (e.g., experimental, quasi-experimental, ex post facto, correlational) and explain why it is the design you say it is. Research Process and Design (Umbach)

  30. Critique of Antonio et al. • Discuss the threats to internal and external validity not controlled for by the design. Be specific when naming these threats and clearly identify those that are, in your judgment, the most serious. • In what ways, if any, might the researcher(s) have made this study stronger? (e.g., how could the internal and external validity have been increased?) • Given the study as it is, how well do the data and results support the conclusions drawn? (i.e., to what extent are the researchers justified in making the conclusions they do?) Research Process and Design (Umbach)

  31. Elements of a purpose statement • Words to signal purpose statement (e.g., purpose, intent, objective) • Identification of a theory, model, conceptual framework to test • Identify independent and dependent variables as well as control variables • Use words to connect IV and DV (e.g., relationship between x and y, comparison of group A and group B, effect of x on y) • Mention design and participants Research Process and Design (Umbach)

  32. Things to consider when formulating hypotheses or research questions • Identify IVs and DVs • Based on theory • Use one or the other Research Process and Design (Umbach)

  33. Research questions • Questions—simple and direct • Descriptive—typically asks “what is” and implies a survey research design • e.g., What is the current dropout rate in Louisiana? • Relationship—implies a correlational design • e.g., What is the relationship between math attitude and math achievement? • Difference—implies a comparison • e.g., Is there a difference in the effectiveness of graded and non-graded homework? Research Process and Design (Umbach)
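
A brief sketch of the relationship-type question above, assuming simulated attitude and achievement scores and the scipy package; a correlational design would report an association like the one computed here.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(3)

# Hypothetical data for "What is the relationship between math attitude and math achievement?"
attitude = rng.normal(3.5, 0.8, 120)                      # 1-5 attitude scale
achievement = 40 + 10 * attitude + rng.normal(0, 8, 120)  # simulated positive relationship

r, p = pearsonr(attitude, achievement)
print(f"r = {r:.2f}, p = {p:.3f}")
```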

  34. Research Hypotheses • Null – No relationship or no difference exists between groups on a variable. • Alternative – based on previous research, the researcher makes a prediction about the outcome of the study • Directional • Nondirectional • Should be testable, verifiable • Concise and lucid Research Process and Design (Umbach)
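
A minimal sketch of testing a null hypothesis against nondirectional and directional alternatives, using simulated homework data and scipy's two-sample t test (the one-sided option assumes scipy 1.6 or later).

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(4)

# Hypothetical achievement scores for graded- and non-graded-homework groups
graded = rng.normal(78, 10, 60)
non_graded = rng.normal(74, 10, 60)

# Null hypothesis: no difference between the groups
# Nondirectional (two-tailed) alternative: the groups differ in either direction
t, p_two = ttest_ind(graded, non_graded)
print(f"two-tailed p = {p_two:.3f}")

# Directional (one-tailed) alternative: the graded group scores higher
t, p_one = ttest_ind(graded, non_graded, alternative="greater")
print(f"one-tailed p = {p_one:.3f}")
```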

  35. Specific Research Question or Hypothesis • Does the specific research question or hypothesis state concisely what is to be determined? • Does the level of specificity indicate that the question or hypothesis is researchable? Do the variables seem amenable to operational definitions? • Is the logic clear? Are the variables identified? • Does the research question or hypothesis indicate a framework for reporting results? Research Process and Design (Umbach)

  36. Group project • Use Creswell’s purpose statement script (p. 96-97) to begin crafting the purpose statement. • Try to develop one research question or hypothesis. Research Process and Design (Umbach)

  37. For next week… • Understanding data (Scales of measurement, measures of central tendency, measures of variability) • Writing the literature review • DUE: 2-page prospectus (What? Why? How? Who?) • READ: Creswell Ch. 2; Jaeger Chs. 1-3 Research Process and Design (Umbach)
