
Common Designs and Quality Issues in Quantitative Research


Presentation Transcript


  1. Common Designs and Quality Issues in Quantitative Research Research Methods and Statistics

  2. Intended Learning Outcomes • To familiarise yourself with the different types of quantitative research designs commonly used in occupational psychology research • To understand the concepts of validity and reliability and why these are important to consider when designing research studies

  3. What is Research Design “A design specifies the logical structure of a research project and the plan that will be followed in the execution. It determines whether a study is capable of obtaining an answer to the research question in a manner consistent with the appropriate research methodology and the theoretical and philosophical perspectives underlying the study.” (Sim & Wright, 2000: 27)

  4. Elements of Research Designs • phenomena/variables to be researched • how will these phenomena/variables be measured? (what method/technique?) • who/where will the data be collected from? • when will the data be collected? • what type of data will I have as a result? • what will be the consequences of this for data analysis?


  6. Common Designs • Group differences • Relationships between variables: • correlations • regression models • Surveys / questionnaires • Time series • Other designs

  7. Group Differences • Intervention group: measured PRE and POST intervention • Control group: measured PRE and POST (no intervention) e.g. to determine the effect of a training intervention on scores
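
A minimal sketch of how such a pre/post control-group design might be analysed, assuming the simplest approach (an independent-samples t-test on the gain scores, post minus pre); the scores below are invented purely for illustration, and a mixed ANOVA or ANCOVA would be common alternatives.

```python
# Sketch: analysing a pre/post intervention-vs-control design via gain scores.
# All numbers are invented for illustration.
import numpy as np
from scipy import stats

intervention_pre  = np.array([52, 48, 60, 55, 49, 58, 51, 57])
intervention_post = np.array([61, 55, 66, 63, 54, 65, 59, 64])
control_pre       = np.array([50, 53, 47, 59, 56, 52, 49, 55])
control_post      = np.array([51, 54, 49, 60, 57, 52, 50, 56])

# Gain scores remove each participant's pre-intervention baseline
gain_intervention = intervention_post - intervention_pre
gain_control      = control_post - control_pre

# Independent-samples t-test on the gains (Welch's version, unequal variances)
t, p = stats.ttest_ind(gain_intervention, gain_control, equal_var=False)
print(f"t = {t:.2f}, p = {p:.4f}")
```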

  8. Group Differences Designs - Variations • No control group • More than two groups • More than one outcome measure • No time element • More than two time points • Etc.

  9. Relationships between Variables • Bivariate relationships • each participant is measured on two variables (either both categorical, or both ordinal or above) • Regression models • based on linear correlations • several predictor variables and one outcome variable

  10. Bivariate Relationships – Categorical Data e.g. to find out whether the proportion of pupils with reading difficulties differs between public and private schools
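
A sketch of how a categorical bivariate question like this is commonly tested, using a chi-square test of independence on the contingency table; the counts are invented for illustration only.

```python
# Sketch: chi-square test of independence on a 2x2 contingency table (invented counts).
import numpy as np
from scipy.stats import chi2_contingency

#                 reading difficulty   no difficulty
table = np.array([[30, 170],    # public schools
                  [12, 188]])   # private schools

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.4f}")
```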

  11. Bivariate Relationships – Ordinal, Interval or Ratio Data e.g. to find out how the amount of TV viewing is correlated with academic performance
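
For ordinal or interval/ratio data the bivariate relationship is usually summarised with a correlation coefficient. The sketch below uses invented TV-viewing and exam-score data: Pearson's r for interval/ratio data, Spearman's rho as the rank-based alternative for ordinal data.

```python
# Sketch: correlating hours of TV viewing with academic performance (invented data).
import numpy as np
from scipy import stats

tv_hours   = np.array([1, 3, 2, 5, 4, 6, 0, 7, 3, 5])
exam_score = np.array([78, 70, 74, 60, 65, 58, 82, 55, 68, 62])

r, p_r = stats.pearsonr(tv_hours, exam_score)        # interval/ratio data
rho, p_rho = stats.spearmanr(tv_hours, exam_score)   # ordinal (rank-based) alternative
print(f"Pearson r = {r:.2f} (p = {p_r:.4f}); Spearman rho = {rho:.2f} (p = {p_rho:.4f})")
```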

  12. Regression Predictor variables: PREVIOUS SAT SCORE, FREE MEALS, TV VIEWING, ATTENDANCE RECORD, GENDER → Outcome variable: ACADEMIC PERFORMANCE e.g. which are the best predictors of academic performance? e.g. which are the best predictors of whether a child will get a statement of educational needs?
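
The two examples imply two different regression models: ordinary least squares for the continuous outcome (academic performance) and logistic regression for the binary outcome (whether a child gets a statement). The sketch below uses statsmodels with invented data and variable names chosen only to mirror the slide.

```python
# Sketch: linear and logistic regression with several predictors (invented data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "prev_sat":   rng.normal(500, 80, n),
    "free_meals": rng.integers(0, 2, n),     # 1 = eligible for free school meals
    "tv_hours":   rng.uniform(0, 6, n),
    "attendance": rng.uniform(0.7, 1.0, n),
    "gender":     rng.integers(0, 2, n),
})
df["performance"] = (0.1 * df["prev_sat"] - 3 * df["tv_hours"]
                     + 40 * df["attendance"] + rng.normal(0, 10, n))
df["statement"] = (rng.uniform(0, 1, n) < 0.15).astype(int)   # binary outcome

# Which predictors best explain academic performance? (OLS)
ols = smf.ols("performance ~ prev_sat + free_meals + tv_hours + attendance + gender",
              data=df).fit()
print(ols.summary())

# Which predictors best explain getting a statement? (logistic regression)
logit = smf.logit("statement ~ prev_sat + free_meals + tv_hours + attendance + gender",
                  data=df).fit()
print(logit.summary())
```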

  13. Surveys / Questionnaires • May be used: • as an outcome measure (evaluation) • to describe (the attitudes of) a particular group – SURVEY • Surveys can be used to check for: • differences between groups • relationships between variables

  14. Time Series • multiple data points (50+) of recorded data • useful for evaluation when a trend and/or seasonality are present
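
When 50+ regularly spaced data points are available, trend and seasonality can be separated out before evaluating an intervention. A sketch using statsmodels' seasonal_decompose on an invented monthly series (5 years of data with an upward trend and a yearly cycle):

```python
# Sketch: decomposing a monthly series (invented) into trend, seasonal and residual parts.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

rng = np.random.default_rng(1)
months = pd.date_range("2015-01-01", periods=60, freq="MS")       # 60 monthly points
values = (100 + 0.5 * np.arange(60)                               # upward trend
          + 10 * np.sin(2 * np.pi * np.arange(60) / 12)           # yearly seasonality
          + rng.normal(0, 3, 60))                                 # noise
series = pd.Series(values, index=months)

result = seasonal_decompose(series, model="additive", period=12)
print(result.trend.dropna().head())
print(result.seasonal.head(12))
```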

  15. Other Designs • Single case designs (e.g. A-B-A-B phases, alternating baseline A and intervention B)
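
A small sketch, assuming an A-B-A-B reversal design with invented session scores, of the simplest summary a single-case analysis might start from: comparing the mean level of the outcome across baseline and intervention phases.

```python
# Sketch: mean level per phase in an A-B-A-B single-case design (invented data).
import numpy as np

scores = {
    "A1": np.array([4, 5, 4, 3, 5]),   # baseline
    "B1": np.array([7, 8, 8, 9, 8]),   # intervention
    "A2": np.array([5, 4, 5, 4, 4]),   # withdrawal (return to baseline)
    "B2": np.array([8, 9, 9, 8, 9]),   # reintroduction
}
for phase, values in scores.items():
    print(f"{phase}: mean = {values.mean():.1f}")
```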

  16. Which One to Choose…? Your choice of study design needs to take into account: • your research question • the data available / feasible to collect • the statistical tests available • other details: • are trends / seasonality present?

  17. Making Sure your Study is a Good Quality One Just two things to worry about… • High internal and external validity • Validity and reliability of instruments

  18. Internal and External Validity • Internal validity refers to the lack of confounding variables (related to design) (e.g. can we really conclude that the children’s reading performance improved because of our IV – the intervention we introduced?) • External validity refers to whether we can generalise our results to our target population (related to sampling)

  19. Threats to Internal Validity • regression to the mean • mortality • compensatory rivalry • maturation • diffusion of benefit • experimenter bias

  20. External Validity • Can we generalise our findings to other people/places/settings/conditions/etc.? • Related to: • artificiality – does the experimental situation resemble the real world? • sample selection – is our sample different from the population we want to apply our findings to?

  21. High Quality Instruments • Validity: Does your test measure what it claims to? • Reliability: Does it measure it consistently? Reproduced from Trochim (2002) on http://www.socialresearchmethods.net/kb/reliability.htm
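
Internal-consistency reliability is often estimated with Cronbach's alpha. A minimal sketch, computing alpha from scratch on an invented item-response matrix (rows = participants, columns = questionnaire items):

```python
# Sketch: Cronbach's alpha for a small invented item-response matrix.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array, rows = participants, columns = questionnaire items."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the total score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

responses = np.array([
    [4, 5, 4, 4],
    [2, 3, 2, 3],
    [5, 5, 4, 5],
    [3, 3, 3, 2],
    [4, 4, 5, 4],
])
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```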

  22. Relationship between Validity and Reliability • random error undermines RELIABILITY • systematic error undermines VALIDITY

  23. When Is Quality Compromised? • Ethics • Practical issues THINK ABOUT… • How do validity and ethics relate to one another? • Is it ethical to sacrifice validity in a study to make it more ethical?
