
MSc Applied Psychology PYM403 Research Methods


Presentation Transcript


  1. MSc Applied Psychology PYM403 Research Methods: Validity and Reliability in Research

  2. Validity and Reliability in Research
  • Aims of the session are to consider:
    · Internal / external validity
    · Threats to internal validity
    · Considerations and evaluation of external validity
    · Effective research design and implementation
    · Critiquing a journal article

  3. Validity and Reliability: Indicative Reading
  • Books on conducting research methods, e.g., Breakwell, G. M., Hammond, S., & Fife-Schaw, C. (1999). Research Methods in Psychology (2nd ed.). London: Sage.
  • Internet sites, e.g., http://www.socialresearchmethods.net/kb/contents.php
    · Validity: http://www.socialresearchmethods.net/kb/intval.php
    · Reliability: http://www.socialresearchmethods.net/kb/reltypes.php

  4. Internal / External Validity
  • Internal Validity – the extent to which we can conclude a causal relationship between two variables
  • External Validity – the extent to which we can generalise from the sample, settings, variables manipulated, and variables measured
  • Research involves a tension between internal and external validity

  5. Internal Validity
  • A "true" experiment is better able to conclude that the IV has an effect on the DV
  • Features of a true experiment include manipulation of IVs, randomisation techniques, and control of other variables (a randomisation sketch follows this slide)
  • "Correlation does not imply causation"
  • What about other quantitative designs that look for "differences between groups/conditions" and hope for a causal explanation?
  • Various threats to internal validity arise when the design is not a true experiment; these can mean rival hypotheses, which may or may not be plausible
  • Your job as researcher is to spot them and discuss them
  • Despite these limitations, it is still very important to carry out studies "in the field"
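A minimal sketch of the randomisation feature mentioned above, assuming a simple two-condition design; the function name, participant IDs, and condition labels are invented for illustration and are not part of the original slides.

```python
import random

def randomly_assign(participants, conditions=("treatment", "control"), seed=None):
    """Randomly assign each participant to one condition.

    Shuffling the list before dealing it out gives every participant an
    equal chance of landing in any condition, which is the feature of a
    true experiment that helps rule out selection bias.
    """
    rng = random.Random(seed)            # seeded generator for reproducibility
    shuffled = list(participants)        # copy so the caller's list is untouched
    rng.shuffle(shuffled)
    assignment = {c: [] for c in conditions}
    for i, p in enumerate(shuffled):
        # deal participants to conditions in round-robin order
        assignment[conditions[i % len(conditions)]].append(p)
    return assignment

# Example: eight hypothetical participant IDs split across two conditions
print(randomly_assign([f"P{i}" for i in range(1, 9)], seed=42))
```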

  6. Threats to Internal Validity (Single Group)
  • History – events between the 1st and 2nd measurement
  • Maturation – changes in participants over time per se (i.e. not events), e.g., older, more hungry, less motivated
  • Testing – the effect of taking the test once on the scores obtained the second time
  • Instrumentation – changes in the instrument or in observers / scorers over time
  • Statistical regression – particularly when groups are selected on the basis of extreme scores (selection bias); see the simulation after this slide
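A small simulation of the statistical regression threat, written as a sketch under the assumption that each observed score is a stable true score plus random measurement noise; the means, cut-off, and sample size are all illustrative.

```python
import random

random.seed(1)

# 1,000 simulated people: observed score = stable true score + random noise
true_scores = [random.gauss(100, 10) for _ in range(1000)]
test1 = [t + random.gauss(0, 10) for t in true_scores]
test2 = [t + random.gauss(0, 10) for t in true_scores]

# Select an "extreme" group purely on the basis of their first score
extreme = [i for i, s in enumerate(test1) if s > 120]

mean1 = sum(test1[i] for i in extreme) / len(extreme)
mean2 = sum(test2[i] for i in extreme) / len(extreme)

# With no intervention at all, the group selected for extreme scores
# drifts back towards the population mean on retest, which is a rival
# explanation for any apparent "improvement" or "decline".
print(f"Extreme group, test 1 mean: {mean1:.1f}")
print(f"Extreme group, test 2 mean: {mean2:.1f}")
```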

  7. Threats to Internal Validity (Multiple Groups)
  • Selection bias – differential selection of respondents for comparison groups (self-selected groups can be a problem)
  • Sources of bias interact, e.g.:
    · Selection-history threat
    · Selection-maturation threat
    · Selection-testing threat
    · Selection-instrumentation threat
    · Selection-regression threat
  • Mortality – differential loss of participants from groups

  8. Threats to Internal Validity (Social Interaction Threats)
  • Diffusion of Treatment – "second-hand training"
  • Compensatory Equalization of Treatment
  • Compensatory Rivalry
  • Demoralization Effects
  • Local History – differences in the conditions in which groups were tested

  9. External Validity
  • Research would ideally maximise both internal and external validity – this is not always possible
  • In general, a true experiment will maximise internal validity, probably at the expense of external validity, while a field study or observational research will gain in external validity while losing some internal validity
  • It is still very important to carry out real-life empirical research – we just need to be aware of any limitations in interpretation

  10. External Validity: Considerations
  • Sample Modelling – first identify the population you want to generalise to, then draw a sample from that population (see the sampling sketch after this slide)
  • Proximal Similarity (Gradient of Similarity) – identify factors that are more or less similar to the original study
  • Threats to External Validity:
    · Unusual people
    · Unusual places
    · Unusual times
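A minimal sketch of the sample-modelling idea, assuming the sampling frame can be represented as a simple list; the population list and sample size are invented for illustration.

```python
import random

# Stand-in for whatever sampling frame the study identifies
# (e.g. a register of eligible participants); names are invented.
population = [f"Person_{i}" for i in range(1, 501)]

random.seed(7)
sample = random.sample(population, k=50)   # simple random sample of 50
print(sample[:5])
```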

  11. Proximal Similarity

  12. External Validity: Evaluation
  • Population Selection – generalizability?
  • Operational Definitions – are IVs / DVs operationally defined?
  • Parameter Values – are there reference groups / norms?
  • Pretest – could a pretest have influenced performance?
  • Demand Characteristics
  • Hawthorne Effect – were groups made to feel special?
  • Pygmalion Effect – were subtle cues given?

  13. Reliability
  • Inter-Rater or Inter-Observer Reliability – used to assess the degree to which different raters/observers give consistent estimates of the same phenomenon
  • Test-Retest Reliability – used to assess the consistency of a measure over time
  • Parallel-Forms Reliability – used to assess the consistency of the results of two tests constructed in the same way from the same content domain
  • Internal Consistency Reliability – used to assess the consistency of results across items within a test (a computational sketch of two of these estimates follows this slide)
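A minimal sketch of how two of these estimates are commonly computed, assuming numeric scores; the function names and example data are invented for illustration, and test-retest reliability is taken here as a simple Pearson correlation between the two administrations.

```python
import numpy as np

def test_retest_reliability(time1, time2):
    """Test-retest reliability as the Pearson correlation between the
    same measure administered at two time points."""
    return np.corrcoef(time1, time2)[0, 1]

def cronbach_alpha(scores):
    """Internal consistency (Cronbach's alpha) for a participants x items
    matrix: alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1).sum()
    total_variance = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Illustrative data: 5 participants answering 4 questionnaire items
items = [[4, 5, 4, 5],
         [2, 3, 2, 2],
         [5, 5, 4, 4],
         [3, 3, 3, 4],
         [1, 2, 1, 2]]
print(f"Cronbach's alpha: {cronbach_alpha(items):.2f}")
print(f"Test-retest r:    {test_retest_reliability([10, 12, 9, 15, 11], [11, 13, 10, 14, 12]):.2f}")
```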

  14. Orienting Questions
  • Try to identify the design of the main study.
  • Is it a "true experiment" or not?
  • What are the variables, and are they manipulated?
  • What are the threats to validity? Do the authors try to address them?
  • What other questions are addressed, or other statistics used?
  • What are the good points? Any criticisms?

  15. Critiquing Research
  • Purpose is critical evaluation:
    1. Need to read with understanding
    2. Analyse content
    3. Evaluate
  • Should be constructively critical (not simply negative)
  • Appreciate contributions to psychology
  • Consider methodology and ingenuity of design
  • Should be objective
  • Use the third person; avoid personal opinions

  16. Critiquing Research: Structure
  • Begin by summarising the research (very briefly): background, methodology, findings, conclusions
  • Then critique the work
  • Then briefly summarise and make your conclusions

  17. Critiquing Research: Considerations
  • Methodology
    · Internal / external validity – threats to these?
    · Appropriate design, sampling, operationalization of variables, procedure, data collection, materials, etc.
    · Reliability – were the measures reliable?
  • Contributions to current psychological knowledge
    · How does it fit the history of research in this area?
    · What does it add to practical / theoretical aspects?
    · How important is the study in the wider world?

  18. Critiquing a Journal Article
  • Practice paper for critiquing:
  • Rosenthal, R., & Fode, K. L. (1963). The effect of experimenter expectation on the performance of the albino rat. Behavioral Science, 8(3), 183-189.
