
Psychological Research and Scientific Method



Presentation Transcript


  1. Psychological Research and Scientific Method

  2. Experimental Method - Laboratory • The researcher controls as many variables as possible • Who, what, where, when, how • Can be conducted anywhere, as long as it is controlled • e.g. Bandura’s Bobo doll study

  3. Experimental Method – Field and Natural • A field experiment is carried out in the real world, but the IV is still manipulated by the experimenter • In a natural experiment the IV occurs naturally; the experimenter simply records its effect on the DV

  4. Studies using correlational analysis • Involves measuring the relationship between two or more variables to see whether a trend exists • Positive correlation: as one variable increases, the other increases • Negative correlation: as one variable increases, the other decreases • Correlation coefficient: a number expressing the degree to which the two variables are related; +1 is a perfect positive correlation, -1 is a perfect negative correlation and 0 means no correlation
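A minimal sketch of what a correlation coefficient looks like in practice, assuming Python with scipy (the slides name no software) and invented revision data:

```python
# Sketch: computing a correlation coefficient for two variables.
# The data values are invented for illustration; scipy is assumed available.
from scipy.stats import pearsonr

hours_revised = [2, 4, 6, 8, 10]       # variable 1
exam_score    = [35, 48, 60, 71, 88]   # variable 2

r, p_value = pearsonr(hours_revised, exam_score)
print(f"correlation coefficient r = {r:.2f}")  # close to +1: strong positive correlation
```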

  5. Observational techniques Most psychological studies include some form of observation; it can be used within laboratory experiments, but it is often used as the main research method in its own right • Participant observation: the observer becomes actively involved in the activity being studied, e.g. playing the games, which gives the researcher a more hands-on view • Non-participant observation: the researcher merely observes the activities from outside

  6. Questionnaires A written method of data collection: a list of questions focused on people’s behaviour, opinions and attitudes. There are two types of questions: • Closed (fixed-choice) questions • Open questions

  7. Interviews Questions asked in a face-to-face situation • Formal, structured interviews: the interviewer asks a fixed set of questions and writes down the responses; the interview is identical for every participant • Informal, unstructured interviews: less controlled and more like an informal discussion; the general topics are determined but the direction of the discussion is not, so interviewers need a great deal of training.

  8. Case Studies • Case studies allow researchers to carry out an in-depth, detailed investigation of an individual • Bromley argued that case studies are the “bedrock of scientific investigation”

  9. Experimental Design • Repeated measures – the same participants are tested in all conditions, so there are no individual differences between conditions and fewer participants are needed, which gives more consistent results. However, there can be order effects because participants know what is coming. • Independent groups – different participants are used in each condition, usually randomly allocated to conditions to balance out any differences (see the allocation sketch below). There are no order effects and demand characteristics are reduced, but there may be differences between the groups. • Matched pairs – different but similar participants are used in each condition, with each participant matched to a participant in the other group; identical twins are the perfect matched pair. There are fewer group differences than in an independent groups design, but matching is difficult and time-consuming.
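The random allocation mentioned for the independent groups design can be sketched as follows; Python and the participant labels are assumptions made for illustration:

```python
# Sketch: randomly allocating participants to two conditions
# (independent groups design). Participant labels are invented.
import random

participants = ["P01", "P02", "P03", "P04", "P05", "P06", "P07", "P08"]
random.shuffle(participants)              # randomise the order
half = len(participants) // 2
condition_a = participants[:half]         # e.g. experimental condition
condition_b = participants[half:]         # e.g. control condition
print("Condition A:", condition_a)
print("Condition B:", condition_b)
```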

  10. The major features of science The key features of a science are objectivity, control and manipulation of variables, replication and falsification. Objectivity: taking an objective approach to an issue means having due regard for the known valid evidence. Control and manipulation of variables: for a ‘fair test’, other variables must be controlled so that the effect of the IV on the DV can be checked. Replication: can the study be repeated? Methods must be recorded carefully and valid measures used. Falsification: the principle that a proposition or theory cannot be considered scientific if it does not admit the possibility of being shown to be false.

  11. Scientific processing Karl Popper’s hypothetico-deductive model of science Popper argued that theories and laws about the world should be used to generate expectations and hypotheses, which can then be tested using experiments or other methods. Science therefore does not rely on chance observation but on deliberately organised opportunities to make observations. • Making observations and producing facts (data about the world) • Constructing a theory to account for a set of related facts • Generating expectations (hypotheses) from the theory • Collecting data to test the expectations • Adjusting the theory according to the data collected

  12. Scientific Process Developing Further: an alternative to the scientific approach New paradigm research The idea that psychology does not operate in a social vacuum: people have to be interacted with in order to be studied. We should study the whole person rather than one aspect of behaviour (as is done in experiments). The new paradigm looks at social context, people as a whole, and feelings and thoughts, and accepts that research cannot always be objective.

  13. Is Psychology a science?

  14. The validation of new knowledge • Publication – the ultimate goal of research • Published in journals • Peer review is essential and should check for: • Validity • Correct analysis of data • Accuracy • Quality • Problems with peer review • Fabrication (data is made up) • Falsification (data exists but has been altered) • Plagiarism (work has been copied from others) • Issues: gender and institution bias, the file drawer phenomenon, and research may be rejected if it is not in keeping with current theory

  15. The role of peer review • Scientists publish the results of their research in academic journals. These articles are subject to peer review (also known as ‘refereeing’), where other psychologists read the article and judge whether the research is credible and valid. • Critics suggest that impartial review is an unachievable ideal, because research is conducted in a social world and social relationships affect objectivity.

  16. Sampling strategies

  17. Implications of sampling strategies • Bias – institution bias • Generalising • Representative samples • Demand characteristics • Investigator effects • Most research is androcentric (conducted on men) • Most research is Western (conducted in individualistic societies) • These issues also link to samples: most are volunteer or opportunity samples.
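A representative sample is most directly approached by random sampling, where every member of the target population has an equal chance of being selected. A minimal sketch, assuming Python and an invented population:

```python
# Sketch: drawing a simple random sample from a target population so that
# every member has an equal chance of selection. The population is invented.
import random

population = [f"Student{n:03d}" for n in range(1, 201)]  # target population of 200
sample = random.sample(population, 20)                   # simple random sample of 20
print(sample)
```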

  18. Issues of reliability Reliability refers to the consistency of the research. Can the research be replicated? If the experiment were carried out again, would you get the same results? If two researchers observed the same thing, would they rate it similarly? The more reliable the research, the more confidence we have in its findings. Types of reliability: • Internal reliability – the consistency of a measure within a test, usually assessed with the split-half method • External reliability – whether the results of a study can be replicated, usually assessed with the test-retest method Improving reliability • Inter-rater reliability (correlating raters’ scores) • Test-retest reliability • Averaging scores • Pilot studies
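Both the test-retest and split-half methods come down to correlating two sets of scores. A minimal sketch, assuming Python with scipy and invented scores:

```python
# Sketch: assessing reliability with correlations (scipy assumed; scores invented).
from scipy.stats import pearsonr

# Test-retest reliability: the same participants take the same test twice.
test_1 = [12, 15, 9, 20, 17, 11]
test_2 = [13, 14, 10, 19, 18, 12]
r_retest, _ = pearsonr(test_1, test_2)
print(f"test-retest reliability r = {r_retest:.2f}")   # close to +1 = consistent over time

# Split-half reliability: correlate scores on one half of the test items
# with scores on the other half.
odd_items  = [6, 8, 4, 10, 9, 5]
even_items = [6, 7, 5, 10, 8, 6]
r_split, _ = pearsonr(odd_items, even_items)
print(f"split-half reliability r = {r_split:.2f}")     # high r = internally consistent
```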

  19. Assessing and improving validity (internal and external) Types and assessment • Internal validity – does the study test the hypothesis? Assessed by face validity and criterion validity (concurrent, predictive) • External validity – ecological validity and population validity Improving validity • Internal – control all confounding variables; in repeated measures designs, reduce order effects by counterbalancing (see the sketch below) and use a double-blind technique • External – use a random sample • If a study is designed to test the detail of a theory, high internal validity is preferred • If a study is designed with generalisation in mind, external validity is preferred.
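Counterbalancing, mentioned above for reducing order effects in repeated measures designs, simply alternates the order in which participants complete the conditions. A minimal sketch (Python and the participant labels are illustrative assumptions):

```python
# Sketch: counterbalancing the order of two conditions in a repeated
# measures design to control order effects. Participant labels are invented.
participants = ["P01", "P02", "P03", "P04", "P05", "P06"]

orders = {}
for i, p in enumerate(participants):
    # Half the participants do condition A then B, the other half B then A.
    orders[p] = ["A", "B"] if i % 2 == 0 else ["B", "A"]

for p, order in orders.items():
    print(p, "->", " then ".join(order))
```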

  20. Ethics • Humans (BPS guidelines) • Informed consent, debriefing, confidentiality, right to withdraw, protection from harm • Care must be taken with socially sensitive issues, e.g. race, social tension, homophobia • Animals • Moral justification • Speciesism (just because we can talk, does that mean we can control everything?) • Pain and emotion are difficult to judge in animals • Animal research can give greater control and objectivity • Animal research is regulated and can be conducted ethically

  21. Ethical considerations in the design and conduct of psychological research Researchers in the UK must adhere to the guidelines of the British Psychological Society

  22. Probability and significance • If we want to know whether men have better memories than women, we might take a selection of each gender and compare their scores. It is very unlikely that the scores will be identical; what we want to know is whether the difference between the scores is significant – that is, whether the researcher can conclude there is a real difference in memory capacity. • The idea of ‘chance’ is related to certainty: you can never be completely sure whether an observed effect was due to chance, so psychologists state how certain they are. • In general psychologists use a probability of p ≤ 0.05 – this means there is at most a 5% probability of the observed results occurring by chance if there is no real relationship between the variables being tested. • The chosen value of p is the significance level
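A minimal sketch of testing whether a difference between two groups is significant at the 5% level, assuming Python with scipy and invented memory scores (the Mann-Whitney test is used here simply because a later slide names it for independent samples):

```python
# Sketch: is the difference between two groups' memory scores significant
# at p <= 0.05? Scores are invented; scipy is assumed available.
from scipy.stats import mannwhitneyu

men   = [12, 15, 11, 14, 13, 16, 10, 12]
women = [14, 17, 15, 16, 13, 18, 15, 14]

statistic, p = mannwhitneyu(men, women, alternative="two-sided")
print(f"p = {p:.3f}")
if p <= 0.05:
    print("Difference is significant at the 5% level; reject the null hypothesis")
else:
    print("Difference is not significant; retain the null hypothesis")
```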

  23. Type 1 and type 2 errors • A type 1 error occurs when a true null hypothesis is mistakenly rejected – the researcher concludes there is an effect when in fact there is not (a false positive) • A type 2 error occurs when a false null hypothesis is mistakenly accepted – the researcher concludes there is no effect when in fact there is one (a false negative)
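The chosen significance level fixes the type 1 error rate: even when two samples come from identical populations, about 5% of tests will still come out “significant” at p ≤ 0.05. A small simulation sketch, assuming Python with numpy and scipy:

```python
# Sketch: simulating the type 1 error rate. Both groups are drawn from the
# same population, so every "significant" result is a type 1 error.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
trials = 1000
false_positives = 0
for _ in range(trials):
    group_a = rng.normal(loc=50, scale=10, size=30)   # same population
    group_b = rng.normal(loc=50, scale=10, size=30)   # same population
    _, p = ttest_ind(group_a, group_b)
    if p <= 0.05:
        false_positives += 1                          # a type 1 error

print(f"Type 1 error rate: {false_positives / trials:.1%}")  # roughly 5%
```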

  24. Factors affecting choice of statistical test To decide which statistical test to use, ask yourself the following questions (each test appears in the sketch below): • Are your data in frequencies? If so, use chi-squared • Are you looking to find out whether your two samples are different or correlated? If you are testing whether two samples are different, use Mann-Whitney or Wilcoxon; if you are looking for a correlation, use Spearman’s rho • Experimental design: are your samples related (e.g. a repeated measures design was used)? If they are related use Wilcoxon; if they are independent use Mann-Whitney • Level of measurement: • Nominal – data are in categories • Ordinal – data are ordered in some way • Interval – data are measured in units of equal intervals • Ratio – there is a true zero point
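For reference, the tests named on this slide are all available in scipy.stats; the sketch below runs each one on invented data (Python and scipy are assumptions, not named in the slides):

```python
# Sketch: the four tests from this slide as they appear in scipy.stats.
# All data are invented for illustration.
from scipy.stats import chi2_contingency, mannwhitneyu, wilcoxon, spearmanr

# Frequencies (nominal data) -> chi-squared
observed = [[20, 10],
            [15, 25]]
chi2, p_chi, dof, expected = chi2_contingency(observed)

# Difference, two independent (unrelated) samples -> Mann-Whitney
p_mw = mannwhitneyu([3, 5, 7, 9], [4, 6, 8, 10], alternative="two-sided").pvalue

# Difference, two related samples (repeated measures) -> Wilcoxon
p_wil = wilcoxon([3, 5, 7, 9], [4, 7, 10, 14]).pvalue

# Correlation between two ordinal variables -> Spearman's rho
rho, p_sp = spearmanr([1, 2, 3, 4, 5], [2, 1, 4, 3, 5])

print(p_chi, p_mw, p_wil, rho)
```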

  25. The use of inferential analysis

  26. Quantitative VS Qualitative

  27. Analysis and interpretation of quantitative data – graphs • Histogram • Bar chart • Correlation – be able to read a correlation coefficient • The closer to 1, the stronger the correlation • The closer to 0, the weaker the correlation • A minus sign tells you the correlation is negative; no sign before the number means it is positive.
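A minimal plotting sketch, assuming Python with matplotlib and invented scores, showing the basic difference between a histogram (continuous data grouped into intervals) and a bar chart (separate categories):

```python
# Sketch: a histogram and a bar chart side by side (matplotlib assumed;
# data invented for illustration).
import matplotlib.pyplot as plt

scores = [12, 15, 15, 16, 18, 18, 18, 20, 21, 23, 25, 25, 27, 30]

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))

# Histogram: continuous scores grouped into intervals.
ax1.hist(scores, bins=5)
ax1.set_title("Histogram of scores")

# Bar chart: discrete categories (e.g. mean score per condition).
ax2.bar(["Condition A", "Condition B"], [18.5, 22.1])
ax2.set_title("Mean score per condition")

plt.tight_layout()
plt.show()
```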

  28. Analysis and interpretation of quantitative data • Central tendency • Mean, median, mode • These tell us the middle or most frequently occurring values • Used to compare data from two sets of scores, i.e. two groups/conditions • Measures of dispersion • Standard deviation and range • These describe the spread of scores, i.e. how much variance there is around a central score • Semi-interquartile range • Standard deviation
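A minimal sketch of these descriptive statistics using Python’s built-in statistics module, with invented scores:

```python
# Sketch: measures of central tendency and dispersion for one invented
# set of scores, using the standard library's statistics module.
import statistics

scores = [12, 15, 15, 16, 18, 20, 22, 25, 30]

print("mean   =", statistics.mean(scores))       # arithmetic average
print("median =", statistics.median(scores))     # middle value
print("mode   =", statistics.mode(scores))       # most frequent value

print("range  =", max(scores) - min(scores))                       # highest minus lowest
print("standard deviation =", round(statistics.stdev(scores), 2))  # spread around the mean

# Semi-interquartile range: half the distance between the upper and lower quartiles.
q1, _, q3 = statistics.quantiles(scores, n=4)
print("semi-interquartile range =", (q3 - q1) / 2)
```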

  29. Conventions of reporting psychological investigations The convention for psychological reports is to divide them into the following sections: • A short summary or abstract at the beginning, providing key details about the aims, participants, research methods and procedures, findings and conclusions • An introduction outlining the previous research that led the researcher to their hypothesis • A description of the method and procedures, detailed enough that somebody could replicate the study in the future; any questionnaires or tests used should be included as an appendix • The results, described and summarised from the raw data, together with other descriptive statistics and finally the inferential statistics indicating the significance of the results • A discussion of the results, including references to other studies and suggestions for alterations.

  30. Basic sections of report • Title • Abstract • Introduction • Aim/hypothesis • Method (design, procedure, participants, materials) • Results • Discussion • References • Appendices
