
The Need for Psychological Science


Presentation Transcript


  1. The Need for Psychological Science

  2. The Need for Psychological Science: Examples of Faulty Reasoning • Hindsight bias – the “I-knew-it-all-along” phenomenon • Overconfidence – the tendency to think we know more than we do • Confirmation bias – the tendency to search for information that confirms one’s preconceptions • False consensus effect – the tendency to overestimate the extent to which others share our beliefs • Belief bias – the tendency for one’s preexisting beliefs to distort logical reasoning • Belief perseverance – the tendency to cling to one’s conceptions after the basis on which they were formed has been discredited

  3. Research Methods in Psychology • Descriptive Methods • Naturalistic Observation • Case Studies • Surveys • Correlational research (may include surveys, interviews, tests, naturalistic observation, longitudinal, or cross-sectional studies) • Experimental • Quasi-Experimental (no random assignment to condition) • SLG Distinguish among the various research methods and be able to identify the relative advantages and disadvantages of each

  4. Research Methods in Psychology

  5. Descriptive Research Methods • Naturalistic observation • Advantages • Avoids observer effect/reactivity (of subjects) • Provides ideas for further research • Disadvantages • Potentially time consuming • No control over variables or extraneous variables • Not replicable • Examples – Piaget, naturalistic examples, quasi… • SLG Discuss the relative advantages and disadvantages of naturalistic observation and provide an example of this type of research

  6. Descriptive Research Methods • Surveys, interviews, questionnaires and tests • Advantages • Relatively inexpensive, easy way of collecting large amounts of data (attitudes, interests, aptitudes) • Assuming a true random sample – generalizable • Disadvantages • Poor construction or administration of questions • Poor sample = unrepresentative (not generalizable) • Measures beliefs, not behaviors • Issues of self-report, memory and honesty • SLG Discuss the relative advantages and disadvantages of surveys/questionnaires and identify an example of this type of research

  7. Descriptive Research Methods • Case studies – of individuals, groups or phenomena • Advantages • Potentially deeply revealing about individuals • Disadvantages • No experimental control • Sample size extremely small – generalizability? • Potential bias, both subject and experimenter • Examples – Phineas Gage, Freud and Little Hans, H.M. (NPR) • SLG Discuss the relative advantages and disadvantages of case studies and provide an example of this type of research

  8. Descriptive Research Methods • Archival Research • Advantages • Enormous amounts of existing data can be used to see trends, relationships and outcomes • Disadvantages • No control over how the data were collected or whether they are reliable • Examples – analysis of studies conducted by other researchers, or examination of historical data (e.g., the Wild Child) • SLG Discuss the relative advantages and disadvantages of archival research and provide an example

  9. Research Methods • Longitudinal method – Examples? Advantages/Disadvantages? • Cross-sectional method – Advantages? • Cross-cultural method – Purposes?

  10. Correlational Studies • Correlational studies look at the degree of relationship between variables, not the effect of one variable on another variable • Correlation DOES NOT equal causation. A relationship may be suggested, but it does not prove that one variable causes the other to change. For example, a correlational study may suggest a relationship between academic success and self-esteem, but it does not mean that academic success causes increases in self-esteem… • C&C headlines • SLG Provide an example of a correlational study and explain why it does not prove causality • Distinguish between causal and correlational claims

  11. Scatterplots and Correlation • The correlation coefficient (Pearson product-moment correlation coefficient) indicates three types of relationship • +1.00 = positive (or direct) correlation • -1.00 = negative (indirect) correlation • 0 = no correlation • SLG Distinguish between positive, negative and no correlation
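
A minimal sketch (not from the slides) of how a Pearson product-moment correlation coefficient can be computed; the variable names and data values are made up purely for illustration:

```python
import numpy as np

# Hypothetical paired observations for two variables
study_hours = np.array([2, 4, 5, 7, 9])
exam_scores = np.array([55, 62, 70, 80, 88])

# np.corrcoef returns the 2x2 correlation matrix; r is an off-diagonal entry
r = np.corrcoef(study_hours, exam_scores)[0, 1]
print(f"r = {r:+.2f}")  # near +1.00 -> strong positive (direct) correlation
```

A value near +1.00 indicates a positive (direct) relationship, near -1.00 a negative (indirect) relationship, and near 0 little or no linear relationship.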

  12. Correlational Studies

  13. Correlational Studies • Examples of Positive Correlation • 1. SAT scores and college grades – those with higher SAT scores also have higher grades in college • 2. Happiness and helpfulness – as people’s happiness level increases, so does their helpfulness • Examples of Negative Correlation • 1. Education and years in jail – people who have more years of education tend to have fewer years in jail • 2. Crying and being held – babies held less tend to cry more • SLG Distinguish between positive and negative correlation

  14. Correlational Studies – Problems • Illusory correlation – detecting relationships where none exist (weather = cold). Other examples? • Third-variable problem • Research showed a strong correlation between contraceptive use and number of electrical appliances in the home (Li, 1975). Why? • Correlation Methods Review WS • SLG Provide examples of problems related to correlational claims
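
A minimal sketch (illustrative only; these are simulated numbers, not the Li, 1975 data) of how a hidden third variable can produce a strong correlation between two variables that have no causal link to each other:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical hidden third variable: household income
income = rng.normal(50, 10, size=1000)

# Both measured variables are driven by income, not by each other
appliances = 0.2 * income + rng.normal(0, 1, size=1000)
contraceptive_use = 0.2 * income + rng.normal(0, 1, size=1000)

r = np.corrcoef(appliances, contraceptive_use)[0, 1]
print(f"r = {r:.2f}")  # clearly positive even though neither causes the other
```

The correlation is real, but it reflects the shared influence of the third variable (income), not causation between the two measured variables.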

  15. Experimentation • Important Terms/Concepts. Most know…must know • hypothesis • independent/dependent variables • operational definitions (quantifiable) • population and random/stratified sample • representative sample • generalizability • experimental and control group (or condition) • random assignment • placebo use and effect • confounding variables • single and double blind procedures • statistical method/significance • replication
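
Several of these terms (experimental and control groups, random assignment, statistical significance) come together in the minimal sketch below; the group sizes, means, and the .05 cutoff are illustrative assumptions, not values from the slides:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical outcome scores for randomly assigned groups
control = rng.normal(70, 10, size=30)        # control condition (e.g., placebo)
experimental = rng.normal(76, 10, size=30)   # experimental condition (treatment)

# Independent two-sample t-test on the dependent variable
t, p = stats.ttest_ind(experimental, control)
print(f"t = {t:.2f}, p = {p:.3f}")

# By common convention, p < .05 is treated as statistically significant
print("significant" if p < 0.05 else "not significant")
```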

  16. Define your Population • Population – the group researchers wish to study • All humans? • People with depression? • Adolescents? • SLG Distinguish between a population and a sample

  17. Sampling • Sample – a subgroup of your population • In order for results to be generalizable to the population, a sample must be representative (size is key) • Random sample – everyone in the population has an equal chance of being in your sample • SLG Explain the relationship among the concepts of random sampling, representativeness, and generalizability
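
A minimal sketch (hypothetical roster and sample size) of drawing a simple random sample, so that every member of the population has an equal chance of selection:

```python
import random

# Hypothetical population: a roster of 500 students
population = [f"student_{i}" for i in range(1, 501)]

# Simple random sample of 50, drawn without replacement
sample = random.sample(population, k=50)
print(sample[:5])
```

The larger and more truly random the sample, the more likely it is to be representative, and therefore the more safely results can be generalized to the population.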

  18. Operational Definitions (for Variables) • Variables should be clearly defined and quantifiable • Operational definitions reduce subjectivity and expectancy effects and allow for replication • SLG Explain what is meant by an operational definition. Provide an example of an operational definition for both the independent and dependent variable in a given experiment

  19. Independent and Dependent Variables • Practice in Identifying Variables • IV/DV Exercises • SLG Distinguish between independent and dependent variables

  20. Confounding (Hidden) Variables • Confounding variables – variables in a study that are not controlled for (outside factors, e.g.?) • Ways to control for confounding variables • Large sample size (more apt to be representative) • Random assignment to groups (control and experimental) • Blinding – single vs. double • Single controls for reactivity (observer effects) • Double controls for expectancy effects (researcher bias) • Placebos or sham treatment • SLG Explain what is meant by a confounding variable IYOW
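
A minimal sketch (hypothetical participant list) of random assignment, which spreads potential confounding variables roughly evenly across the control and experimental groups:

```python
import random

# Hypothetical sample of 40 participants
participants = [f"participant_{i}" for i in range(1, 41)]

# Shuffle, then split in half so assignment to condition is random
random.shuffle(participants)
half = len(participants) // 2
experimental_group = participants[:half]   # receives the treatment
control_group = participants[half:]        # receives the placebo / sham treatment

print(len(experimental_group), len(control_group))  # 20 20
```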

  21. Research pitfalls • Experimenter Bias • Self-fulfilling prophecy – the experimenter arrives at conclusions that support his/her hypotheses based on the need to do so, not on the data • Halo effect – the tendency for people to transfer a positive opinion based on irrelevant information, e.g., people tend to think that more attractive people are also smarter • SLG Provide an explanation for and examples of different kinds of experimenter bias

  22. Research pitfalls • Observer effect (aka reactivity) – the effect the experimenter’s presence has on subjects • The Hawthorne effect is the tendency for behavior to change simply because subjects are aware an experiment is being conducted • Social desirability bias is the tendency for subjects to respond in a way that they believe would be most socially desirable • SLG Define and provide an example of different kinds of observer effects

  23. Ethics in Experimentation • APA Requirements/Guidelines - Ethical Principles of Psychologists and Code of Conduct (2002) • Human experimentation must cause no harm • Informed consent • Confidentiality • Debriefing • Research institutions must have an Institutional Review Board (IRB) • Role of deception? (Baumrind) • SLG Know, describe and apply the APA ethical guidelines

  24. Animal experimentation • Controversies • Institutional Animal Care and Use Committees • Appropriate Beneficial and Caring (ABC) Guidelines • Issues of anthropomorphism, generalization, and anthropocentrism • SLG Explain the controversies related to animal experimentation in psychology
