
Science and Methods in I-O Psychology

This chapter explores methods of data collection and analysis and the role of science in I-O psychology. It discusses the importance of objectivity, the use of scientific testimony in court, and the impact of research on HR decision making.


Presentation Transcript


  1. Chapter 2: Methods and Statistics in I-O Psychology

  2. Module 1: Science • What is science? • Science has common methods • Science is a logical approach to investigation • Based on a theory, hypothesis or basic interest • Science depends on data • Gathered in a laboratory or the field

  3. Common Methods (cont'd) • Research must be communicable, open, & public • Research published in journals, reports, or books 1) Methods of data collection described 2) Data reported 3) Analyses displayed for examination 4) Conclusions presented

  4. Common Methods (cont'd) • Scientists set out to disprove theories or hypotheses • Goal: Eliminate all plausible explanations except one • Scientists are objective • Expectation that researchers will be objective & not influenced by biases or prejudices

  5. Role of Science in Society • Expert witnesses in a lawsuit • Permitted to voice opinions about practices • Often a role assumed by I-O psychologists

  6. Daubert Challenge • Challenging testimony of an expert on the grounds that it is not scientifically credible • Daubert v. Merrell Dow Pharmaceuticals (1993) • Resulted in the introduction of a method for distinguishing between “legitimate science” & “junk science”

  7. Scientific Testimony in Court • Theories presented in court must: • Be recognized by particular scientific area as worthy of attention • Be peer reviewed or subjected to scientific scrutiny • Have a known “error rate” • Be replicable or testable by other scientists

  8. Module 1 (cont'd) • Why do I-O psychologists engage in research? • Better equip HR professionals to make decisions in organizations • Provide an aspect of predictability to HR decisions

  9. Module 2: Research • Research design • Experimental • Random assignment of participants to conditions • Conducted in a laboratory or the field • Quasi-experimental • Non-random assignment of participants to conditions
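
To make random assignment concrete, here is a minimal Python sketch (not from the chapter) that shuffles a hypothetical participant list and splits it into two conditions; the participant labels and group names are invented for illustration.

```python
# Minimal sketch of random assignment to two conditions; participant labels
# and group names are hypothetical.
import random

participants = [f"P{i:02d}" for i in range(1, 21)]  # 20 hypothetical participants
random.shuffle(participants)                        # randomize the order

midpoint = len(participants) // 2
treatment, control = participants[:midpoint], participants[midpoint:]

print("Treatment condition:", treatment)
print("Control condition:  ", control)
```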

  10. Research Design (cont'd) • Non-Experimental • Doesn’t include “treatment” or assignment to different conditions • 2 common designs: • Observational design • Survey design

  11. Methods of Data Collection • Quantitative methods • Rely on tests, rating scales, questionnaires, & physiological measures • Yield results in terms of numbers

  12. Methods of Data Collection • Qualitative methods • Include procedures like observation, interview, case study, & analysis of written documents • Generally produce flow diagrams & narrative descriptions of events/processes

  13. Quantitative & Qualitative Research • Not mutually exclusive • Triangulation • Examining converging information from different sources (qualitative and quantitative research).

  14. Generalizability in Research • Application of results from one study or sample to other participants or situations • Every time a compromise is made, the generalizability of results is reduced

  15. Figure 2.1: Sampling Domains for I-O Research

  16. Control in Research • Experimental control • Influences that make results less reliable or harder to interpret are eliminated • Statistical control • Statistical techniques used to control for influences of certain variables
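
The slide above does not name a specific technique; one common example of statistical control is partial correlation, sketched below with simulated data (all variable names and values are invented for illustration).

```python
# Hedged sketch: partial correlation as one form of statistical control,
# i.e., correlating two variables after removing the influence of a third.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
tenure = rng.normal(5, 2, 100)                       # control variable (years)
training = 0.5 * tenure + rng.normal(0, 1, 100)      # predictor influenced by tenure
performance = 0.7 * tenure + rng.normal(0, 1, 100)   # outcome influenced by tenure

def residuals(y, x):
    """Return the part of y not explained by a straight-line fit on x."""
    slope, intercept, *_ = stats.linregress(x, y)
    return y - (intercept + slope * x)

# Correlate training and performance after statistically removing tenure
r_partial, p = stats.pearsonr(residuals(training, tenure),
                              residuals(performance, tenure))
print(f"Partial r controlling for tenure: {r_partial:.2f} (p = {p:.3f})")
```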

  17. Ethics • Ethical standards of the APA • Collection of 61 cases endorsed by SIOP • Illustrates ethical issues likely to arise in I-O psychology

  18. Module 3: Data Analysis • Descriptive statistics • Summarize, organize, & describe a sample of data • Frequency distribution: horizontal axis = scores running from low to high; vertical axis = frequency of occurrence
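
As a rough illustration of a frequency distribution, the sketch below tallies a small set of invented test scores; each row lists a score (from low to high) and its frequency of occurrence.

```python
# Minimal sketch of a frequency distribution for invented test scores.
from collections import Counter

scores = [72, 85, 85, 90, 67, 72, 85, 78, 90, 61, 72, 78]

# Horizontal axis: scores from low to high; vertical axis: frequency of occurrence
for score, freq in sorted(Counter(scores).items()):
    print(f"{score}: {'#' * freq}  ({freq})")
```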

  19. Describing a Score Distribution • Measures of central tendency • Mean • Mode • Median

  20. Describing a Score Distribution (cont'd) • Variability • Standard deviation • Lopsidedness or skew
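
The two slides above name the standard descriptive indices; the sketch below computes them for a small invented score distribution (a minimal example, not data from the chapter).

```python
# Hedged sketch: central tendency, variability, and skew for invented scores.
import statistics
from scipy import stats

scores = [61, 67, 72, 72, 72, 78, 78, 85, 85, 85, 90, 98]

print("Mean:  ", round(statistics.mean(scores), 2))
print("Median:", statistics.median(scores))
print("Mode:  ", statistics.mode(scores))
print("SD:    ", round(statistics.stdev(scores), 2))
print("Skew:  ", round(stats.skew(scores), 2))   # lopsidedness of the distribution
```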

  21. Descriptive Statistics • Figure 2.2: Two Score Distributions (N = 30)

  22. Figure 2.3: Two Score Distributions (N = 10)

  23. Inferential Statistics • Aid in testing hypotheses & making inferences from sample data to a larger population • Include the t-test, F-test, & chi-square test

  24. Statistical Significance • Defined in terms of a probability statement • Threshold for significance is often set at .05 or lower
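
Putting the last two slides together, the sketch below runs an independent-samples t-test on simulated scores from two conditions and compares the resulting probability to the conventional .05 threshold (the data and condition labels are invented).

```python
# Hedged sketch of an inferential test judged against the .05 threshold.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
new_method = rng.normal(82, 8, 30)   # simulated scores, condition A
old_method = rng.normal(76, 8, 30)   # simulated scores, condition B

t, p = stats.ttest_ind(new_method, old_method)
print(f"t = {t:.2f}, p = {p:.4f}")
print("Statistically significant at .05" if p < .05 else "Not significant at .05")
```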

  25. Statistical Power • Likelihood of finding a statistically significant difference when a true difference exists • The smaller the sample size, the lower the power to detect a true difference
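
One way to see the sample-size point is by simulation: the sketch below repeatedly draws two samples that truly differ and counts how often the t-test reaches p < .05. The effect size, standard deviation, and sample sizes are arbitrary illustrative choices.

```python
# Hedged simulation sketch: estimated power rises with sample size.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def estimated_power(n, true_diff=5.0, sd=10.0, reps=2000):
    """Proportion of simulated studies that detect a real difference at p < .05."""
    hits = 0
    for _ in range(reps):
        a = rng.normal(50, sd, n)
        b = rng.normal(50 + true_diff, sd, n)
        if stats.ttest_ind(a, b).pvalue < .05:
            hits += 1
    return hits / reps

for n in (10, 30, 100):
    print(f"n per group = {n:3d}  estimated power = {estimated_power(n):.2f}")
```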

  26. Concept of Correlation • Positive linear correlation • Figure 2.4: Correlation between Test Scores and Training Grades

  27. Concept of Correlation (cont'd) • Scatterplot • Displays the correlational relationship between 2 variables • Regression line • The straight line that best fits the scatterplot

  28. Correlation Coefficient • Statistic or measure of association • Reflects magnitude (numerical value) & direction (+ or –) of relationship between 2 variables

  29. Correlation Coefficient • Positive correlation → As one variable increases, other variable also increases & vice versa • Negative correlation → As one variable increases, other variable decreases & vice versa
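
The sketch below ties the last few slides together: it computes a Pearson correlation coefficient and the best-fitting regression line for simulated test scores and training grades (an invented example loosely modeled on Figure 2.4).

```python
# Hedged sketch: correlation coefficient and regression line for simulated data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
test_scores = rng.normal(50, 10, 40)
training_grades = 0.6 * test_scores + rng.normal(0, 8, 40)

r, p = stats.pearsonr(test_scores, training_grades)
fit = stats.linregress(test_scores, training_grades)

print(f"r = {r:.2f} (positive: higher test scores go with higher grades)")
print(f"Regression line: grade = {fit.intercept:.1f} + {fit.slope:.2f} * test score")
```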

  30. Figure 2.6: Scatterplots of Various Degrees of Correlation

  31. Curvilinear Relationship • Although correlation coefficient might be .00, it can’t be concluded that there is no association between variables • A curvilinear relationship might better describe the association

  32. Curvilinear Correlation Figure 2.7 An Example of a Curvilinear Relationship
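
The simulation below illustrates the point: two variables with a clear inverted-U relationship can still produce a Pearson correlation near .00 (the variable names and data are invented).

```python
# Hedged sketch: a curvilinear (inverted-U) relationship with near-zero linear r.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
arousal = np.linspace(-3, 3, 200)
performance = -(arousal ** 2) + rng.normal(0, 0.5, 200)   # inverted-U shape

r_linear, _ = stats.pearsonr(arousal, performance)
print(f"Linear r = {r_linear:.2f} (close to zero despite a clear curvilinear pattern)")
```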

  33. Multiple Correlation • Multiple correlation coefficient • Overall linear association between several variables & a single outcome variable
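
A minimal sketch of a multiple correlation coefficient, assuming two invented predictors and one outcome: fit the best linear combination of the predictors, then correlate the predicted values with the actual outcome.

```python
# Hedged sketch of multiple R: correlation between an outcome and its best
# linear combination of several predictors (all data simulated).
import numpy as np

rng = np.random.default_rng(11)
n = 100
ability = rng.normal(0, 1, n)
conscientiousness = rng.normal(0, 1, n)
performance = 0.5 * ability + 0.3 * conscientiousness + rng.normal(0, 1, n)

# Fit performance from both predictors (plus an intercept) via least squares
X = np.column_stack([np.ones(n), ability, conscientiousness])
coefs, *_ = np.linalg.lstsq(X, performance, rcond=None)
predicted = X @ coefs

multiple_R = np.corrcoef(predicted, performance)[0, 1]
print(f"Multiple R = {multiple_R:.2f}")
```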

  34. Meta-Analysis • Statistical method for combining results from many studies to draw a general conclusion • Statistical artifacts • Characteristics of a particular study that distort the results • Sample size is most influential
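
As a bare-bones illustration of the combining step (not the full meta-analytic procedure), the sketch below averages correlations from several invented studies, weighting each by its sample size so that small-sample results carry less influence.

```python
# Hedged sketch: sample-size-weighted mean correlation across invented studies.
studies = [
    {"r": 0.30, "n": 50},
    {"r": 0.22, "n": 200},
    {"r": 0.45, "n": 35},
    {"r": 0.28, "n": 120},
]

weighted_mean_r = (sum(s["r"] * s["n"] for s in studies) /
                   sum(s["n"] for s in studies))
print(f"Sample-size-weighted mean r = {weighted_mean_r:.2f}")
```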

  35. Module 4: Interpretation • Reliability • Consistency or stability of a measure • Test-retest reliability • Calculated by correlating measurements taken at Time 1 with measurements taken at Time 2
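
A minimal sketch of test-retest reliability with simulated data: the same people's scores at Time 1 and Time 2 are correlated, and the resulting coefficient serves as the reliability estimate.

```python
# Hedged sketch: test-retest reliability as the Time 1 / Time 2 correlation.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
time1 = rng.normal(100, 15, 60)
time2 = time1 + rng.normal(0, 5, 60)   # mostly stable scores with some noise

r_tt, _ = stats.pearsonr(time1, time2)
print(f"Test-retest reliability estimate: r = {r_tt:.2f}")
```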

  36. High and Low Test-Retest Reliability • Figure 2.8: Examples of High and Low Test-Retest Reliability: Score Distributions of Individuals Tested on Two Different Occasions

  37. Reliability (cont'd) • Equivalent forms reliability • Calculated by correlating measurements from a sample of individuals who complete 2 different forms of same test • Internal consistency • Assesses how consistently items of a test measure a single construct
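
One widely used index of internal consistency is Cronbach's alpha; the sketch below computes it from a small simulated item-response matrix (rows are people, columns are items assumed to tap a single construct).

```python
# Hedged sketch: Cronbach's alpha for a simulated 5-item test.
import numpy as np

rng = np.random.default_rng(9)
true_score = rng.normal(0, 1, (80, 1))
items = true_score + rng.normal(0, 0.8, (80, 5))   # 5 items tapping one construct

k = items.shape[1]
item_variances = items.var(axis=0, ddof=1)
total_variance = items.sum(axis=1).var(ddof=1)

# alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
alpha = (k / (k - 1)) * (1 - item_variances.sum() / total_variance)
print(f"Cronbach's alpha = {alpha:.2f}")
```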

  38. Reliability (cont'd) • Inter-rater reliability • Can calculate various statistical indices to show level of agreement among raters • Generalizability theory • Simultaneously considers all types of error in reliability estimates
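
One of the statistical indices that can be calculated for inter-rater reliability is Cohen's kappa, which corrects percent agreement for agreement expected by chance; the sketch below computes it for two hypothetical raters making pass/fail judgments (the ratings are invented).

```python
# Hedged sketch: percent agreement and Cohen's kappa for two hypothetical raters.
from collections import Counter

rater1 = ["pass", "pass", "fail", "pass", "fail", "pass", "fail", "pass", "pass", "fail"]
rater2 = ["pass", "fail", "fail", "pass", "fail", "pass", "pass", "pass", "pass", "fail"]

n = len(rater1)
observed = sum(a == b for a, b in zip(rater1, rater2)) / n

# Chance agreement: product of each rater's category proportions, summed
c1, c2 = Counter(rater1), Counter(rater2)
expected = sum((c1[cat] / n) * (c2[cat] / n) for cat in set(rater1) | set(rater2))

kappa = (observed - expected) / (1 - expected)
print(f"Percent agreement = {observed:.2f}, Cohen's kappa = {kappa:.2f}")
```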

  39. Validity • Whether measurements taken accurately & completely represent what is to be measured • Predictor • Test chosen or developed to assess identified abilities • Criterion • Outcome variable describing important aspects or demands of the job

  40. Figure 2.9: Validation Process from Conceptual and Operational Levels

  41. Criterion-Related Validity • Correlate a test score with a performance measure (validity coefficient) • Predictive validity design • Time lag between collection of test data & criterion data • Test often administered to job applicants

  42. Criterion-Related Validity (cont'd) • Concurrent validity design • No time lag between collection of test data & criterion data • Test administered to current employees, performance measures collected at same time • Disadvantage: No data about those not employed by the organization
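
The validity coefficient itself is just a correlation between predictor and criterion; the sketch below illustrates it with simulated data resembling a concurrent design, where test scores and performance ratings come from current employees.

```python
# Hedged sketch: criterion-related validity coefficient from simulated data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(13)
test_scores = rng.normal(50, 10, 75)                     # predictor
performance = 0.4 * test_scores + rng.normal(0, 9, 75)   # criterion

validity, p = stats.pearsonr(test_scores, performance)
print(f"Validity coefficient = {validity:.2f} (p = {p:.4f})")
```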

  43. Content-Related Validity • Demonstrates that the content of the selection procedure represents an adequate sample of important work behaviors & activities, or worker KSAOs, as defined by job analysis

  44. Construct-Related Validity • Investigators gather evidence to support decisions or inferences about psychological constructs • Construct: a concept or characteristic that a predictor is intended to measure; examples include intelligence and extraversion

  45. Figure 2.10: A Model for Construct Validity

  46. Figure 2.11: Construct Validity Model of Strength and Endurance Physical Factors
