Presentation Transcript

  1. Can we be scientific in the practice of occupational health psychology? *An homage to Don Campbell. Ted Scharf, Ph.D., Research Psychologist, National Institute for Occupational Safety and Health, Cincinnati, Ohio. * Unceremoniously stolen from: Campbell, D.T. (1984). Can we be scientific in applied social science? In: Conner, R.F., Altman, D.G., and Jackson, C. (Eds.). Evaluation studies: Review Annual, v.9. Beverly Hills: Sage Publications, pp. 26-48.

  2. disclaimer – The findings and conclusions in this presentation have not been formally disseminated by the National Institute for Occupational Safety and Health and should not be construed to represent any agency determination or policy. Any findings and conclusions in this presentation are those of the author.

  3. please ask questions as we move along . . .

  4. Quasi-experimental methodology: • Campbell, D.T., and Stanley, J.C. (1966). Experimental and quasi-experimental designs for research. Boston: Houghton Mifflin Co. • Cook, T.D., and Campbell, D.T. (1979). Quasi-experimentation: Design and analysis issues for field settings. Chicago: Rand McNally. • Shadish, W.R., Cook, T.D., and Campbell, D.T. (2002). Experimental and quasi-experimental designs for generalized causal inference. Boston: Houghton Mifflin Co. Categories of validity: • statistical conclusion validity • internal validity • construct validity • external validity

  5. Campbell, D.T. (1984). Can we be scientific in applied social science? In: Conner, R.F., Altman, D.G., and Jackson, C. (Eds.). Evaluation studies: Review Annual, v.9. Beverly Hills: Sage Publications, pp. 26-48. Key concepts: • contagious cross-validation • competitive replication; i.e., replication is the scientific response to methodological shortcomings or other problems with validity.

  6. Example: • Experimentally trained researchers tend to focus on the requirements of internal validity (e.g., requiring a “true” experiment) to the exclusion of concerns related to external validity. • Inappropriate use of a Randomized Controlled Trial (RCT): a CDC study regarding the prevention of transmission of HIV from birth mother to baby, in Côte d’Ivoire and Thailand, using: • a reduced dosage of AZT, compared to a . . . • placebo control group, rather than to the U.S. standard of care. • New England Journal of Medicine, v.337, no.12, September 18, 1997, e.g.: • Angell, M. The ethics of clinical research in the third world, pp. 847-849. • Lurie, P., and Wolfe, S.M. Unethical trials of interventions to reduce perinatal transmission of the human immunodeficiency virus in developing countries, pp. 853-856.

  7. In Cook and Campbell notation, the CDC research design: • O1 O2 X O3 O4 • - - - - - - - - - - - - - - - - - - - - • O1 O2 Y O3 O4 • CDC design: X = experimental, reduced AZT protocol • Y = placebo • participants: HIV-positive, pregnant women
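The pretest-posttest notation above can be illustrated numerically. Below is a minimal, hypothetical simulation (sample sizes, effect sizes, and noise levels are invented for illustration) that collapses O1/O2 and O3/O4 into pre/post means and estimates the treatment effect as a difference-in-differences between the X and Y arms:

```python
import random
import statistics

random.seed(42)

def simulate_arm(n, baseline, effect):
    """Simulate pre/post observations (O1..O2 and O3..O4 collapsed to pre/post)."""
    pre = [random.gauss(baseline, 1.0) for _ in range(n)]
    post = [p + effect + random.gauss(0, 1.0) for p in pre]
    return pre, post

# Hypothetical arms: X = reduced AZT protocol, Y = placebo
x_pre, x_post = simulate_arm(100, baseline=0.0, effect=-1.5)  # treatment lowers outcome
y_pre, y_post = simulate_arm(100, baseline=0.0, effect=0.0)   # no change under placebo

# Difference-in-differences: change in the X arm minus change in the Y arm
did = (statistics.mean(x_post) - statistics.mean(x_pre)) \
    - (statistics.mean(y_post) - statistics.mean(y_pre))
print(f"difference-in-differences estimate: {did:.2f}")  # roughly -1.5
```

The pre/post design matters here: subtracting each arm's own pretest change removes stable between-group differences that random noise (or, in a quasi-experiment, non-random assignment) would otherwise leave in the estimate.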

  8. A “comparison” group instead of a “control” group: • O1 O2 X O3 O4 • - - - - - - - - - - - - - - - - - - - - • O1 O2 Y O3 O4 • - - - - - - - - - - - - - - - - - - - - • O1 O2 Z O3 O4 • Comparison groups design: • X = experimental, reduced AZT protocol • Y = U.S. standard AZT treatment • Z = AZT protocol, midway between X & Y

  9. Remember: • The “Gold Standard” Randomized Controlled Trial (RCT): • random selection of subjects / participants • random assignment to experimental conditions • a “no treatment” or “placebo” control group • The origins of the RCT are in experimental and clinical medicine, where physicians evaluate the efficacy of a particular drug or treatment • Often described interchangeably as “evidence-based”
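The randomization requirements just listed can be sketched in a few lines. This is a toy illustration (participant IDs and the two-arm split are hypothetical; real trials use vetted randomization procedures):

```python
import random

random.seed(0)

participants = [f"p{i:02d}" for i in range(12)]  # hypothetical participant IDs

# Random assignment to experimental conditions (the core of an RCT):
shuffled = participants[:]       # copy so the original roster is untouched
random.shuffle(shuffled)
half = len(shuffled) // 2
assignment = {p: ("treatment" if i < half else "placebo")
              for i, p in enumerate(shuffled)}

counts = {c: sum(1 for v in assignment.values() if v == c)
          for c in ("treatment", "placebo")}
print(counts)  # balanced arms: 6 and 6
```

Shuffling then splitting guarantees equal-sized arms; assigning each participant by an independent coin flip would not.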

  10. Quasi-experimental methods: • typically used with pre-existing, intact groups • measure and evaluate contributing or confounding factors • between-groups and within-subjects analyses • compare between different treatments • origins of program evaluation methodology are in primary and secondary education

  11. Reminder: • When a new treatment is under test, AND . . . • there is no conclusive evidence that the new treatment is more effective than the current standard, THEN . . . • we test the new treatment on a sample of eligible subjects, AND • deliver the standard (comparison) treatment to another, different sample. • AND • If there is no known effective treatment, a placebo control group may be considered as a comparison group
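The IF/THEN logic of this reminder can be written as a tiny decision helper. This is a hypothetical sketch of the slide's rule, not a clinical guideline; the function and argument names are invented:

```python
def choose_comparison(new_beats_standard_proven: bool,
                      effective_treatment_exists: bool) -> str:
    """Sketch of the slide's comparison-group logic (hypothetical helper)."""
    if new_beats_standard_proven:
        # No equipoise: testing against a weaker comparator is unethical
        return "no trial needed: deliver the proven treatment"
    if effective_treatment_exists:
        # The current standard of care is the ethical comparison group
        return "compare new treatment against the current standard"
    # Only when no effective treatment is known:
    return "a placebo control group may be considered"

print(choose_comparison(False, True))
print(choose_comparison(False, False))
```

On this logic, the CDC trial criticized in the earlier example falls under the second branch: an effective treatment (the U.S. standard of care) existed, so the standard, not a placebo, was the appropriate comparison.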

  12. Victora, C.G., Habicht, J-P., and Bryce, J. (2004). Evidence-based public health: Moving beyond randomized trials. American Journal of Public Health, v.94, no.3, pp. 400-405. • clinical efficacy trials • public health regimen efficacy • public health delivery efficacy • public health program efficacy • public health program effectiveness

  13. Victora (2004) • plausibility evaluation to document impact and rule out alternative explanations, e.g. with a comparison group • complex intervention, RCT is artificial • large-scale demonstration required • ethical concerns preclude use of RCT • adequacy evaluation to document time trends • assessment of intermediate steps • evaluates each step in the presumed causal pathway

  14. Mohr, L.B. (1995). Impact analysis for program evaluation. 2nd ed. Thousand Oaks, CA: Sage. • “outcome line” (especially ch.2, Fig 2.1, p.16) • preliminary, intermediate, and long-term outcomes are modeled • other measured factors may influence the outcomes • [Figure, adapted from Mohr (1995, p.16): measured activities (#1-#4) lead through measured subobjectives (#1-#2) to the measured outcome of interest and the measured ultimate outcome.]

  15. Mohr (1995), when there is a: • series of related outcomes, • set of interim objectives or sub-objectives, or • formative evaluation required, • then: • attempt to measure all relevant influences in a study

  16. Disagreements between experimentally trained researchers and researchers trained in quasi-experimental social science methodology are just one example of the ways in which our work can be considered “unscientific.” Within NIOSH: Rosenstock, L., and Thacker, S.B. (May, 2000). Toward a safe workplace: The role of systematic reviews. American Journal of Preventive Medicine, Supplement, v.18, no.4S, Rivara, F.P., and Thompson, D.C. (Eds.), pp. 4-5. And the reply: NORA Intervention Effectiveness Research Team (May, 2001). May 2000 Supplement on preventing occupational injuries. Letter to the Editor. American Journal of Preventive Medicine, v.20, no.4, pp. 308-309. Theoretical perspective (a.k.a. “world view”) can exert a great influence on the conduct of the research.

  17. Altman, I., Rogoff, B., (1987). World views in psychology: Trait, interactional, organismic, and transactional perspectives. In: D. Stokols and I. Altman, (Eds.). Handbook of environmental psychology. v.1. New York: John Wiley and Sons.

  18. Example: The “classic” hierarchy of control. • I. Engineering Controls • A. eliminate the hazard • B. substitution of material, equipment, or process • C. isolation of hazard, e.g., barriers and/or removing the worker(s) • D. ventilation of airborne contaminants • II. Administrative Controls to reduce exposure • A. reduced work hours • B. employee education and training • 1. improved hazard recognition • 2. improved work practices • III. Personal Protective Equipment (PPE) • (Adapted from Raterman, 1996, and Office of Technology Assessment, 1985.)

  19. what else do we know about hazardous work environments ??

  20. Common features in hazardous work environments – constant change: • variability in: time • space / location • motion • Characteristics or properties of workplace hazards: • force(s) creating or causing the hazard • types of efforts to control the hazard • traditional hierarchy of control • degree of worker control • likelihood of failure of controls • predictability and salience • work process hazard • severity of risk, following exposure • interactions with other hazards

  21. Worker-centered approach to hazardous work environments: • Contrary to the traditional Hierarchy of Control: • 1) except where a hazard has been completely eliminated from the environment, worker control and participation in managing the hazard are essential; and • 2) when the work process is extremely time-limited, or is an actual emergency, workers are most likely to neglect their own safety to complete the emergent task.

  22. thus, especially in hazardous work environments, there appears to be an incompatible and conflicting set of demands that impinge on front-line workers (in particular): dual-attention demand: safety vs. productivity

  23. HOWEVER, from the point-of-view of OHP, our perspective on this problem must be: dual-attention demand: safety AND productivity How can we approach this problem? How can we train workers to adopt this perspective and attitude?

  24. brief digression: Aren’t we compromising safety when we permit considerations of productivity to enter into discussions of safety?

  25. brief digression: Aren’t we compromising safety when we permit considerations of productivity to enter into discussions of safety? Traditional workplace safety and health viewpoint: - economics and productivity never mentioned with respect to safety - to include economics is to balance a worker’s life in the same equation with the costs of production - fundamental principle: safety may not be compromised for any reason

  26. brief digression: Aren’t we compromising safety when we permit considerations of productivity to enter into discussions of safety? The real world: - safety is compromised every day on the job, especially in hazardous work environments - employees will take risks with their own lives to maintain production, (including in situations where they will not directly benefit) - especially when fatigued, attention to the production task becomes rote, and attention to changing hazards in the surrounding environment ceases

  27. brief digression: Aren’t we compromising safety when we permit considerations of productivity to enter into discussions of safety? NIOSH and others have come to realize that if we are truly interested in worker safety, we must develop realistic safety training that incorporates day-to-day productivity pressures into the training. By addressing safety in its real-world context, we: - enhance safety as a practical, usable, workplace skill - strive to incorporate safety into the production process, such that, “the safest way is also the easiest and most productive way.” (Susan Baker, Johns Hopkins)

  28. “what gets measured, gets managed” • Professor Peter Chen • Colorado State University • University of South Australia • Orlando, FL, May 18, 2011

  29. Stokols, D. (1987). Conceptual strategies of environmental psychology. In: D. Stokols and I. Altman, (Eds.). Handbook of environmental psychology. v.1. New York: John Wiley and Sons. • and • Stokols, D. (1992). Establishing and maintaining healthy environments: Toward a social ecology of health promotion. American Psychologist. v.47, no.1, pp.6-22. • and • Stokols, D. (2006). Toward a science of transdisciplinary action research. American Journal of Community Psychology, v.38, pp.63-77.

  30. Establishing a contextual perspective; the core assumptions (Stokols, 1987, pp.42-43): 1. psychological phenomena should be viewed in the spatial, temporal, and sociocultural milieu in which they occur; 2. a focus on individuals’ responses to discrete stimuli and events in the short run should be supplemented by more molar and longitudinal analyses of people’s everyday activities and settings; 3. the search for lawful and generalizable relationships between environment and behavior should be balanced by a sensitivity to, and an analysis of, the situation specificity of psychological phenomena; 4. the criteria of ecological and external validity should be explicitly considered (along with the internal validity of the research) not only when: - designing behavioral studies, but also when: - judging the applicability of research findings to the development of public policies and community interventions.

  31. This is the search for and identification of the target phenomenon and the relevant contextual variables. The contextual variables may be identified through: - an exploratory and atheoretical process, or - a fully developed contextual theory

  32. Example: Florida Department of Health responses to the 2004 hurricane season

  33. Structural equation (two) models: 1. Both work organization and hurricane exposure measures. Purpose: to establish that the work organization measures contribute to a model in which the hurricane exposure measures are included as predictors. 2. Work organization measures alone. Purpose: to identify an upper-bound estimate for the effects of the work organization measures (i.e. without competing with hurricane exposure measures).

  34. [Path diagram, Model 1: work organization topics plus hurricane exposure measures as predictors. Predictors: prior hurricane training (before 2004), prior hurricane experience (before 2004), USUHS hurricane exposure scale (2004), number of hurricanes worked (2004-2005), hours worked (2004), work organization topics. Outcomes: difficulty balancing work & family (2004), distress during hurricanes (2004), emotional experiences of hurricanes (2004), ill health (6/2005), amount of sleep (6/2005), return to normal (2004), bad mental health days (6/2005), job dissatisfaction (2004-2005), presenteeism (6/2005).]

  35. [Figure legend: path significance levels p<0.001, p<0.01, p<0.05; line styles distinguish paths in the hypothesized direction (sign of the coefficient) from paths opposite the hypothesized direction (opposite sign of the coefficient); shapes distinguish latent variables (constructs) from measured variables.]

  36. Structural equation (two) models: 1. Both work organization and hurricane exposure measures. Purpose: to establish that the work organization measures contribute to a model in which the hurricane exposure measures are included as predictors. 2. Work organization measures alone. Purpose: to identify an upper-bound estimate for the effects of the work organization measures (i.e. without competing with hurricane exposure measures).

  37. [Path diagram, Model 2: work organization measures alone. Predictors: role conflict/compatibility (2004), workload (2004), social support (2004), safety conflict (2004), control (2004), communication & work organization problems (2004). Outcomes: difficulty balancing work & family (2004), ill health (6/2005), return to normal (2004-2005), distress during hurricanes (2004), bad mental health days (6/2005), job dissatisfaction (2004-2005), presenteeism (6/2005).]

  38. The social ecology of health promotion core assumptions (Stokols, 1992, pp.7-8): 1. efforts to promote human well-being should be based on an understanding of the dynamic interplay among diverse environmental and personal factors; 2. analyses of health and health promotion should address the multidimensional and complex nature of human environments, including: - physical and social components - objective and subjective qualities - scale or immediacy (proximal vs. distal) to individuals and groups - independent environmental attributes or composite relationships among several environmental features;

  39. The social ecology of health promotion core assumptions (Stokols, 1992, pp.7-8), continued: 3. environmental scale and complexity: - individuals - small groups - organizations - populations i.e. multiple levels of analysis using diverse methodologies; 4. dynamic interrelations (or transactions) between people and environments: - physical and social features of settings influence participants’ health - participants modify their surroundings - interdependencies between immediate & distant environments, e.g. local, state, and national-level regulations for safety & health

  40. Scope of transdisciplinary research (Stokols, 2006, p.66). [Figure omitted.]

  41. What are the disciplinary boundaries of OHP ? Put another way, what are the most important disciplines with which OHP must interact ?

  42. Some candidate disciplines that are essential to OHP: health, industrial/organizational, community, and environmental psychology; plus epidemiology, public health, occupational medicine, industrial hygiene, and safety engineering; and anthropology, sociology, and economics.

  43. Stokols, D. (2006). Toward a science of transdisciplinary action research. American Journal of Community Psychology, v.38, pp.63-77. • and • Rosenfield, P.L., (1992). The potential of transdisciplinary research for sustaining and extending linkages between the health and social sciences. Social Science and Medicine. v.35, no.11, pp.1343-1357.

  44. something to ask your students: “which comes first . . . the question or the answer ?”

  45. and: “where do research hypotheses come from ?”

  46. brief review: quantitative methods – - test existing hypotheses (e.g., confirm or rule out) - assess concepts we have measured (quantitatively) - reduce observed results to manageable findings - enable systematic, replicable, and verifiable measurement, i.e., fundamental science. quantitative methods do not – - generate novel explanations about things or events, e.g., propose new causal pathways - suggest explanations not previously measured

  47. qualitative methods – - describe and/or explain phenomena or events - interpret and/or “model” processes or events - may replicate and verify . . . or suggest unknown processes or relationships, while at the same time providing empirical data to generate hypotheses or verify a quantifiably testable hypothesis. qualitative methods provide data specific to the sample and target population from which they were derived. a series of qualitative interviews or focus groups produces an iterative and progressive investigation of the selected topic