
Presentation Transcript


  1. Conciliating scientific rigor & pragmatics in Outcome Evaluation Research for human service professionals: Theoretical underpinnings and implementation techniques. Network for Health & Welfare Studies. Dr. Charles C. Chan, Dr. Amy P. Y. Ho, Mr. Kevin Chan. February 25, 2004

  2. Dr. Charles C. Chan, Convenor, Network for Health & Welfare Studies; Associate Professor, Department of Applied Social Sciences

  3. Applied research can be of better service to the human service industry, especially in a quality-conscious era, if it promises to pay balanced attention to the dialectical relationship between the professional/researcher and the service recipient. Preamble

  4. Strange bedfellows or estranged couples? Quality and Effectiveness:

  5. The overall goal of placing priority on both quality and effectiveness of health programs is to document the synergistic contribution of their component parts to a holistic improvement in the health and welfare of the population. WHO-EURO Working Group on Health Promotion Evaluation (1998). Quality and Effectiveness:

  6. The focus should not be on documenting quality or evaluating effectiveness of isolated interventions as end points, but rather on the relationship of a given intervention to the other components of the health promotion strategy. Such an analysis may indeed provide a fresh approach to the very issue of quality. Quality and Effectiveness:

  7. BARD is a tested approach in applied research to address the demands of scholarship of application and community benefits. It recognizes problems central to applied research. It contributes by directing decision making in the research process. Definition of the Balanced Applied Research Development Model (BARD):

  8. Placing value on a balanced approach to the scientific requirements and the pragmatics of outcome evaluation in human services. Main features of BARD:

  9. Emphasizing the integration of research into practice by transferring data-capturing skills to professionals or lay persons in service delivery. Main features of BARD:

  10. Analyzing, as much as possible, the triadic relationship between the professional, the lay person, and the service recipient, and sustaining intervention effects through supportive supervision. Main features of BARD:

  11. Efficacy: intervention does more good than harm under optimum conditions. Effectiveness: intervention does more good than harm under real-world conditions (Flay, 1986). Efficacy vs. Effectiveness

  12. Reach; Efficacy or Effectiveness; Adoption; Implementation; Maintenance and cost. The RE-AIM framework (Glasgow, 1999)

  13. The RE-AIM framework (Glasgow, 1999) - Cont’d

  14. Health Services Research Fund (1995 - 2003); Health and Health Services Research Fund (2003 - present); Research Fund on Control for Infectious Diseases (2003 – present); Community Investment & Inclusion Fund (2003 - present). Outcome Evaluation Research in Hong Kong – Financial Landscape

  15. Capital amount of $55M. Total number of submissions: 1,096. Total number of approved HSRC projects: 224. Health Services Research Fund (1995-2003)

  16. Capital amount of $10M. Total number of submissions: 187. Total number of approved projects: 4 (ESGAA meeting in July 2003). Health and Health Services Research Fund (2003 - present)

  17. Capital amount of $500M ($50M given to the PRC’s Ministry of Health for SARS research). Open call and commissioned research activities: a tentative $30M to HKU for basic and epidemiological modeling research, and $25M to CUHK for public health and emerging infectious diseases drug and treatment development research, leaving sums for the establishment of the Centre for Health Protection. Research Fund on Control for Infectious Diseases (2003 – present)

  18. Community Investment & Inclusion Fund - Funding information

  19. Community Investment & Inclusion Fund - Funding information -Cont’d

  20. Dr. Amy P. Y. Ho, Member, Network for Health & Welfare Studies; Senior Lecturer, Department of Applied Social Sciences

  21. “Promoting health and well-being of elderly patients with chronic illness: A coordinated medical and social service program”, funded by the Health Care & Promotion Fund. OER in human service setting – An example

  22. Intervention flowchart

  23. Intervention components

  24. Feasibility of blinding in human service research. Artifacts introduced by confounders in an unblinded study. Issues in unblinded studies

  25. Contrasting the notion of the “experimental group” in the context of medical research and human services research. The “Black Box” phenomenon in health promotion research (Moore, 2003). Standardization of intervention

  26. Between-group imbalance in the control of extraneous variables. Incorporating extraneous variables in the analysis. Control of extraneous variables
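As an illustration of incorporating extraneous variables in the analysis, the following minimal sketch fits an ANCOVA-style regression that adjusts the group effect for measured covariates. It is not the project's actual analysis; the file and variable names (oer_followup.csv, sf36_followup, sf36_baseline, group, age, gender) are hypothetical, and Python with statsmodels is assumed.

# Minimal sketch: adjust the group effect for measured extraneous variables
# (ANCOVA-style regression). File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("oer_followup.csv")

# Outcome regressed on group assignment plus extraneous variables
# measured at baseline (age, gender, baseline SF-36 score).
model = smf.ols(
    "sf36_followup ~ group + age + C(gender) + sf36_baseline",
    data=df,
).fit()

# The coefficient on 'group' is the between-group difference adjusted
# for the listed covariates.
print(model.summary())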

  27. Staff turnover. Artifact due to unblinded group assignment. Reliability of self-reported outcomes. Data quality and reliability
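For the reliability of self-reported outcomes, one routine screening check is an internal-consistency index such as Cronbach's alpha. The sketch below is illustrative only and is not part of the original presentation; the scale and the respondent scores are invented.

# Minimal sketch: Cronbach's alpha for a multi-item self-report scale.
import numpy as np

def cronbach_alpha(items):
    """items: respondents x items matrix of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Invented example: 5 respondents answering a 4-item well-being scale.
scores = [[3, 4, 3, 4],
          [2, 2, 3, 2],
          [4, 5, 4, 5],
          [1, 2, 1, 2],
          [3, 3, 4, 3]]
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")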

  28. Loss to follow-up. Causes of participant attrition in human services OER. Effect of attrition on data analysis. Issues in data analysis
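A common way to gauge the effect of attrition on the analysis is to compare baseline characteristics of completers with those lost to follow-up. The sketch below is a hypothetical illustration; the file and column names (oer_baseline.csv, followup_score, sf36_baseline, age) are assumptions.

# Minimal sketch: is loss to follow-up related to baseline characteristics?
import pandas as pd
from scipy import stats

df = pd.read_csv("oer_baseline.csv")
df["dropped"] = df["followup_score"].isna()   # flag participants lost to follow-up

completers = df[~df["dropped"]]
dropouts = df[df["dropped"]]

# Compare baseline characteristics between completers and dropouts.
# Marked differences suggest that complete-case results may be biased.
for var in ["sf36_baseline", "age"]:
    t, p = stats.ttest_ind(completers[var], dropouts[var], nan_policy="omit")
    print(f"{var}: t = {t:.2f}, p = {p:.3f}")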

  29. Mr. Kevin Chan Research Associate Network for Health & Welfare Studies Department of Applied Social Sciences Honorary Scientific Officer The Hong Kong Childhood Injury Prevention & Research Association

  30. OER in human services more often than not violates the assumptions required by the traditional RCT convention. Such differences should not be viewed as deviations from the conventional approach, but rather as a call for applied research with stronger ecological validity. Presumption of the experimentalist ideal (Smith, 1985)

  31. The difficulties associated with random assignment; the ethical and administrative objections to randomization (de Raeve, 1994); that inputs are rarely stable; that awareness of difference introduces bias; that it is almost impossible to exclude extraneous variables; that the research is unable to say why the changes detected have occurred (Newell, 1992). In fulfilling the requirements set for RCTs, we often undermine the following (Rolls, 1999):

  32. Dealing with less-than-perfect random assignment; adjusting for unstable input in human services OER; statistical control for unblinded studies; the inclusion of extraneous variables as covariates; narrowing down to a closer approximation of the causal relationship. Troubleshooting barriers – OER in human service setting

  33. Balancing incentives for follow-up of the control group; intention-to-treat analysis; as-treated analysis; Complier-Average Causal Effect (CACE) analysis. Troubleshooting barriers – Group assignment in OER for human service setting
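To make the contrast between the intention-to-treat and as-treated approaches concrete, the sketch below computes both estimates from the same hypothetical dataset; the file and column names (oer_trial.csv, assigned, received, outcome) are assumptions, not part of the original study.

# Minimal sketch: intention-to-treat vs as-treated estimates of a mean difference.
import pandas as pd

df = pd.read_csv("oer_trial.csv")
# Assumed columns: assigned (0/1 randomized arm), received (0/1 intervention
# actually received), outcome (follow-up score).

# ITT: compare groups as randomized, regardless of what was received.
itt = (df.loc[df.assigned == 1, "outcome"].mean()
       - df.loc[df.assigned == 0, "outcome"].mean())

# As-treated: compare groups by the intervention actually received.
as_treated = (df.loc[df.received == 1, "outcome"].mean()
              - df.loc[df.received == 0, "outcome"].mean())

print(f"ITT estimate:        {itt:.2f}")
print(f"As-treated estimate: {as_treated:.2f}")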

  34. Balancing incentives for follow-up of the control group: despite not receiving the intervention, control group participants can be offered a “non-specific” intervention or substantial support to maintain their interest in complying with the data collection process. Troubleshooting barriers – Group assignment in OER for human service setting – Cont’d

  35. Balancing incentives for follow-up of the control group. “Non-specific” intervention, e.g. delivery of printed health promotion materials; substantial support, e.g. safety devices, health supplements. Troubleshooting barriers – Group assignment in OER for human service setting – Cont’d

  36. Troubleshooting barriers – Group assignment in OER for human service setting – Cont’d

  37. Complier-Average Causal Effect (CACE) analysis (Angrist, 1996; Little, 1998): expanding the conventional regression equation with a new term – compliance with the intervention. Troubleshooting barriers – Group assignment in OER for human service setting – Cont’d

  38. Complier-Average Causal Effect (CACE) analysis (Angrist, 1996; Little, 1998): the difference in the evaluation outcome (e.g. SF-36 score, probability of sustaining a home injury) between compliers and non-compliers, with other missing information (e.g. covariates such as age, gender, health status, attitude toward the research question) adjusted for, stratified by compliance category. Troubleshooting barriers – Group assignment in OER for human service setting – Cont’d

  39. Complier-Average Causal Effect (CACE) analysis (Angrist, 1996; Little, 1998): CACE accounts for the potential interaction between randomized group assignment and compliance, and allows a more accurate estimate of program effectiveness by increasing the number of “usable” cases and by “subtracting” the artifact of compliance that moderates the outcome variable. Troubleshooting barriers – Group assignment in OER for human service setting – Cont’d
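Following the instrumental-variable reading of Angrist (1996), the CACE can be estimated by scaling the intention-to-treat effect by the difference in compliance rates between arms (a Wald-type estimator). The sketch below is a minimal illustration with hypothetical file and column names (oer_trial.csv, assigned, complied, outcome); covariates such as age, gender, or baseline health status would be added through a two-stage least squares or maximum-likelihood formulation.

# Minimal sketch: CACE via the Wald / instrumental-variable estimator,
# with randomized assignment as the instrument for actual compliance.
import pandas as pd

df = pd.read_csv("oer_trial.csv")
# Assumed columns: assigned (0/1 randomized arm), complied (0/1 complied with
# the intervention protocol), outcome (e.g. SF-36 follow-up score).

# Intention-to-treat effect of assignment on the outcome.
itt_effect = (df.loc[df.assigned == 1, "outcome"].mean()
              - df.loc[df.assigned == 0, "outcome"].mean())

# Effect of assignment on compliance (share of compliers induced by assignment).
compliance_gap = (df.loc[df.assigned == 1, "complied"].mean()
                  - df.loc[df.assigned == 0, "complied"].mean())

# CACE: the ITT effect rescaled to the subpopulation of compliers.
cace = itt_effect / compliance_gap
print(f"CACE estimate: {cace:.2f}")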

  40. Explicit definition of the intervention protocol: behavioral intervention; substantial input; personnel input; temporal data (date, time, duration, frequency); defining the boundary of the intervention. Troubleshooting barriers – Adjusting for unstable input in human services OER
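One way to make the protocol definition explicit in day-to-day work is to capture each contact as a structured session record covering the behavioral component, the personnel and substantial input, and the temporal data. The sketch below is a hypothetical illustration of such a record, not the project's actual instrument; all field names are assumptions.

# Minimal sketch: a session-log record that fixes the boundary of the
# intervention and captures personnel, substantial, and temporal input.
from dataclasses import dataclass
from datetime import datetime
from typing import List

@dataclass
class SessionLog:
    participant_id: str
    component: str              # behavioral component, e.g. "home visit"
    delivered_by: str           # personnel input: nurse, social worker, volunteer
    started_at: datetime        # temporal data: date and time
    duration_min: int           # temporal data: duration
    materials_given: List[str]  # substantial input, e.g. safety devices

log = SessionLog(
    participant_id="P-001",
    component="home visit",
    delivered_by="nurse",
    started_at=datetime(2004, 2, 25, 10, 30),
    duration_min=45,
    materials_given=["night light"],
)
print(log)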

  41. Troubleshooting barriers – Adjusting for unstable input in human services OER

  42. Troubleshooting barriers – Adjusting for unstable input in human services OER

  43. Troubleshooting barriers – Adjusting for unstable input in human services OER

  44. Troubleshooting barriers – Adjusting for unstable input in human services OER

  45. Process analysis: process evaluation fills in the “Black Box” (Moore, 2003) left void in health promotion research and strengthens the internal validity of the intervention under investigation. Troubleshooting barriers – Adjusting for unstable input in human services OER – Cont’d

  46. Process analysis: process analysis identifies variance in protocol implementation and participants’ receptiveness to the prescribed intervention. A process analysis covers the count of activities attended, perceived effectiveness of the intervention, compliance with the intervention protocol, and met and unmet needs related to the intervention. Troubleshooting barriers – Adjusting for unstable input in human services OER – Cont’d
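Where session logs of this kind exist, the process indicators listed above can be summarized per participant and later merged into the outcome dataset. The sketch below is illustrative; the file and column names (session_logs.csv, participant_id, component, duration_min, rating, protocol_met) are assumptions.

# Minimal sketch: per-participant process indicators derived from session logs.
import pandas as pd

logs = pd.read_csv("session_logs.csv")
# Assumed columns: participant_id, component, duration_min,
# rating (perceived effectiveness, 1-5), protocol_met (0/1).

process = logs.groupby("participant_id").agg(
    activities_attended=("component", "count"),
    total_exposure_min=("duration_min", "sum"),
    mean_perceived_effectiveness=("rating", "mean"),
    protocol_compliance=("protocol_met", "mean"),
)
print(process.head())

# These indicators can then be merged with the outcome data on participant_id,
# rather than being reported only as an auxiliary document.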

  47. Process analysis (Cont’d): the information gathered from the process research should be integrated into the outcome research, rather than being set aside as an auxiliary document in the discussion of study results. Troubleshooting barriers – Adjusting for unstable input in human services OER – Cont’d

  48. Quality assurance exercise: organizational requirement; staff involvement. Troubleshooting barriers – Quality Assurance in OER for human service setting

  49. Unanticipated effects of health promotion on social capital and cohesion (Raphael, 2000). Economic analysis (including cost-effectiveness analysis or cost-benefit analysis). Maintenance and cost in OER
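For the economic analysis, the usual summary of a cost-effectiveness comparison is the incremental cost-effectiveness ratio (ICER): the extra cost of the program divided by the extra effect it produces relative to usual care. The figures in the sketch below are invented for illustration only.

# Minimal sketch: incremental cost-effectiveness ratio (ICER), invented figures.
cost_program = 520_000.0     # total cost of the coordinated program (HK$)
cost_usual_care = 300_000.0  # cost of usual care for the comparison group (HK$)
effect_program = 180.0       # effect units gained, e.g. injury-free months
effect_usual_care = 130.0

icer = (cost_program - cost_usual_care) / (effect_program - effect_usual_care)
print(f"ICER: HK${icer:,.0f} per additional unit of effect")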

  50. Conclusion
