
Contribution of Survey to Health Systems Performance Monitoring: experience with the World Health Survey






Presentation Transcript


  1. Contribution of Survey to Health Systems Performance Monitoring: experience with the World Health Survey THE MALAYSIAN EXPERIENCE 28-29 September 2006 Montreux, Switzerland

  2. Introduction • World Health Survey 2002 • Nationwide community survey • Multistage stratified sampling, representative of the population • Stratified by state & urban/rural location • National & urban/rural estimates • Where possible, estimates across various socio-demographic variables • Institutionalised population excluded (<3%) (a sampling & weighting sketch follows this slide)
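A minimal sketch of how such a two-stage stratified design and its weights could be drawn, assuming hypothetical state × urban/rural strata and enumeration-block frames; all names, counts, and sampling fractions below are illustrative, not the actual WHS design:

```python
import random

random.seed(2002)

# Hypothetical frame: strata defined by state x urban/rural location,
# each containing enumeration blocks (EBs) of households.
# All names, counts, and sampling fractions below are illustrative.
frame = {
    ("Selangor", "urban"): {f"EB{i:03d}": 120 for i in range(1, 201)},
    ("Selangor", "rural"): {f"EB{i:03d}": 80 for i in range(1, 101)},
    ("Kelantan", "urban"): {f"EB{i:03d}": 110 for i in range(1, 81)},
    ("Kelantan", "rural"): {f"EB{i:03d}": 90 for i in range(1, 121)},
}

EBS_PER_STRATUM = 10   # first stage: EBs sampled per stratum
HH_PER_EB = 20         # second stage: households sampled per EB

sample = []
for stratum, ebs in frame.items():
    chosen_ebs = random.sample(sorted(ebs), EBS_PER_STRATUM)
    for eb in chosen_ebs:
        n_hh = ebs[eb]
        hh_ids = random.sample(range(n_hh), min(HH_PER_EB, n_hh))
        # Design weight = inverse of the overall selection probability:
        # P(EB chosen in its stratum) * P(household chosen | EB chosen)
        p_eb = EBS_PER_STRATUM / len(ebs)
        p_hh = min(HH_PER_EB, n_hh) / n_hh
        weight = 1.0 / (p_eb * p_hh)
        sample.extend((stratum, eb, hh, weight) for hh in hh_ids)

print(f"{len(sample)} households sampled across {len(frame)} strata")
```

Stratifying before the first stage guarantees every state × location cell is represented; the inverse-probability weights are what make national and urban/rural estimates representative.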

  3. Introduction • World Health Survey 2002 • Data collection early March – mid April 2003 • 200 personnel of various categories including temporary research assistants • MOH facilities & vehicles • Nationwide publicity

  4. Field Preparation • Organisational structure • Advisory/Steering committee • Central Research Team • Field Data Collection team • Data Entry Team

  5. ORGANISATIONAL STRUCTURE

  6. Implementation strategy (diagram): survey teams mobilized across districts & states, moving between more rural, less densely populated areas and less rural, high-density and very-high-density areas; the denser strata carried larger sample sizes

  7. Survey implementation schedule • Budget proposal (Oct 02) • Translating & pre-testing instruments (Oct-Nov 02) • Road shows (16-20 Dec 02) • Recruitment of research assistants (Jan-Feb 03) • Field preparation (sampling & procurement) (Jan-Feb 03) • Identification of enumeration blocks (EBs) and tagging exercise (Jan-Feb 03) • Training (17 Feb-15 Mar 03) • "Launching" (28 Feb 03) • Data collection (Mar-Apr 03) • Publicity in various media (Feb-Apr 03) • Data entry (Mar-Apr 03)

  8. Survey implementation schedule • Presentation of preliminary findings (July 03) • Programme heads and service providers • Shared contents of WHS 2002 • Identified additional questions relevant for programme needs • Further assistance with analysis from WHO (July 05) • Mini-conference (September 2005) • Invited resource person from WHO • Senior officers from programmes and various operational levels • Clinicians, public health specialists, public health engineers, nutritionists, human resource personnel

  9. Survey implementation schedule • Report writing (October 2005 – June 2006) • 5 volumes • 4 drafts • 3 volumes already with printers (August 2006) • Proposed presentation to senior management (Nov – Dec 2006)

  10. WHS 2002 Snapshot of Data Quality

  11. WHS 2002 • Sample size = 7,528 • Response rate = 80.2% • Analysis (as per WHO) done in 2005
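As a quick arithmetic check (assuming the 7,528 figure counts completed interviews, which the slide does not state explicitly), the implied number of eligible sampled units is roughly:

$$\text{eligible units} \approx \frac{\text{completed interviews}}{\text{response rate}} = \frac{7528}{0.802} \approx 9387$$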

  12. Sampling

  13. Response Rate

  14. HH Sample Deviation Index

  15. Individual level Sample Deviation Index
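The two slides above show only the chart titles. A sample deviation index is commonly computed as the ratio of each age-sex group's share of the sample to its share of the reference population, with values near 1 indicating the sample mirrors the population; a minimal sketch under that assumption (all counts and reference shares below are illustrative):

```python
# Hypothetical sketch of a sample deviation index: for each age-sex
# stratum, compare the sample's share with the reference population's
# share (e.g. census estimates). Values near 1 indicate the sample
# mirrors the population; all figures below are illustrative.
sample_counts = {
    ("18-29", "M"): 810, ("18-29", "F"): 905,
    ("30-44", "M"): 960, ("30-44", "F"): 1010,
    ("45-59", "M"): 720, ("45-59", "F"): 700,
    ("60+",   "M"): 310, ("60+",   "F"): 385,
}
population_share = {
    ("18-29", "M"): 0.14, ("18-29", "F"): 0.14,
    ("30-44", "M"): 0.17, ("30-44", "F"): 0.17,
    ("45-59", "M"): 0.12, ("45-59", "F"): 0.12,
    ("60+",   "M"): 0.07, ("60+",   "F"): 0.07,
}

total = sum(sample_counts.values())
for group, count in sample_counts.items():
    sdi = (count / total) / population_share[group]
    print(f"{group}: SDI = {sdi:.2f}")
```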

  16. Missing Data

  17. Reliability

  18. HH Level: Sociodemographic Profile (weighted) • Mean household size = 4.2 • Male : female ratio = 0.98 • Geographical location: majority in urban areas
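A minimal sketch of how the weighted estimates on this slide could be produced from household records carrying the design weights of the earlier sampling sketch (the records and weights below are illustrative, not WHS data):

```python
# Hypothetical household records: (size, males, females, design weight).
households = [
    (4, 2, 2, 310.0),
    (5, 3, 2, 295.5),
    (3, 1, 2, 410.0),
    (6, 3, 3, 188.2),
]

# Weighted estimates: each household contributes in proportion to the
# number of population households it represents.
w_total = sum(w for *_, w in households)
mean_size = sum(size * w for size, _, _, w in households) / w_total
males = sum(m * w for _, m, _, w in households)
females = sum(f * w for _, _, f, w in households)

print(f"Weighted mean household size: {mean_size:.1f}")
print(f"Male:female ratio: {males / females:.2f}")
```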

  19. HH Level: Sociodemographic Profile (weighted)

  20. Presentation & Utilisation of findings • To date, results have yet to be formally presented to top management • General impression of findings • Value added, as it provides a better perspective (new dimensions) on country HS performance • Better performance in some aspects but an "eye-opener" in others! • Application found downstream at various levels

  21. Utilisation of findings • National level • Key input into development of the National Health Financing Mechanism • Volume 5 (Responsiveness section) • Areas where respondents were not satisfied • State of hospitals • Utilisation patterns • Cost of care • Volume 4 (Health Expenditure) • Out-of-pocket (OOP) payments, perception of risk pooling • Volume 3 (Coverage) • Understanding of the current situation of service provision • Volume 2 (Risk factors) • What should go into the basic benefit package • Antenatal care (ANC), HIV transmission among mothers, condom use for prevention

  22. Utilisation of findings • Programme level • Responsiveness component used in development of "soft skills" training modules for health workers & evaluation of front-line customer services • Input into development of patient/people-centred services • Verify effectiveness of current programmes and related activities • Support (evidence-based) the justification of newly introduced activities • Recommend development of new strategies/activities for specific risk groups • Identification of new research to look into impact

  23. Potential application • Mid-term review of the performance of the 9th Malaysia Plan

  24. What we have learnt….. • Objectives have been achieved • Contribution to development of cross-country measures • Assessment of country HS performance • Transfer of technology (to some extent!)

  25. What we have learnt….. • Costly affair • Human & financial resource intensive • Need for “buy-in” from all sectors • To ensure successful survey implementation • To ensure usage of findings

  26. What we have learnt….. • Need for advance planning (at least 1 yr!!) • Negotiation with operational managers for manpower assistance & other logistics • Budget proposal and approval

  27. What we have learnt….. • The instrument itself • Translation into the local language poses a challenge! • Complicated & lengthy (2.5 hrs mean completion time) • Some aspects politically sensitive (the 8000 series was omitted) • Definitions of certain variables differ from our country's own • Uneasiness & defensiveness about country performance among various programme heads • E.g. HIV & human resources • In retrospect • Should allow for 2 sets of definitions

  28. What we have learnt….. • Analysis & interpretation • Stata software • Limited expertise • Complex analysis • CHOPIT (done by WHO team, Geneva; see the outline below) • Some analysis still pending • E.g. Y-star (Y*) results for adjustments for cross-country comparison (responsiveness section still pending) • Duration of whole activity (data collection to analysis to report writing) • Too long
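For context, a standard formulation of the hierarchical ordered probit family to which CHOPIT belongs (this is the generic anchoring-vignette setup, not necessarily the WHO team's exact specification): each respondent $i$ has a latent score $Y_i^\ast$ mapped to the reported category $k$ through respondent-specific cut-points, which vignette ratings help identify:

$$Y_i^\ast = X_i\beta + \varepsilon_i, \qquad \varepsilon_i \sim N(0, \sigma^2)$$

$$Y_i = k \;\Longleftrightarrow\; \tau_i^{k-1} < Y_i^\ast \le \tau_i^k, \qquad \tau_i^k = \tau_i^{k-1} + e^{V_i\gamma_k}$$

Because the cut-points $\tau_i^k$ shift with respondent characteristics $V_i$, the adjusted $Y^\ast$ is what makes scores comparable across respondents and countries; that is the "Y star" adjustment for cross-country comparison the slide refers to.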

  29. What we have learnt….. • Lack of expertise to translate research findings into action • e.g. interpreting findings, writing policy briefs

  30. Our conclusion… • In general • WHS 2002 is a useful tool for management • Provides good info/added value about our HS performance not found in routine M & E • But… • A costly affair • A painful exercise (blood, sweat & tears!) of negotiations, personal sacrifices, sapped energy, etc. • And… • Success requires careful planning

  31. Our recommendations…. • Have a sufficient budget for implementation • Ensure top management commitment • Allow countries to adopt & adapt the sections applicable to them • Need to simplify • Vignettes • Health status • We have the mean score, but as there is no benchmark, results are not really meaningful • Need to make the instrument brief • Assist countries without the capacity to undertake national community surveys

  32. Our recommendations…. • Assist countries to market findings to policy makers • Translation into policies • Help to see the implications of findings for current policies • Need to build greater "in-country" capacity from beginning to end

  33. Thank you
