
Developing a Framework to Evaluate Training Programs Provided by WHO



Presentation Transcript


  1. Developing a Framework to Evaluate Training Programs Provided by WHO: The Feasibility of Incorporating Social Justice, Cultural Competency and Return on Investment (A Work in Progress)

  2. Thank you to Athabasca University for funding this project. Dr. Joy H. Fraser

  3. Background • Training requires increasing financial and human resources • No comprehensive framework to evaluate return on investment (ROI) • Request from the Office of Nursing and Midwifery, Department of Human Resources for Health, WHO

  4. Method • Participatory Action Approach • WHO Personnel Selected by Senior Scientist, Office of Nursing & Midwifery • Representation: WHO Priority Areas and Headquarters, Regional and Country Levels

  5. Participation of WHO • Interview and survey questions reviewed by key personnel interested in evaluation • Participants selected from: • Reproductive Health and Research • Making Pregnancy Safer • Expanded Program Pandemic Flu • Gender and Women’s Health

  6. Survey Questions focused on: • Program Planning and Design • Program Delivery & Teaching Methods • Evaluation/Monitoring

  7. Interview Questions • Evaluation methods currently used • How were they selected? • Strengths of current methods • What is missing? • Whether indicators of social justice, cultural competency and measures of ROI were included • What ought to be included in a framework?

  8. Challenges • Timing: Senior Scientist off work • Difficult to contact participants: busy or traveling • Lack of time for some participants: interviews and/or survey not completed • Discovery of other evaluation work being undertaken by WHO/UN

  9. Interviews • WHO personnel from Infectious Diseases, Gender, HIV/AIDS, Making Pregnancy Safer (9) • Country representatives (Jamaica and the Philippines) (3)

  10. Surveys: Headquarters • Reproductive Health and Research (RHR) (FCH/STI and FCH/TCC): Controlling Sexually Transmitted and Reproductive Tract Infections • Making Pregnancy Safer (FCH/MPS): Essential Newborn Care Training • Integrating Gender into Public Health: Gender and Health Learning Program • Expanded Program Immunization Pandemic Flu, Bio-risk Reduction (CDS/EPR): EPI Training on Immunization in the African Region

  11. Surveys: Region and Country • AFRO: EPI Training on Immunization in the African Region • SEARO (Thailand): ToT, Nursing Management of HIV/AIDS Prevention, Care and Support • Country (Philippines): ToT, Promotion of Healthy Lifestyles

  12. Social Justice Guiding Principles (Canadian Nurses Association) • Equity • Human Rights • Democracy and Civil Rights • Capacity Building • Just Institutions • Enabling Environments • Poverty Reduction • Ethical Practice • Advocacy • Partnerships

  13. Cultural Competency “A culturally competent professional is one who is actively in the process of becoming aware of his or her own assumptions about human behaviour, values, biases, preconceived notions, personal limitations and so forth. Second, a culturally competent professional is one who actively attempts to understand the world view of culturally diverse populations. Third, a culturally competent professional is one who is in the process of actively developing and practicing appropriate, relevant and sensitive intervention strategies and skills in working with his or her culturally different students” (Adapted from Sue & Sue, 1990).

  14. Assessing Cultural Competency May be assessed using indicators adapted from the National Standards for Culturally and Linguistically Appropriate Services (Putsh et al., 2003, p. 10), Sue & Sue’s (1990) attributes of a culturally competent professional (awareness, knowledge and skills), Culhane-Pera’s (1997) Five Levels of Cultural Competency in Medicine, or others.

  15. Cultural Competency “A lot of the people that come out of medical school, less from nursing school although even nursing school, have no information or have never heard of this topic, although less and less. So trying to get it into the undergraduate curricula… some of these attitudinal and cultural competencies, not just having them in your post-graduate courses, but really getting them into the early trainings, so then you can really improve on it rather than having to start from scratch with people that are already practicing.”

  16. Evaluation Generally defined as the “systematic acquisition and assessment of information to provide useful feedback about some object” (Michael Zinovieff quoting Bill Trochim, Cornell University, 2006).

  17. Evaluation Donald Kirkpatrick (1959, 1998): “measuring changes in behavior that occur as a result of training programs.” He developed four levels of training evaluation: reaction, learning, behaviour and results.

  18. Types of Evaluation • Formative • Summative • Confirmative • Meta

  19. Formative • Focus on process • Improves the quality of the training during the design, development, and implementation stages • Carries out a subject matter expert review, a user review, or a pilot test • When the design of the training program is near completion, both subject matter experts and users provide feedback to further refine the training

  20. Summative • Focus on final product • Determines the impact on individual and organizational performance during and after the training • Uses direct observation, surveys of training stakeholders, measurement of performance indicators (quality, productivity, satisfaction, etc.) and/or measurement of the institutional “outcome”

  21. Confirmative Evaluation • Future-oriented • Focuses on verification of the continuous quality improvement of training programs • Looks at enduring, long-term effects or results over the life cycle of an instructional performance intervention: changes that can be identified after the passage of time and are directly linked to participation in training • Level four of the Kirkpatrick evaluation model is confirmative evaluation • Contains elements of outcome and impact evaluation

  22. Outcome and Impact Evaluation • Outcome evaluation: a type of program evaluation that uses valued and objective person-referenced outcomes to analyze a program’s effectiveness, impact, or cost-benefit • Impact evaluation: looks at negative or positive program-based changes in performance and focuses on whether the program has made a difference compared to either no program or an alternate program

  23. Meta Evaluation • Quality control process applied to the processes, products and results of formative, summative, and confirmative evaluation • Evaluating the evaluation (the evaluator tries to work out how the evaluation was conducted) • Purpose is to validate the evaluation inputs, process, outputs, and outcomes • Serves as a learning process for the evaluator and makes evaluators accountable

  24. Evaluation Models • Kirkpatrick’s Four Levels • Phillips’ Return on Investment • Context, Input, Process and Product (CIPP)

  25. Levels of Evaluation • Level 1 (Reaction): measures trainees’ perceptions. What did they think of the training? • Level 2 (Learning): measures knowledge/skills gained. Was there an increase in knowledge/skills? • Level 3 (Behavior): measures worksite implementation. Is the new knowledge/skill being used on the job? • Level 4 (Results): measures impact on the organization. What effect did the training have on the organization?
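  To make the four levels concrete, a hypothetical illustration (not from the presentation): for an infection-control course, Level 1 might be an end-of-session satisfaction questionnaire; Level 2, a pre- and post-test of knowledge; Level 3, observed hand-hygiene practice on the ward three months later; Level 4, a change in facility infection rates.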

  26. Use of Levels • Level 1: most commonly used (the “smile sheet”); easy to administer and evaluate • Level 2: used by academic centers, the public sector and WHO; most reliable when pre- and post-tests are used • Level 3: difficult to measure human behavior and show evidence of it • Level 4: tied to measurable information related to the bottom line

  27. Types and Levels of Evaluation Levels 1 (Reaction) and 2 (Learning): part of formative evaluation • Can lead to a false sense of security • There may be no relationship between feelings about training and improved performance

  28. Types and Levels of Evaluation Levels 3 and 4: associated with summative evaluation • Level 4 will determine whether the training has value • Level 3 can be used to refine training

  29. Level 5: ROI • Justification of the costs of training based on the return on investment and organizational impact • Requires collecting Level 4 data • Converting results to monetary values • Comparing results to the cost of training
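  A note on the arithmetic (illustrative only; the figures are hypothetical, not from the presentation). Phillips expresses ROI as:

  ROI (%) = (net program benefits / program costs) x 100

  For example, if a training program costs $50,000 and the benefits attributed to it, once converted to monetary values, total $120,000, then ROI = (120,000 - 50,000) / 50,000 x 100 = 140%.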

  30. Return on Investment “Measuring return on investment is becoming a truly global issue. Organizations from all over the world are concerned about the accountability of training and are exploring ways and techniques to measure the results of training” (Jack Phillips, 1997, p. 4).

  31. Context, Input, Process and Product (CIPP) • Impact evaluation: assesses the reach to the target audience • Effectiveness evaluation: assesses the quality and significance of outcomes • Sustainability evaluation: assesses the extent to which contributions are successfully institutionalized and continued over time • Transportability evaluation: assesses the extent to which the program or training has been adapted or applied elsewhere

  32. Types of Evaluation Methods Used at WHO • Mainly level one and level two evaluations • Some examples of level three • No Return on Investment • Daily, Mid-way, Upon Completion • Upon Return to Worksite: Sporadically

  33. Satisfaction with Current Evaluation • Easy to Do • Low Cost • Immediate Feedback

  34. Themes/Concerns • Need to define training • Make training part of an educational process-include in pre-service and ongoing education • Better selection of trainees • Need for follow-up • Need more mentorship and support to use new knowledge, attitude and skills • Quality of educators/trainers • No capacity to do extensive evaluation

  35. Need to define training “If you look at a project proposal and you look at what anyone is doing, the first thing you have is development of guidelines, training programs, training modules, training curricula. And we need to revisit the whole issue of what we mean by training. Training is not a one-off entity and I think we have to acknowledge that. And we have to acknowledge that training requires follow-up and supervision; it's a holistic vision of what training can add to enable a person to perform better.” “I would like for us to first look at it as education, not training. And I would like us to see education as a component of programmatic management. And that we actually have to characterize it in a manner that it builds capacity over time. So, it is not just a one-off event. And that we should actually be innovative in the way we look at it and try to be more inclusive.”

  36. Need to Define Training “Do we mean in-service training, pre-service training, or both? How much? In my area of work, for example, we are working a lot with in-service training, but this is probably not the most convenient approach. Countries ask for in-service training under the illusion that it is faster in implementing recommendations. The most sustainable approach is probably pre-service, and this seems to be more complex. But it would be important to evaluate both, and how training can be linked with the career path of health care providers.”

  37. Make training part of an educational process: include in pre-service and ongoing education “We are taking short-cuts and what we should be looking at is an education process. We are missing certain factors like personal growth and motivation. We are missing other factors like empowerment to apply. We are missing other factors like how this links in with all the other training that's going on.”

  38. Better Selection of Trainees “Another big question I have is who's getting trained and how much, because what I see is there is a certain body of people who get trained and then there is this huge gap.” “I think we just evaluate the input and not the output. The output is the bigger picture. The output could be that you have 15 people trained. Well, but within the context of how many people actually need to be trained, to do what, and how?”

  39. Need more mentorship and support to use new knowledge, attitude and skills “How do we make this an educational experience in which we provide mentorship and follow-up afterwards, so that there is a continuum, so that people can be helped or empowered to apply that knowledge in practice?… I see this as our biggest gap. We keep training people in a vacuum without looking at the environment and the infrastructure in which we wish them to work. What we need to do is enable them to apply that knowledge in practice, and also to build on their knowledge and experience.”

  40. Need for Follow-up • “… we are not looking at the longer-term picture of how the knowledge is being applied in practice… to adequately measure the uptake and application of skills”

  41. Quality of Educators “A major area of concern of mine for many years, and this particularly applies to nursing but it applies to many other fields as well, is the quality of the educators. And we don't put enough energy into ensuring that the educators are qualified and enthusiastic in the way that they teach.” “…if we could include in some of that evaluation some of the more challenging issues that we need to address, like the quality of the trainers: what is out there? And what is being missed? Not from the point of view of what is wrong, but what do they need to improve or follow up?”

  42. Evaluation needs to be practical “We have to make evaluation incredibly practical and cost-effective, because it's a problem fitting it in, and it's got to be something that you could actually use as a tool for ongoing planning. If you could get that into people's heads… I think people see evaluation as being the end of the road and not the beginning of it. And I think we've got to change the paradigm on this, maybe the wording.”

  43. Social Justice “We have a training program within the department (MPS) that is actually focused on a human-rights-based approach to reproductive health. It used a rights-based approach, and that one actually has an evaluation framework which would include some of those indicators.”

  44. Cultural Competency “It would be interesting to look at it with a critical lens, because we import many training programs. So it is the importation of the training and the training process that may not be culturally explicit. You see this, for example, in programs we've established for community health workers. When I go back to the times I was working with community health workers… where you lifted people out of their environment, took them to another, trained them and then didn't follow them up. Whether there are examples where you actually take people in their environment, where you've selected them based on the fact that you've assured yourself of their continued existence within that community, and trained them within the framework of that community, basically. So, if you are looking at it from that kind of cultural context, then it is in-country where you really need to go, not at this level. But you could also have a look and see what people like us are recommending, because that also influences… In actual fact, when you are looking at the training, you should look at what they are recommending, because that influences training.”

  45. Social Justice and Cultural Competency “Actually, if you want to add both social justice and cultural issues, look at what is being done at the moment in the world of community health workers and use it as an example. Because you've got HIV now, which is just focusing on community-based health workers. But you've also got malaria, TB, and family planning: what are we actually doing, and where are the drivers for this? Is it the countries, or is it where the donors are pushing? And the time frames for training: I mean, a lot of the time you are constrained when you put forward an idea for training or education, because people say, ‘There is no way we can spend that much time on training.’ But if you need that much to be able to produce a workforce, surely it's cost-effective.”

  46. Learnings • When choosing a model it is essential to first identify the questions the evaluation needs to address • Evaluation needs to be practical • Need to account for the impact of intervening variables such as motivation to learn, trainability, job attitudes, personal characteristics, and transfer-of-training conditions • WHO needs to decide whether it is prepared to allocate the financial resources to carry out evaluation beyond levels one and two

  47. Learnings (Cont’d) • Need a Long-Term vs Short-Term Approach • Need to have a plan and inform people about how/if results will be used • Need to develop capacity: competent, trained people to deal with evaluation

  48. Evaluate within Program Context “I see this as one of the basic evaluation principles, in the way that we should actually support the vision and promote the vision: you know, if you want to build capacity, well, if you are going to evaluate that, it needs to be evaluated within the context of the program which is being provided.”
