
Introduction to Research & Evaluation


Presentation Transcript


  1. Introduction to Research & Evaluation Module 2 Elaine A. Borawski, PhD

  2. Module Overview: defining research and evaluation; what makes a study “research”; roots of participatory research; applied/basic; qualitative/quantitative; program evaluation example

  3. How do you respond when you hear the word research or evaluation?

  4. Academics use the word research to describe what they do. More often than not, professionals working in community organizations and health departments say, “We don’t do research,” or “Yes, we collect data, but it’s not research.”

  5. Is it the difference between research and evaluation? Or is it the difference between research and non-research? Different views.

  6. Research vs. Non-Research Sometimes federal grants will say the “application cannot include research.” What does this mean? This is a different distinction than research vs. evaluation, since evaluation can be considered research. CDC document (handout)

  7. CDC Definition of Research The regulations state that "research means a systematic investigation, including research development, testing and evaluation, designed to develop or contribute to generalizable knowledge" (45 CFR 46.102(d)).

  8. Is it the Methods That Make it Research? Obtaining and analyzing data are essential to the usual practice of public health. For many public health practice activities, data are systematically collected and analyzed. Scientific methods are used in both public health research and public health practice activities. Knowledge is generated in both cases. Furthermore, the extent to which knowledge is generalizable might not differ greatly between research and non-research. Thus, non-research and research activities cannot be easily defined by the methods they employ.

  9. Federal definitions of research The major difference between research and non-research lies in the purpose of the activity. The purpose of research is to generate or contribute to generalizable knowledge. The purpose of non-research (in public health) is to prevent or control disease or injury and improve health, or to improve a public health program or service.

  10. Federal definitions of research However, knowledge might be gained in any public health endeavor designed to prevent disease or injury or to improve a program or service. In some cases, that knowledge might be generalizable, but the purpose of the endeavor is to benefit clients participating in a public health program or a population by controlling a health problem in the population from which the information is gathered.

  11. Furthermore Other attributes, such as publication of findings, statutory authority, methodological design, selection of participants, and hypothesis testing or generating, do not differentiate research from non-research, because these types of attributes can be shared by both research and non-research activities.

  12. CDC Definition of Research The regulations state that "research means a systematic investigation, including research development, testing and evaluation, designed to develop or contribute to generalizable knowledge" (45 CFR 46.102(d)).

  13. Program Evaluation: Non-Research Examples Evaluation of School-Based HIV Prevention Program - As part of the evaluation of the school-based HIV prevention program in Denver public schools, principals, teachers, student contact staff, students, and parents were interviewed. HIV program efforts in policy awareness, staff development, curriculum implementation, and the status of students receiving HIV prevention education were assessed. The purpose (primary intent) of the program evaluation was to provide information to Denver public schools that would be used to improve their school-based HIV prevention programs. The results from the evaluation were used to assess the success of the interventions in a specific population (Denver public school children) and to refine the interventions in that population.

  14. Program Evaluation: Non-Research Examples IMPACT Progress Reports - The Office on Smoking and Health awarded cooperative agreements to the health departments of 32 states and the District of Columbia to build capacity to conduct tobacco use prevention and control programs. These cooperative agreements are part of CDC’s nationwide effort to establish comprehensive, coordinated tobacco use prevention programs. The evaluation of IMPACT consists of awardees submitting semi-annual progress reports. Information in the evaluation includes staffing, coalition composition and efforts, the status of a state tobacco control plan, development of a resource center, training efforts, community outreach and mobilization, and participation in CDC national campaigns. The primary intent of these state tobacco control program evaluations is to assess the success of the intervention activities within each state. The information gained from the evaluation is used to refine the interventions in that state. In addition, the information is used nationally to evaluate the success of the IMPACT program.

  15. Program Evaluation: Research Examples Evaluation of a Community-Based Organization Intervention to Reduce Sexually Transmitted Disease (STD) Rates Among STD Patients in Miami - Male STD patients were randomized to either standard HIV prevention counseling or intensive counseling comprised of four sessions of HIV counseling from a community-based organization. STD clinic records were reviewed to determine whether there was a difference in return rates with new STDs between the groups. The objective of the intervention and evaluation is to determine whether intensive counseling reduces the acquisition of new STDs among high-risk people attending an STD clinic. The purpose of the project was to evaluate a new intervention for reducing the transmission of STDs. Knowledge gained from this evaluation would be used to generalize to other sites.
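
As an illustration only (not drawn from the Miami project itself), the between-group comparison described above could be analyzed along the lines sketched below; the counts and variable names are hypothetical, and a real evaluation would also examine confounders and confidence intervals.

    # Hedged sketch: hypothetical counts, not data from the Miami study.
    # Compares return rates with new STDs between two randomized counseling arms
    # using a chi-square test of independence.
    from scipy.stats import chi2_contingency

    # Rows: standard counseling, intensive counseling.
    # Columns: returned with a new STD, did not return with a new STD.
    observed = [[45, 155],   # hypothetical: 45 of 200 returned in the standard arm
                [28, 172]]   # hypothetical: 28 of 200 returned in the intensive arm

    chi2, p_value, dof, expected = chi2_contingency(observed)
    print(f"chi-square = {chi2:.2f}, p = {p_value:.3f}")
    # A small p-value would suggest the difference in return rates between the
    # arms is unlikely to be due to chance alone.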

  16. Program Evaluation: Research Examples A Comprehensive Evaluation for Project DIRECT (Diabetes Intervention: Reaching and Educating Communities Together) - Project DIRECT is a community diabetes demonstration project targeting African American adults residing in Raleigh, North Carolina. The project is three-tiered and addresses diabetes care, community screening for persons at high risk for developing diabetes, and population-based approaches to increase physical activity and reduce dietary fat intake (two risk factors for diabetes). The goals of the community project are to reduce preventable complications of diabetes via a health systems approach, increase the proportion of persons at risk for diabetes who are screened, and increase the proportion who participate in regular vigorous physical activity and eat a reduced-fat diet. Baseline and follow-up population-based surveys are planned to evaluate the community intervention. The purpose of this project is to evaluate new and innovative interventions to prevent diabetes and its complications. Knowledge gained from this project will be used to develop similar intervention projects in other communities.

  17. Can (or when) Non-Research Become Research? A non-research activity can develop or contribute to generalizable knowledge after the project is undertaken even though generating this knowledge was not part of the original purpose. In this case, because the purpose was not to develop or contribute to generalizable knowledge, the project is not classified as research at the outset. However, if subsequent analysis of identifiable private information is undertaken to develop or contribute to generalizable knowledge, the analysis constitutes human research that now requires further consideration under 45 CFR part 46.

  18. Your Project Ideas…. Research or Non-Research?

  19. Development of Community-Engaged Research Kurt Lewin, 1940s: advocated using a research cycle (planning, action, investigating the results of the action) to solve practical problems. Lewin rejected the positivist belief that data derived from sensory experience, and the logical and mathematical analyses of those data, are the exclusive source of authoritative knowledge.

  20. Participatory research links the science of research with social activism (a community-identified issue, a research process leading to change). Participatory research serves an identified need or problem.

  21. Applied research: driven by an organization- or community-identified need or problem. Basic research: driven by researcher interests, with no immediate practical application.

  22. Belief among traditional researchers that basic research is more objective, more generalizable, and more rigorous. This supports the notion that applied research is not “real research.”

  23. The drive for objectivity led to the double-blind drug trial, in which neither physician nor participant knows whether the participant is receiving the drug or the placebo, in the belief that the data would be more accurate. It is now argued that distancing the researcher from the people being researched increases distrust and therefore decreases the accuracy of the data gathered.

  24. Generalizability is assumed in basic research and in statistical studies involving large databases. Applied research may not be so generalizable: findings from one community may not be applicable to another because of cultural differences and barriers.

  25. Internal vs. External Validity Internal validity: exists if the observed results of a study are real and not caused by extraneous factors (that is, how well controlled and rigorous the study is). External validity: the ability to generalize the study results to other groups and settings beyond those in the current study. Community-engaged research greatly increases the external validity of a study, but are the threats to internal validity too high (too many study limitations, not enough rigor)?

  26. Qualitative Research Qualitative research is a method of inquiry employed in many different academic disciplines, traditionally in the social sciences. Qualitative researchers aim to gather an in-depth understanding of human behavior and the reasons that govern such behavior. The qualitative method investigates the why and how of decision making, not just the what, where, and when. Hence, smaller but more focused samples are often used rather than large samples.

  27. In the conventional view, qualitative methods produce information only on the particular cases studied, and any more general conclusions are only propositions (informed assertions). Quantitative methods can then be used to seek empirical support for such research hypotheses. This has led to the more contemporary approach called Mixed Methods Research.

  28. Quantitative Research Quantitative research refers to the systematic empirical investigation of social phenomena via statistical, mathematical, or computational techniques. Quantitative data are any data in numerical form, such as statistics and percentages. Quantitative researchers ask a specific, narrow question and collect numerical data from participants to answer the question. The researcher conducts a statistical analysis of the data, the results of which may be generalizable to a larger population.
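
As a minimal, hedged sketch of what such a statistical analysis might look like (the scores, group labels, and sample sizes are invented for illustration and do not come from any study mentioned here), a comparison of numerical scores between two groups could be run as follows:

    # Hedged sketch with hypothetical data: compare mean scores of two groups
    # using an independent-samples t-test.
    from scipy import stats

    group_a = [72, 65, 80, 58, 74, 69, 77, 63, 70, 66]   # hypothetical scores
    group_b = [61, 55, 68, 50, 59, 64, 57, 62, 53, 60]   # hypothetical scores

    # A specific, narrow question: do the group means differ?
    result = stats.ttest_ind(group_a, group_b)
    print(f"t = {result.statistic:.2f}, p = {result.pvalue:.4f}")
    # If the samples were drawn representatively, the conclusion might be
    # generalized to the larger population the participants came from.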

  29. Large-scale studies may suggest cause-and-effect relationships that can then be tested in real-world settings. Community organizations wanting to identify the cause of a community problem may start with the results of an extensive research project (local data sets) before designing a more focused research project.

  30. Why Evaluate? Improve existing programs. Measure effectiveness. Demonstrate accountability. Share effective strategies and lessons learned. Ensure funding and sustainability. Evaluation is a tool that can both measure and contribute to the success of your program.

  31. In community settings, research is often project-based; this leads into the We Run This City (marathon) presentation.
