
Chapter 12

  1. Chapter 12 Critical Appraisal of Quantitative and Qualitative Research for Nursing Practice

  2. Intellectual Research Critique A careful examination of all aspects of a study to judge its merits, limitations, meaning, and significance

  3. Intellectual Critique Questions Was the research problem significant? What are the major strengths of the study? What are the major weaknesses of the study? Did the researchers use sound methodology? Do the findings accurately reflect reality? Are the findings consistent with those from previous studies?

  4. Intellectual Critique Questions (cont’d) Can the study be replicated by others? What are the implications of the findings?

  5. Intellectual Critique Guidelines Read and critique the entire study. Examine the organization and presentation of the research report. Examine the significance of the problem studied for nursing practice. Identify strengths and weaknesses of the study. Be objective and realistic in identifying the study’s strengths and weaknesses.

  6. Intellectual Critique Guidelines (cont’d) Provide specific examples of the strengths and weaknesses. Provide a rationale for your critique. Suggest modifications for future studies. Discuss feasibility of replication of the study. Discuss usefulness of findings for practice.

  7. Critique Process for Quantitative Studies Phase 1—Comprehension Phase 2—Comparison Phase 3—Analysis Phase 4—Evaluation

  8. Phase 1—Comprehension Read the article carefully. Identify terms you do not understand and determine their meaning in a dictionary or the glossary of Burns and Grove. Read the article a second time. Highlight each step of the research process.

  9. Phase 2—Comparison Requires knowledge of what each step of the research process should be like. The ideal is compared with the real: the reviewer must examine the extent to which the researcher followed the rules for an ideal study.

  10. Phase 3—Analysis Involves a critique of the logical links connecting one study element with another. The overall logical development of the study must be examined.

  11. Critique Guidelines for Comparison and Analysis Review research text(s). Compare the steps in the study you are analyzing with the criteria in the research text(s). Analyze the logical links among the steps of the study.

  12. Research Problem and Purpose Is the problem sufficiently delimited in scope without being trivial? Is the problem significant to nursing? Is there evidence of researcher biases? Does the purpose narrow and clarify the aim of the study? Was the study feasible in terms of funding, expertise, subjects, facility, equipment, and ethical considerations?

  13. Literature Review Does it demonstrate progressive development of ideas through previous research? Is a theoretical knowledge base developed for the problem and purpose? Does the literature review provide rationale and direction for the study? Is a clear, concise summary presented of the current empirical and theoretical knowledge in the area of study?

  14. Study Framework Is the framework presented with clarity? If a map or model is presented, is it adequate to explain the phenomenon of concern? Is the framework linked to the research purpose? Would another framework fit more logically with the study?

  15. Study Framework (cont’d) Is the framework related to nursing knowledge? If a proposition from a theory is tested, is the proposition clearly identified and linked to the study hypotheses?

  16. Research Objectives, Questions, or Hypotheses Are the objectives, questions, or hypotheses expressed clearly? Are the objectives, questions, or hypotheses logically linked to the research purpose and framework? Are the research objectives, questions, or hypotheses linked to concepts and relationships from the framework?

  17. Variables Do the variables reflect the concepts identified in the framework? Are the variables clearly defined? Is the conceptual definition of a variable consistent with the operational definition?

  18. Design Is the design used the most appropriate to obtain the needed data? Does the design provide a means to examine all of the objectives, questions, or hypotheses? Have threats to design validity been minimized? Is the design logically linked to the sampling method and statistical analyses?

  19. Design (cont’d) Is the treatment clearly described? Was a protocol developed to promote consistent implementation of the treatment? Did the researcher monitor the implementation of the treatment to ensure consistency?

  20. Sample, Population, and Setting Is the sampling method adequate to produce a representative sample? What are the potential biases in the sampling method? Are any subjects excluded from the study based on age, socioeconomic status, or race, without a sound rationale?

  21. Sample, Population, and Setting (cont’d) Were the sampling criteria appropriate for the type of study conducted? Is the sample size sufficient to avoid a type II error? If more than one group is used, do the groups appear equivalent?

  22. Sample, Population, and Setting (cont’d) Are the rights of human subjects protected? Is the setting used in the study typical of clinical settings? Was sample mortality a problem? If so, how might this influence the findings?

  23. Measurements Do the instruments adequately measure the study variables? Are the instruments sufficiently sensitive to detect small differences? Does the instrument have adequate validity and reliability?

  24. Scales and Questionnaires Are the instruments clearly described? Are techniques to complete and score the instruments provided? Are validity and reliability of the instruments described? If the instrument was developed for the study, is the instrument development process described?
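The reliability that slide 24 asks reviewers to look for is, for multi-item scales, most often reported as Cronbach's alpha (internal consistency). As an illustrative sketch only, not part of the Burns and Grove text, a minimal pure-Python computation from raw item scores might look like:

```python
def _sample_var(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(item_scores):
    """Cronbach's alpha for internal-consistency reliability.

    item_scores: one list per scale item, each holding every
    respondent's score on that item (all lists the same length).
    """
    k = len(item_scores)
    # Per-respondent total scale score across all items.
    totals = [sum(scores) for scores in zip(*item_scores)]
    sum_item_var = sum(_sample_var(item) for item in item_scores)
    return (k / (k - 1)) * (1 - sum_item_var / _sample_var(totals))
```

When every item ranks respondents identically, alpha reaches 1.0; values around 0.70 or higher are conventionally treated as adequate for group-level comparisons, though the acceptable threshold depends on the study's purpose.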

  25. Observation Are the phenomena to be observed clearly identified and defined? Is interrater and intrarater reliability described? Are the techniques for recording observations described?
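The interrater reliability that slide 25 asks about is commonly quantified with Cohen's kappa, which corrects raw percent agreement for agreement expected by chance. The following is a hypothetical sketch for two raters assigning categorical codes to the same observations (not a procedure given in the source text):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters coding
    the same set of observations with categorical labels."""
    n = len(rater_a)
    # Observed proportion of observations both raters coded the same.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement if each rater coded independently at
    # their own marginal category frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)
```

For example, if two observers agree on three of four events but their category frequencies would produce 50% agreement by chance alone, kappa is 0.5 rather than the raw 0.75, which is why reviewers ask for kappa rather than simple percent agreement.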

  26. Interviews Do the interview questions address concerns expressed in the research problem? Are the interview questions relevant for the research purpose and objectives, questions, or hypotheses? Does the design of the questions tend to bias subjects’ responses? Does the sequence of questions tend to bias subjects’ responses?

  27. Physiological Measures Are the physiological measures or instruments clearly described? If appropriate, are brand names of instruments identified? Are the accuracy, selectivity, precision, sensitivity, and error of the instruments discussed?

  28. Physiological Measures (cont’d) Are the physiological measures appropriate for the research purpose and objectives, questions, or hypotheses? Are the methods for recording data from the physiological measures clearly described? Is the recording of data consistent?

  29. Data Collection Is the data collection process clearly described? Is the training of data collectors clearly described and adequate?

  30. Data Collection (cont’d) Is the data collection process conducted in a consistent manner? Are the data collection methods ethical? Do the collected data address the research objectives, questions, or hypotheses?

  31. Data Analysis Are data analysis procedures appropriate to the type of data collected? Are data analysis procedures clearly described? Are the results presented in an understandable way? Do data analyses address each research objective, question, or hypothesis? Are the analyses interpreted appropriately?

  32. Data Analysis (cont’d) Are the statistical analyses logically linked to the design? Is the sample size sufficient to detect significant differences? Was power analysis used to determine sample size?
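Slides 21 and 32 both ask whether the sample was large enough to avoid a type II error and whether a power analysis was done. As a rough illustration of what such an analysis computes (this sketch uses the normal approximation, which slightly understates the exact t-test answer, and is not from the source text):

```python
import math
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate sample size per group for a two-sided,
    two-sample comparison of means.

    effect_size: standardized difference (Cohen's d).
    alpha: type I error rate; power = 1 - beta (type II error rate).
    """
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)  # critical value for the two-sided test
    z_beta = z(power)           # quantile giving the desired power
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)
```

With a medium effect (d = 0.5), conventional alpha = 0.05, and 80% power, this yields roughly 63 subjects per group; halving the expected effect size roughly quadruples the required sample, which is why underpowered studies so often fail to detect real differences.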

  33. Interpretation of Findings Are findings discussed in relation to each objective, question, or hypothesis? Are the findings clinically significant? Do the conclusions fit the findings from the analyses? Are conclusions based on statistically significant and clinically significant results? Are there limitations the researcher did not identify?

  34. Phase 4—Evaluation Involves determining the meaning and significance of the study by examining the links among the study process, study findings, and previous studies Study findings are examined in light of previous study findings. Evaluation builds on conclusions reached during the first three stages of the critique and provides the basis for the fifth step—conceptual clustering.

  35. Phase 4—Evaluation (cont’d) The steps of the study are evaluated based on previous studies. Present hypotheses are based on previous hypotheses. Present design is based on previous designs. Present methods of measurement are based on previous measurement.

  36. Critique Guidelines for Evaluation What rival hypotheses can be suggested for the findings? How much confidence can be placed in the study findings? To what populations can the findings be generalized? What questions emerge from the findings, and are these identified by the researcher?

  37. Critique Guidelines for Evaluation (cont’d) What future research can be envisioned? Could the limitations of the study have been corrected? When the findings are examined based on previous studies, what is now known and not known about the phenomenon under study?

  38. Critique Guidelines for Evaluation—Examination of Previous Studies Are the findings of previous studies used to generate the research problem and purpose? Do the findings build on findings of previous studies? Is the design an advance over previous designs? Do sampling strategies show an improvement over previous studies?

  39. Critique Guidelines for Evaluation—Examination of Previous Studies (cont’d) Does the sample selection have the potential for adding diversity to samples previously studied? Does the current research build on previous measurement strategies so that measurement is more precise or more reflective of the variables? How do statistical analysis techniques compare with those used in previous studies?

  40. Critique Guidelines for Evaluation—Examination of Previous Studies (cont’d) Is the current knowledge in this area identified? Does the author indicate the implication of the findings for practice?

  41. Skills Needed to Critique Qualitative Studies Context flexibility; inductive reasoning; conceptualization, theoretical modeling, and theory analysis; transforming ideas across levels of abstraction

  42. Context Flexibility Definition: the capacity to switch from one context or worldview to another, to shift perception so as to see things from a different perspective It is not necessary to become committed to a perspective to follow or apply its logical structure.

  43. Context Flexibility (cont’d) All scholarly work requires a willingness and ability to examine and evaluate works from diverse perspectives. For example, analysis of the internal structure of a theory requires context flexibility.

  44. Inductive Reasoning Skills Necessary to follow the logic of a qualitative researcher. Used in the transformation process during data analysis. Revealed in the move from concrete descriptions to the abstract level of science.

  45. Standards for Qualitative Critique Standard 1: Descriptive vividness Standard 2: Methodological congruence Standard 3: Analytical and interpretative precision Standard 4: Philosophical or theoretical connectedness Standard 5: Heuristic relevance

  46. Standard 1—Descriptive Vividness Description of the site and subjects, the experience of collecting the data, and the thinking of the researcher during the process need to be presented so clearly that the reader has the sense of personally experiencing the event. Because one of the assumptions of qualitative research is that all data are context specific, the evaluator of a study must understand the context of that study.

  47. Descriptive Vividness (cont’d) A contextual understanding of the whole is essential and prerequisite to the capability of the reviewer to evaluate the study in the light of the other four standards.

  48. Threats to Descriptive Vividness Failure to include essential descriptive information; lack of clarity and/or depth of description; inadequate skill in writing descriptive narrative; reluctance to reveal self in written material

  49. Standard 2—Methodological Congruence The reviewer must have knowledge of the philosophy and methodological approach used by the researcher. Four dimensions: rigor in documentation, procedural rigor, ethical rigor, and auditability

  50. Rigor in Documentation The reviewer examines if the researcher clearly and concisely presents study elements. The reviewer examines the study elements for completeness and clarity. The reviewer identifies any threats to rigor in documentation.
