
  1. Quality Appraisal of Qualitative Research

  2. Introduction of participants • Name and surname • Job title/responsibilities • Department/section • Length of time in post • Brief review of disciplinary background and training • Research experience, particularly qualitative research • Types of qualitative research in which they have been involved, currently or in the past • Involvement in any qualitative evaluation process • What would you hope to learn from a qualitative study?

  3. Paradigm and method: the relationship between philosophy and research practice • What is the nature of reality? • What kind of knowledge can we have about reality? • How can we investigate reality? • What is the picture that we paint of reality?

  4. Key terms • Ontology – basic assumptions about the nature of reality. • Epistemology – basic assumptions about what we can know about reality, and about the relationship between knowledge and reality. • Paradigm – overarching perspective concerning appropriate research practice, based on ontological and epistemological assumptions. • Methodology – specifies how the researcher may go about practically studying whatever he/she believes can be known.

  5. Ontology What is the nature of reality? • Positivist paradigm (realism): stable, law-like reality ‘out there’ • Interpretivist paradigm: multiple, emergent, shifting reality

  6. Epistemology What is knowledge? What is the relationship between knowledge and reality? • Positivism: meaning exists in the world; knowledge reflects reality. • Interpretivism: meaning exists in our interpretations; knowledge is interpretation.

  7. Ontology, Epistemology → Scientific paradigm → Methodology → Knowledge

  8. Paradigms in social science research Three basic paradigms Positivism Interpretivism Constructionism

  9. Positivism • Independence • Value-free • Causality • Hypothesis and Deduction • Operationalization • Reductionism • Generalization • Cross-sectional analysis

  10. Methodology: The Positivist Paradigm • Positivist research involves “… precise empirical observations of individual behaviour in order to discover … probabilistic causal laws that can be used to predict general patterns of human activity” (Neuman, 1997: 63) • Objective, value-free discovery

  11. Methodology: The Interpretive Paradigm The study of social life involves skills that “are more like the skills of literary or dramatic criticism and of poetics than the skills of physical scientists.” (Rom Harré, quoted in Phillips, 1987, p. 105) • Importance of the researcher’s perspective and the interpretative nature of social reality.

  12. Knowledge • Positivism: accurate knowledge exactly reflects the world as it is. • Interpretivism: knowledge provides suggestive interpretations by particular people at particular times.

  13. Key characteristics of qualitative research (1) • A concern with meanings, especially the subjective meanings of participants • A concern with exploring phenomena from the perspectives of those being studied • An awareness and consideration of the researcher’s role and perspective (reflexivity) • An ability to preserve and explore context (at the individual level and in the sense of understanding broader social and organizational contexts) • Answering ‘what is’, ‘how’ and ‘why’ questions • Use of unstructured methods which are sensitive to the social context of the study • Naturalistic inquiry (study real-world situations as they unfold naturally, with no manipulation or intervention) • Prolonged immersion in, or contact with, the research setting • The absence of methodological orthodoxy and the use of a flexible (emergent) research strategy

  14. Key characteristics of qualitative research (2) • Capture of data which are detailed, rich and complex (use of ‘thick description’) • A mainly inductive rather than deductive analytic process • Attention paid to emergent categories and theories rather than sole reliance on a priori concepts and ideas • The collection and analysis of data that are mainly in the form of words (textual data) and images rather than numbers • A commitment to retaining diversity and complexity in the analysis • Development rather than testing of hypotheses • Explanations offered at the level of meaning, or in terms of local ‘causality’ (why certain interactions do or do not take place) rather than ‘surface workings’ or context-free laws • A holistic perspective (study of the whole phenomenon) • Employs a variety of methods, including: exploratory interviews; focus groups; observation (participatory and non-participatory); conversation, discourse and narrative analysis; and documentary and video analysis.

  15. Selection of research strategy

  16. Sampling • Quantitative: statistical sampling (maximizing external validity or generalization) • Qualitative: theoretical sampling (Glaser and Strauss, 1967) or purposive sampling (Lincoln and Guba, 1985), rather than conventional or statistical sampling • In theoretical sampling, the relation between sampling and explanation is iterative and theoretically led • The purpose of purposive sampling is to maximise information, not to facilitate generalisation • Deliberate inclusion of a wide range of types of informants with access to important sources of knowledge • The criterion used to determine when to stop [purposive] sampling is informational redundancy, not a statistical confidence level (data saturation)
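The informational-redundancy stopping rule on this slide can be expressed as a simple check: keep sampling until successive cases add no new analytic codes. A minimal, hypothetical sketch in Python (the interview data, code labels and the `window` size are invented for illustration, not part of any published procedure):

```python
# Hypothetical sketch: stop purposive sampling when a run of successive
# interviews yields no codes that have not already been seen
# (informational redundancy / data saturation).

def reached_saturation(interviews, window=2):
    """Return the 1-based index of the interview at which the last
    `window` interviews added no new codes, or None if never reached."""
    seen = set()
    no_new_streak = 0
    for i, codes in enumerate(interviews, start=1):
        new = set(codes) - seen
        seen |= set(codes)
        no_new_streak = 0 if new else no_new_streak + 1
        if no_new_streak >= window:
            return i
    return None

# Invented example: codes assigned to each successive interview.
interviews = [
    {"access", "cost"},
    {"cost", "trust"},
    {"trust", "stigma"},
    {"access", "trust"},   # no new codes
    {"cost", "stigma"},    # no new codes -> saturation declared
]
print(reached_saturation(interviews))  # -> 5
```

In practice the judgement is analytic rather than mechanical, but the sketch shows why saturation, unlike a statistical confidence level, depends on the order and content of the cases sampled.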

  17. Data collection • All qualitative data collection methods involve collecting data in the form of words, talk, experience and actions (some degree of interaction between researcher and participants, with the exception of document analysis) • Interviewing (from unstructured to totally structured) • Focus groups (ideally 6–8 people) • Observation (participant and non-participant observation) • Unstructured diary-keeping and journals (where these have been written specifically for a research project) • Analysis of existing documents or audio-visual media (contemporary or historical sources) • Discourse analysis • Conversation analysis • Biographical methods such as life histories

  18. Sources of evidence and their strengths and weaknesses

  19. Data Collection Methods • Interview • Unstructured • Area of interest may be specified, but all else occurs impromptu • Partially structured • Area is chosen and questions are formulated a priori, but interviewer decides on the order as interview occurs • Semistructured • Area, questions, and order are predetermined. Questions are open-ended, interviewer records essence of response • Structured • Area, questions, and order are predetermined. Questions are open-ended, responses are coded by interviewer as given • Totally structured • Area, questions, and order are predetermined. Respondent is provided with alternatives for each question (i.e., multiple-choice)

  20. Data Collection Methods • Focus group (4–12 participants) • Capitalizes on communication between research participants to generate data • Highlights the respondents’ attitudes, priorities, language and frameworks of understanding

  21. Data Collection Methods • Observation • Nonparticipant • Unobtrusive (to greatest extent possible) • Researcher not engaged in activities of group/situation under study • Participant • Researcher is engaged in activities of group/situation under study • Participant-as-observer, Observer-as-participant • Researcher has primary role, but “moonlights” as other

  22. Data Collection Methods • Historical/archival • Uses existing records • Written documents • Video recordings or film • Audio recordings • Combination

  23. Data analysis • Several different strategies for analysis • Aims for an explanation of a particular phenomenon, experience or institution rather than a mere description of a range of observations, responses or narrative accounts of subjective experience • Explores concepts and establishes linkages between the concepts implied in the research question and the data set, and provides explanations for patterns, or for ranges of reasons or observations from different sources

  24. Data analysis • Starting from the data collection phase (interim or sequential analysis) • Content analysis (often used in media and mass communications; counts items) • Inductive (categories derive gradually from the data) or deductive (introduced at the beginning or part way through the analysis as a way of approaching the data) • Grounded theory: developing hypotheses from the “ground” or research field upwards rather than defining them a priori • Grounded theory: the inductive process of coding incidents in the data and identifying analytical categories as they “emerge from” the data • Deductive forms are increasingly being used in applied qualitative analysis (e.g. the “framework approach”, which combines deductive and inductive approaches)

  25. Framework Analysis 1) Familiarization 2) Identifying the thematic framework 3) Indexing 4) Charting 5) Mapping and interpretation
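The “charting” stage above rearranges indexed data into a case-by-theme matrix, so that mapping and interpretation can read across cases within a theme. A minimal, hypothetical sketch (the case IDs, theme names and cell summaries are invented):

```python
# Hypothetical framework 'chart': rows are cases, columns are themes,
# cells hold summarised (not verbatim) data for each case under each theme.
chart = {
    "Case 01": {"barriers": "cost of travel", "enablers": "family support"},
    "Case 02": {"barriers": "waiting times", "enablers": ""},
}

def column(chart, theme):
    """Read one theme across all cases, as used at the mapping and
    interpretation stage; missing cells come back empty."""
    return {case: row.get(theme, "") for case, row in chart.items()}

print(column(chart, "barriers"))  # one theme, read across all cases
```

The point of the matrix form is exactly this dual access: a row gives the whole of one case, a column gives one theme across the data set.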

  26. Computer Assisted Qualitative Data Analysis (CAQDAS) • Software packages that facilitate the management, processing and analysis of qualitative data. Examples include: ETHNOGRAPH, ATLAS.ti, NUD*IST and QSR NVivo • None of the software packages is able to do the analysis itself: the researcher is still responsible for developing a coding scheme, interpreting all the data and formulating conclusions!
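What CAQDAS packages mechanise is essentially “code and retrieve”: storing researcher-assigned codes against text segments and pulling back every segment filed under a code. A toy, hypothetical sketch of that idea (the class, segment text and code names are invented; this is not any package’s actual API):

```python
from collections import defaultdict

class CodeBook:
    """Toy code-and-retrieve store. The software only files and fetches;
    the researcher still decides what the codes are and what they mean."""
    def __init__(self):
        self._segments = defaultdict(list)  # code -> list of text segments

    def code(self, segment, *codes):
        """File one transcript segment under one or more codes."""
        for c in codes:
            self._segments[c].append(segment)

    def retrieve(self, code):
        """Return all segments filed under a code."""
        return list(self._segments[code])

cb = CodeBook()
cb.code("I couldn't afford the bus fare to the clinic", "cost", "access")
cb.code("The nurse really listened to me", "trust")
print(cb.retrieve("cost"))  # every segment the researcher coded 'cost'
```

The sketch makes the slide’s caveat concrete: the machine only manages retrieval; the coding scheme, the interpretation and the conclusions remain analytic work.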

  27. Reporting • Clear links between original data, interpretation and conclusion • Clear and Coherent • Selection and presentation of appropriate and adequate data • Presenting the emergence of themes and concepts

  28. Reflexivity • Conducting qualitative research exposes the personal influence of the researcher far more than quantitative methods, as the researcher is central to data collection, analysis and interpretation. Within the qualitative research paradigm, a high degree of ‘reflexivity’ on the part of the researcher is required throughout the research process. • Researchers need to take into account the way that their own background and social position, a priori knowledge and assumptions affect all aspects of research: development and design, data collection, analysis and interpretation (Jaye, 2002).

  29. Reflexivity • Mays and Pope (2000) relate the concept of ‘reflexivity’ to sensitivity to the way in which the researcher and the research process have both shaped the data. Through personal accounting, researchers become aware of how their own position (e.g. gender, race, class and power within the research process) necessarily shapes all stages of data collection and analysis (Hertz, 1997).

  30. How Do We Evaluate Outputs of Qualitative Research? Conceptual themes: contributory; defensible in design; rigorous in conduct; credible in claim. Spencer, L., Ritchie, J., Lewis, J. and Dillon, L. (2003). Quality in Qualitative Evaluation: A Framework for Assessing Research Evidence. Government Chief Social Researcher’s Office, Cabinet Office, United Kingdom.

  31. Identification of some underlying central concerns and principles • Defensibility of design: By providing a research strategy that can address the evaluative questions posed • Rigour of conduct: Through the systematic and transparent collection, analysis and interpretation of qualitative data • Credibility of claims: Through offering well-founded and plausible arguments about the significance of the evidence generated • Contribution to the knowledge and understanding (e.g.about theory, policy, practice, or a particular substantive field)

  32. Lincoln and Guba’s ‘naturalistic’ criteria (trustworthiness): credibility, transferability, dependability and confirmability

  33. Triangulation Triangulation is a strategy which can be used to corroborate the validity of research findings. Its types, as defined by Denzin (1984), are: 1) Data source triangulation: collection of data from the various relevant groups and stakeholders with an interest in the phenomenon under study. 2) Investigator triangulation: the use of several researchers to study the same phenomenon using the same method. 3) Theory triangulation: the strategy used when different investigators with different perspectives interpret the same data/results (multidisciplinary team). 4) Methodological triangulation: the utilization of various methodologies to examine a particular phenomenon.

  34. Validity and reliability issues

  35. Quality standards in qualitative research • Widespread concerns about quality • Rigour • Robust • Relevance • Utility of research

  36. Addressing the ‘holy trinity’ • No escape from the holy trinity (‘validity’, ‘reliability’ and ‘objectivity’) • Identified underlying themes: • Internal validity (procedural/methodological; interpretive: accuracy or credibility of findings; relational: outcomes in relation to participants) • External validity (relevance; generalisability; auditability; contextual detail) • Reliability (replication; consistency; auditability) • Objectivity (neutral/value-free; auditability; reflexivity) • Soundness/well-foundedness vs goodness/worthwhileness

  37. The whole idea of qualitative standards or criteria • Many different ‘positions’ • rejection of criteria for philosophical or methodological reasons • proposal of ‘alternative’ criteria (unrelated to notions of rigour or credibility) • proposal of ‘parallel’ criteria (addressing notions of rigour or credibility) • adoption of traditional ‘scientific’ criteria (to be applied rather differently)

  38. The idea of criteria (contd.) • Concern about rigid checklists • Concern about a ‘tick box’ mentality • Avoided the term ‘criteria’ • Adopted a series of flexible, open-ended questions around guiding principles and quality issues • Retained the centrality of experience and judgement, not mechanistic rule-following • Qualitative research should be assessed on its ‘own terms’, within premises that are central to its purpose, nature and conduct

  39. The debate • Against universal criteria: qualitative methods rest on different philosophical assumptions; the diversity of qualitative methods makes universal criteria irrelevant; qualitative studies are not feasible for systematic reviews. • In favour of universal criteria: the research question dictates the design; all findings should emerge from the participants’ experiences (credibility); there is an urgent need to develop a consensus around what would constitute a ‘good enough’ appraisal tool for qualitative and/or multi-method studies.

  40. Developing consensus? • Over 100 quality appraisal forms exist to evaluate qualitative research. • Discrepancies in how these tools attempt to appraise the quality of qualitative research. • Many do not distinguish between different study designs, theoretical approaches, and standards for rigour, credibility and relevance. • The majority of these appraisal tools have not themselves been systematically tested.

  41. Why develop frameworks? • Growing emphasis on ways of formalising quality standards • Appraising the existing research literature • Growing use of systematic review • No explicitly agreed standards regarding what constitutes quality in qualitative policy evaluation methods • No agreed formal criteria for judging the quality of qualitative evaluation research

  42. Why develop frameworks? • To produce a set of criteria that researchers and policy makers can use to assess the extent to which a particular study demonstrates attention to key quality issues • To provide guidance on how standards can be used in appraising individual studies • For use by commissioners and managers of research; funders of research; government-based policy makers who use qualitative research; and experts, academics and researchers conducting qualitative research

  43. Critical Appraisal Skills Programme (CASP) 1) Was there a clear statement of the aims of the research? Consider: – what the goal of the research was – why it is important – its relevance

  44. Critical Appraisal Skills Programme (CASP) 2) Is a qualitative methodology appropriate? Consider: – if the research seeks to interpret or illuminate the actions and/or subjective experiences of research participants

  45. CASP- Appropriate research design 3) Was the research design appropriate to address the aims of the research? Consider: – if the researcher has justified the research design (e.g. have they discussed how they decided which methods to use?)

  46. CASP-Sampling 4) Was the recruitment strategy appropriate to the aims of the research? Consider: – if the researcher has explained how the participants were selected – if they explained why the participants they selected were the most appropriate to provide access to the type of knowledge sought by the study – if there are any discussions around recruitment (e.g. why some people chose not to take part)

  47. CASP-Data collection (1) 5) Were the data collected in a way that addressed the research issue? Consider: – if the setting for data collection was justified – if it is clear how data were collected (e.g. focus group, semi-structured interview etc) – if the researcher has justified the methods chosen

  48. CASP-Data collection (2) Consider: – if the researcher has made the methods explicit (e.g. for the interview method, is there an indication of how interviews were conducted, and did they use a topic guide?) – if methods were modified during the study; if so, has the researcher explained how and why? – if the form of data is clear (e.g. tape recordings, video material, notes etc) – if the researcher has discussed saturation of data

  49. CASP-Reflexivity 6) Has the relationship between researcher and participants been adequately considered? Consider whether it is clear: – if the researcher critically examined their own role, potential bias and influence during: *formulation of research questions *data collection, including sample recruitment and choice of location – how the researcher responded to events during the study and whether they considered the implications of any changes in the research design