Qualitative Studies: Case Studies

Introduction
  • In this presentation we will examine the use of case studies in testing research hypotheses:
  • Validity;
  • Quality;
  • Analysis.
Case study methodology
  • Case studies are used as a research tool to test hypotheses in an ex post facto manner, i.e. when the researcher cannot control any of the variables under investigation.
  • Case studies are therefore usually concerned with investigating how or why events occurred.
Case study methodology
  • Case studies involve examining what has occurred to identify reasons for the occurrence.
  • The lack of control leads to a number of potential problems with case-study-based research which, if not addressed, can invalidate any conclusions drawn from the research.
Case study methodology
  • There are three typical criticisms of the case study methodology:
  • Problems of bias;
  • Lack of generalisability;
  • They are only suitable for exploratory investigations.
Problems of bias
  • Many researchers are concerned with the apparent lack of rigour in case study research (as compared to experimental studies) brought about by investigator bias.
  • Relationships may appear simply because you are looking for them.
Generalisability
  • Because case studies allow no control over any of the variables, many question whether conclusions drawn from one specific example can be applied generally.
Exploratory studies
  • Many argue that only experimental research can be used to establish causal relationships, with case studies limited in use to the exploratory stage of a research project.
  • This view misses the most important aspect of a case study: its holistic and real-life nature.
Design of a case study
  • Many of the concerns associated with the use of case studies can be minimised by careful design.
  • Validity is the most important issue to be addressed in the design of a case study methodology.
Concepts and indicators
  • Most research involves measuring concepts (e.g. intent).
  • Unfortunately concepts are normally abstract and thus not directly observable.
  • Indicators (variables) are therefore specified which operationalise the concept.
Concepts and indicators
  • Unfortunately not all concepts can be easily operationalised, and thus the validity of the indicator is drawn into question.
  • One way around this problem is to use multiple indicators, each of which is a partial operationalisation of the concept (a sketch follows below).
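To make the idea concrete, here is a minimal Python sketch (not part of the original slides; the concept and indicator names are invented for illustration) of one abstract concept operationalised through several partial indicators:

    # Hypothetical example: one abstract concept operationalised through
    # several partial indicators (names and descriptions are invented).
    concept = "user satisfaction"

    indicators = {
        "survey_rating": "self-reported 1-5 rating from a questionnaire",
        "repeat_usage": "proportion of participants who returned within a month",
        "complaint_rate": "complaints logged per 100 users",
    }

    # No single indicator captures the concept fully; together they give a
    # broader, triangulated operationalisation.
    for name, description in indicators.items():
        print(f"{concept} <- {name}: {description}")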
Validity of indicators
  • The next question is how to evaluate the validity and reliability of the indicators: do they measure what they are supposed to measure?
Validity of indicators
  • There are various types of validity:
    • Face validity;
    • Criterion validity;
    • Construct validity;
    • Internal validity;
    • External validity.
Face validity
  • Face validity involves a subjective evaluation of the indicator using:
  • Logic;
  • Common sense;
  • Previously reported studies;
  • Jury/Expert opinion.
Criterion validity
  • Criterion validity compares the performance of a newly defined indicator against a well-established one.
  • If the new indicator behaves in a similar manner to the established one, then validity is assumed (see the sketch below).
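As a rough illustration of this comparison (the data are invented, and the use of a simple correlation is an assumption of the sketch, not something prescribed by the slides):

    import numpy as np

    # Invented scores for the same cases on a well-established indicator
    # and on a newly defined one.
    established   = np.array([3.1, 4.0, 2.5, 3.8, 4.4, 2.9, 3.5])
    new_indicator = np.array([3.0, 4.2, 2.4, 3.9, 4.1, 3.1, 3.6])

    # Criterion validity check: does the new indicator track the
    # established one across the same cases?
    r = np.corrcoef(established, new_indicator)[0, 1]
    print(f"correlation with established indicator: {r:.2f}")

    # A high correlation is taken as evidence that the new indicator measures
    # the same thing; a low correlation calls its validity into question.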
Construct validity
  • Construct validity involves examining one indicator and its relationships to other presumed indicators of the same concept.
  • High correlation between the indicators implies they are valid measures of the concept.
  • Perfect correlation implies redundancy (illustrated in the sketch below).
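A hedged Python sketch of this reasoning (the scores are invented; pairwise correlation is one simple way to examine the relationships between presumed indicators):

    import numpy as np

    # Invented scores on three presumed indicators of the same concept:
    # one row per case, one column per indicator.
    scores = np.array([
        [4.0, 3.8, 2.0],
        [2.5, 2.7, 4.5],
        [3.9, 4.1, 2.2],
        [1.8, 2.0, 4.8],
        [3.2, 3.1, 3.0],
    ])

    # Pairwise correlations between indicators (columns are the variables).
    corr = np.corrcoef(scores, rowvar=False)
    print(np.round(corr, 2))

    # High (but not perfect) correlations suggest the indicators converge on
    # the same concept; a correlation of ~1.0 suggests one indicator is
    # redundant; a near-zero or negative correlation questions whether the
    # indicators really measure the same concept.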
Construct validity
  • In case studies three strategies can be used to increase construct validity:
    • multiple sources of evidence;
    • chains of evidence;
    • informants' comments.
Internal validity
  • Internal validity examines the extent to which a research finding (e.g. A leads to B) is valid.
  • Internal validity involves eliminating all other relationships between other variables (e.g. C) and B.
Internal validity
  • [Diagram: the presumed relationship runs from the independent variable to the dependent variable.]
Internal validity
  • [Diagram: other possible variables, marked with an X, must be ruled out before the relationship between the independent and dependent variables can be accepted.]
Internal validity
  • There are numerous factors that can affect internal validity.
  • History/Maturity
    • a long time between observations can cause problems.
  • Testing
    • people behave differently under test conditions.
Internal validity
  • Selection
    • observed differences could be the result of differences within the group.
  • Mortality
    • systematic dropout from the study.
Internal validity in case studies
  • The nature of case studies requires inferences to be made during data collection.
  • Is the inference correct?
  • Have all possibilities been considered?
  • Is the evidence convergent?
  • The overall quality depends on the quality of the investigator.
External validity
  • External validity is concerned with the applicability of the research results to other (non-examined) populations.
  • Testing
    • attitudes may change as a result of the questioning, and the sample may thus be unrepresentative.
External validity
  • Sample selection
    • self-selecting samples could be biased - include details of those who refused to cooperate.
  • In case study research, the use of multiple case studies usually satisfies the requirements for external validity.
Reliability
  • Reliability is concerned with the reproducibility of measurements.
  • When attitudes are examined, how do you know that each respondent's ratings are comparable?
  • Interview structure, telephone protocols, standard letters, rigour in data collection and pilot studies all help to improve reliability (a test-retest sketch follows below).
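The slides do not prescribe a statistic for this; as one hedged illustration, a simple test-retest check correlates ratings collected from the same respondents on two occasions (the data below are invented):

    import numpy as np

    # Invented ratings from the same respondents on two occasions, collected
    # with the same interview structure both times.
    first_pass  = np.array([4, 3, 5, 2, 4, 3, 5, 1])
    second_pass = np.array([4, 3, 4, 2, 5, 3, 5, 2])

    # Test-retest reliability: a reproducible measurement procedure should
    # give closely agreeing ratings on both occasions.
    r = np.corrcoef(first_pass, second_pass)[0, 1]
    print(f"test-retest correlation: {r:.2f}")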
Analysing case study data
  • Each case study must be reported in detail:
    • the visit;
    • interviews;
    • conversations;
    • facts;
    • evidence in support/rejection of hypotheses;
Analysing case study data
    • conclusions;
    • outstanding issues:
      • further investigation;
      • other case studies.
  • Across case studies:
    • replication;
    • rigorous thinking.
Summary
  • A good case study must:
  • Be significant;
  • Be complete;
  • Consider all perspectives;
  • Display sufficient evidence;
  • Be compelling.