
  1. Epistemology and Methods: Data Selection, Operationalization and Measurement (May 6, 2008)

  2. Empirical Testing!
  • Building on:
    • A research question / puzzle-driven approach
    • Model building (concepts, explored key variables, arguments, hypotheses)
    • An underlying causality story
    • Testing / observation / measuring
  • Remember the weaknesses of research design:
    • More inferences than observations (more IVs than cases)
    • Multicollinearity (see the sketch below)
    • Selection bias
    • Measurement errors…
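
A minimal sketch of the last two pitfalls, using synthetic data (all numbers and variable names here are invented for illustration). With more explanatory variables than cases the coefficients are not identifiable, and near-duplicate IVs make estimates unstable:

```python
import numpy as np

rng = np.random.default_rng(0)

# More IVs than cases: 5 observations, 8 explanatory variables.
# The design matrix cannot have full column rank, so infinitely many
# coefficient vectors fit the data equally well.
X = rng.normal(size=(5, 8))
print(np.linalg.matrix_rank(X))  # -> 5, not 8

# Multicollinearity: x2 is almost a copy of x1, so the individual
# coefficients on x1 and x2 are essentially arbitrary.
x1 = rng.normal(size=100)
x2 = x1 + rng.normal(scale=0.01, size=100)
y = x1 + rng.normal(size=100)
design = np.column_stack([np.ones(100), x1, x2])
beta, *_ = np.linalg.lstsq(design, y, rcond=None)
print(beta[1], beta[2])  # the effect splits unpredictably between the two IVs
```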

  3. Random Selection vs. Intentional Selection: The Danger of Bias
  • Large-n research: universe of cases, random selection
  • Small-n research: intentional selection
  • Random selection and its limits:
    • Powerful: the selection is automatically uncorrelated with all variables (as in controlled experiments, which combine random selection with treatment/explanatory variables)
    • But you need to know the universe of cases! (see the sketch below)
  • In qualitative research, selection bias is more often present!
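
Random selection presupposes that the universe of cases can be enumerated. A minimal sketch, with hypothetical case names:

```python
import random

# The full universe must be known before we can sample from it.
universe = [f"case_{i:03d}" for i in range(200)]

random.seed(6)
sample = random.sample(universe, k=20)  # every case equally likely
print(sample[:5])

# Because inclusion is random, it is uncorrelated (in expectation)
# with every attribute of the cases, observed or unobserved.
```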

  4. Selection Bias
  • Interviewing civil servants: which ones? (e.g., the snowball technique in elite interviews)
  • A comparative study on wars: which wars?
  • The performance of IOs: which IOs?
  • Example (KKV): US investment in developing countries as the prime cause of internal violence
    • Selection: nations with major US investments and a great deal of internal violence; nations without major US investments and no internal violence (simulated below)
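
A hedged simulation of KKV's point, with synthetic data: even when investment and violence are truly unrelated, deliberately selecting only the confirming corner cases manufactures a strong correlation:

```python
import numpy as np

rng = np.random.default_rng(2)
investment = rng.normal(size=1000)
violence = rng.normal(size=1000)  # independent of investment by design
print(np.corrcoef(investment, violence)[0, 1])  # ~ 0 in the full universe

# Select only high-investment/high-violence and
# low-investment/low-violence nations, as in the flawed design:
picked = ((investment > 1) & (violence > 1)) | \
         ((investment < -1) & (violence < -1))
print(np.corrcoef(investment[picked], violence[picked])[0, 1])  # strongly positive
```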

  5. Selection on the Dependent Variable
  • Key rule: selection should allow for some variation on the DV
  • Variation could be truncated: we observe only a limited part of the variance on the DV that exists in the real world
  • Example: effect of the number of accounting courses (IV) on salary (DV); see KKV, Figure 4.1, and the simulation below
  • Other examples:
    • Why do wars occur? Selecting only wars!
    • What explains trade disputes?
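
A small simulation of the accounting-courses example (synthetic data; the true slope of 2.0 is an assumption of the sketch, not a figure from KKV). Truncating the sample to high salaries attenuates the estimated effect:

```python
import numpy as np

rng = np.random.default_rng(1)
courses = rng.integers(0, 10, size=500)               # IV
salary = 30 + 2.0 * courses + rng.normal(0, 5, 500)   # DV, true slope 2.0

def ols_slope(x, y):
    return np.polyfit(x, y, 1)[0]

print(ols_slope(courses, salary))              # ~ 2.0 on the full sample

keep = salary > 45                             # selection on the DV
print(ols_slope(courses[keep], salary[keep]))  # visibly attenuated
```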

  6. Various relationships (Geddes) [figure]

  7. Various relationships (Geddes) [figure]

  8. Various relationships (Geddes) [figure]

  9. Various relationships (Geddes) [figure]

  10. Selection on the Dependent Variable
  • Geddes's examples:
    • Labor repression has caused high growth rates
    • Skocpol's States and Social Revolutions: the French, Russian and Chinese revolutions (with contrasting cases: Prussia, Japan)
  • "…an assessment of the argument based on a few cases selected from the other end of the DV carries less weight than would a test based on more cases selected without reference to the DV…"

  11. Selection on the Explanatory Variable
  • No inference problem
  • We may limit the generality of our conclusions, or the explanatory variable may turn out to have no effect on the DV, but we introduce no bias…
  • Alternative research strategy: controlling for an important IV (to focus on other explanatory variables)

  12. Measurement Issues in Case Studies (George and Bennett)
  • Weighting explanations / IVs
  • Value of primary sources (consider the original purpose of documents)
  • Bias of secondary sources
  • Overestimation of the rationality of decision-makers
  • Interpreting unobserved processes (e.g., which data from below was taken up by the top level?)
  • Double-check sources (triangulation)

  13. Watch Out For!
  • Validity of measurements: do we measure what we think we are measuring? (e.g., survey questions)
  • Reliability: applying the same method will yield the same results (e.g., survey questions)
  • Can results be replicated? This applies not only to whether measures (data) are reliable but to the entire "reasoning process" that leads to conclusions

  14. Replication
  • Differences between quantitative and qualitative research:
    • Quantitative: data and applied methods (e.g., regression analysis)
    • Qualitative: sources, secondary literature, direct observation (more difficult: impressions and the weighting of factors)
  • Ensure access to material for future researchers (data, unpublished/private records)
  • Use coding sheets / coding rules (see the intercoder-agreement sketch below)
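
One way to make coding replicable is to quantify how far two coders, following the same coding rules, agree. A self-contained sketch of Cohen's kappa with made-up codings (the regime labels are hypothetical):

```python
from collections import Counter

coder_a = ["dem", "aut", "dem", "dem", "aut", "dem", "aut", "dem"]
coder_b = ["dem", "aut", "aut", "dem", "aut", "dem", "aut", "aut"]

n = len(coder_a)
observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
counts_a, counts_b = Counter(coder_a), Counter(coder_b)
expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / n**2
kappa = (observed - expected) / (1 - expected)
print(round(kappa, 2))  # agreement beyond chance; ~0.53 here
```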

  15. Measuring Political Democracy
  • Bowman et al. 2005 on "data-induced measurement errors"
  • The challenge of measuring:
    • Conceptualization
    • Operationalization through the construction of measures
    • Aggregation of measures

  16. Measuring Political Democracy
  • Inaccurate, partial or misleading secondary sources are a threat to validity!
  • Remedy: use area experts in the coding
  • Example: coding of Central American countries in
    • Gasiorowski 1996
    • Polity IV 2002
    • Vanhanen 2000
  • Weak correlation among the Central American cases! A validity issue: the coders measure different things! (illustrated below)
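
A hedged illustration of the diagnosis (the scores below are invented for six unnamed countries, NOT the real Gasiorowski / Polity IV / Vanhanen data): pairwise correlations reveal whether different indices measure the same underlying concept.

```python
import numpy as np

# Hypothetical democracy scores; low pairwise r signals a validity problem.
scores = {
    "Gasiorowski": [0.9, 0.4, 0.7, 0.2, 0.8, 0.5],
    "Polity IV":   [0.8, 0.6, 0.3, 0.4, 0.9, 0.2],
    "Vanhanen":    [0.3, 0.7, 0.5, 0.6, 0.4, 0.8],
}
names = list(scores)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        r = np.corrcoef(scores[a], scores[b])[0, 1]
        print(f"{a} vs {b}: r = {r:.2f}")
```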

  17. Measuring Political Democracy
  • New coding / index on five dimensions: broad political liberties, competitive elections, inclusive participation, civilian supremacy, national sovereignty
  • All elements are necessary conditions, so standard (additive) aggregation does not apply
  • Use of fuzzy-set rules: the "weakest score" sets the overall value (see the sketch below)
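
Because every dimension is a necessary condition, the fuzzy-set aggregation takes the minimum rather than the mean. A minimal sketch; the dimension names follow the slide, but the scores are invented:

```python
dimensions = {
    "broad political liberties": 0.8,
    "competitive elections":     0.9,
    "inclusive participation":   0.7,
    "civilian supremacy":        0.3,  # the weakest link
    "national sovereignty":      0.9,
}

overall = min(dimensions.values())  # fuzzy-set AND: the "weakest score"
print(overall)  # 0.3 -- a simple average (0.72) would overstate democracy
```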

  18. Summary: Skepticism!
  • Data / universe of data
  • Case selection & samples
  • Conceptualization
  • Operationalization and measurement
  • Causal logic (e.g., reversed causation)
  • Counterfactuals
  • Omitted variables

  19. And…
  • Seriously discuss the strengths and weaknesses of rival hypotheses
  • Report uncertainty!
