
The Scientific Study of Politics (POL 51)

Professor B. Jones

University of California, Davis

Today
  • Measurement/Observation
  • Asking Questions…Surveys
  • Ethical Issues in Research
At Long Last, Measurement
  • Data are clearly important
  • But how clear are your data?
  • Presumption: you know your y and your xᵢ
  • How are you going to measure these things?
  • The concept of operational definition
    • Defining concepts
    • Establishing the “rules” for measurement
  • Gives precise meaning to the particular concept under study
  • Put another way: an operational definition gives a precise, measurable meaning to a larger concept
  • Liberal? Conservative? Big concepts: we need a rule to establish measurement.
Operational Definition
  • Consider your current homework assignment.
  • That is: what were y and x, and what concepts underlay them?
  • The operational definition comes into play in understanding how we measure this concept.
  • How am I explicitly doing this in the assignment?
  • Other ways? Of course!
  • Research involves choices. Choices involve measurement.
The Concept of Levels of Measurement
  • Textbook definitions
    • Nominal
    • Ordinal
    • Interval
    • Ratio
  • Standard sort of stuff here.
  • The different types do have implications for analysis.
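A minimal sketch of the four textbook levels, using hypothetical data, may make the distinctions concrete:

```python
# Hypothetical examples of the four textbook levels of measurement.
party = ["Democrat", "Republican", "Independent"]    # nominal: labels, no order
support = ["strongly oppose", "moderately oppose",
           "moderately favor", "strongly favor"]     # ordinal: ordered, gaps undefined
temp_f = [32.0, 68.0, 104.0]                         # interval: equal gaps, no true zero
income = [0, 25_000, 50_000]                         # ratio: true zero, ratios meaningful

# The level constrains analysis: averaging party labels is meaningless,
# while "twice the income" is a perfectly sensible ratio-level statement.
mean_income = sum(income) / len(income)              # 25000.0
```

The point of the sketch is the comments, not the arithmetic: the same numeric operation is legitimate at one level and nonsense at another.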
The Jacoby Article
  • …which you must read!
  • Among his main arguments are…?
  • Measurement is inherently theory testing.
  • Why does he say this; what does this mean?
  • Consider the way we measure partisanship in this country (and how it’s used in many academic studies).
NES Partisanship Question


VCF0301: 7-pt Scale Party Identification

Generally speaking, do you usually think of yourself as a Republican, a Democrat, an Independent, or what? (IF REPUBLICAN OR DEMOCRAT) Would you call yourself a strong (REP/DEM) or a not very strong (REP/DEM)? (IF INDEPENDENT, OTHER [1966 AND LATER: OR NO PREFERENCE]:) Do you think of yourself as closer to the Republican or Democratic party?

1. Strong Democrat
2. Weak Democrat
3. Independent - Democrat
4. Independent - Independent
5. Independent - Republican
6. Weak Republican
7. Strong Republican
9. Apolitical (1966 only: and DK)
0. DK; NA; other; refused to answer; no Pre IW; Inap., question not used
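A hedged sketch of turning these raw codes into analysis-ready labels. Treating 9 (apolitical) and 0 (DK/NA/refused) as missing is one defensible choice among several; the response values below are hypothetical.

```python
# Recode the NES 7-point party ID item (VCF0301): codes 1-7 are substantive,
# 9 and 0 are set to None (missing) here -- a choice, not the only option.
LABELS = {
    1: "Strong Democrat", 2: "Weak Democrat", 3: "Independent - Democrat",
    4: "Independent - Independent", 5: "Independent - Republican",
    6: "Weak Republican", 7: "Strong Republican",
}

def recode_pid(code: int):
    """Return the label for a substantive response, None for 9/0/other."""
    return LABELS.get(code)

responses = [1, 4, 7, 9, 0]          # hypothetical raw codes
clean = [recode_pid(r) for r in responses]
# clean -> ['Strong Democrat', 'Independent - Independent',
#           'Strong Republican', None, None]
```

Note that the recode itself embeds the ordering assumption discussed on the next slide: positions 1 through 7 are taken to run from most Democratic to most Republican.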

Assumptions Made By Measurement?
  • There exists an ordering
  • Symmetry?
  • Independents are in between Democrats and Republicans
  • Do you believe this?
  • The issue with “using” this indicator as a measure for partisanship.
  • We have made explicit assumptions about how the world exists through our very measure.
Jacoby Argument
  • “Measurement levels … usually regarded as fixed characteristics of data.”
  • They may not be.
  • Meanings of scales may change.
  • Analysis can get very complicated!
  • But let’s return to measurement and operationally defining variables.
A Pedagogical Illustration
  • Evaluation of Illegal Immigrants = f(Belief in Equality)
  • How would you measure this?
    • What’s the proper unit of analysis?
  • Let’s work on y
    • Any Ideas?
    • Survey?
    • Behavioral Indicators?
  • Suppose we agree on a survey?
    • What kind of question(s) are you going to ask?
  • What you ask constitutes your operational definition of the “moving” parts.
Some Survey Questions
  • Do you like or dislike illegal immigrants?
    • Binary Variable (1=“like”, 0=“dislike”)
    • Nominally measured (it’s a category)
    • Problems? Issues?
  • “Think about illegal immigrants. On a scale from 0, denoting the ‘coolest’ rating, to 100, denoting the ‘warmest’ rating, where do you rate this group?”
    • “Feeling Thermometer”
    • What is the scale? (Ordinal? Interval? Ratio?)
    • It’s in a gray area, in practice.
    • Problems? Issues?
Some More Survey Questions
  • “We hear a lot of talk these days about amnesty for illegal immigrants. How about you? Would you support or oppose an amnesty program for illegal immigrants?”
    • “Root Question”
  • If Oppose, how strongly?
    • Strongly oppose or moderately oppose?
    • “Branch Question” (Strength follow-up)
  • If Favor, how strongly?
    • Strongly favor or moderately favor?
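The root-plus-branch design combines into a single scale. A hedged sketch, with illustrative (not exact) codes:

```python
# Combine the root question (oppose/favor) with the strength branch
# (strongly/moderately) into one 4-point scale. Codes are illustrative.
def amnesty_scale(root: str, strength: str) -> int:
    """Map (root, branch) responses onto the 1-4 scale."""
    table = {
        ("oppose", "strongly"): 1,
        ("oppose", "moderately"): 2,
        ("favor", "moderately"): 3,
        ("favor", "strongly"): 4,
    }
    return table[(root, strength)]

score = amnesty_scale("favor", "strongly")   # -> 4
```

Branching keeps each question simple for the respondent while still yielding an ordinal measure at the end.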
Survey Questions
  • End Result: 4-point scale
    • 1. Strongly oppose
    • 2. Moderately oppose
    • 3. Moderately favor
    • 4. Strongly favor
  • Likert-type scale
    • True Likert scales are agree/disagree format
    • All are “semantically balanced.”
    • Ordinally measured
    • Problems? Issues with this one?
  • Other ways to ask about attitudes?
  • Guttman Scales
    • Abortion: Parental Notification → Required Counseling
  • Clearly many ways to measure y
  • A good measure is a “valid one.”
    • Validity: does it measure what it is supposed to measure?
    • This is the essence of validity
    • Think about my Likert-type scale
    • The other questions?
  • Many kinds of validity
    • Face Validity (…seems valid)
    • Content Validity (full domain of concept measured)
    • External Validity
    • Internal Validity (Experiments)
  • Clearly An Important Issue!!
Valid Measures? VALIDITY
  • Another question: “Do you think illegal immigrants, people who clearly take away jobs from Americans, are bad or good for the economy?”
  • Do you support measures for sustainable energy or support policies that continue trends in global warming?
  • We know full well that subtle changes in question wording can induce substantial changes in responses.
  • Another hallmark of good measurement is reliability
    • Does the measurement procedure produce consistent results over repeated trials?
      • Surveys: you would be wary of a question that exhibited high variability over a short period of time.
      • Bathroom Scale: what does reliability mean here?
  • Implications of unreliable data?
  • What produces unreliable data?
    • Bad research designs
    • Unethical Designs
    • Designs to make a predetermined point.
  • Poor Samples.
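One common way to quantify reliability is test-retest correlation: ask the same respondents the same question twice and correlate the answers. A hedged sketch with hypothetical data:

```python
# Test-retest reliability: correlation between two administrations of the
# same item to the same (hypothetical) respondents.
import statistics

def pearson_r(x, y):
    """Pearson correlation between two equal-length sequences."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

wave1 = [1, 2, 2, 3, 4, 4, 5]   # hypothetical 5-point responses, time 1
wave2 = [1, 2, 3, 3, 4, 5, 5]   # same respondents, two weeks later
r = pearson_r(wave1, wave2)     # high r suggests a reliable item
```

For the bathroom-scale analogy: step on it five times in a row and reliability is low variance across those readings, even if the scale is consistently two pounds off (reliable but not valid).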
Reliability and Validity
  • Not just “academic” concepts
  • Governmental Statistics
    • What if policy hangs on invalid or unreliable data?
    • Sometimes in politics, individuals have incentives to deceive.
  • Again, good design matters!
“Generating” Data
  • You’ve settled on a measure, now you’ve got to collect (generate, produce, etc.) the data.
  • Variety of Ways
    • Direct Observation: watch and record
      • Richard Fenno’s Home Style
        • “soak and poke”
      • Structured vs. Unstructured (a la Fenno)
    • Participant Observation
      • Advantages? Disadvantages?
    • Overt vs. Covert Observation
      • Issues?
The Hawthorne “Effect”
  • An issue with direct, overt observation.
  • The Problem: people know they’re being observed.
  • The general problem: people may adjust their behavior simply because they’re being observed.
  • The original study is much in doubt as is “the effect.”
  • However…
The Hawthorne “Effect” in Surveys
  • Granberg and Holmberg (1992)
  • Face-to-Face Interviewing
    • Sort of like “direct observation.”
    • Interviewer Effects
    • Social Desirability
  • People may temper/alter attitudes simply because they know their responses are recorded
  • How might you mitigate this problem?
Social Desirability Effects in Surveys
  • The “Bradley Effect”
  • What is it?
    • The theory of the Bradley effect is that inaccurate polls are skewed by social desirability bias: some white voters give inaccurate polling responses for fear that, by stating their true preference, they will open themselves to criticism of racial motivation. The reluctance to give accurate answers has sometimes extended to post-election exit polls as well. The race of the pollster conducting the interview may factor into voters’ answers.
  • Possible Implications? Evidence?
  • Projection Effects
  • Contrast Effects
Implications for Measurement?
  • How much can we trust the polls?
  • The “measure”?
  • Always look at the margins of error.
  • Bad data?
    • Push polls
    • Polls where the answer is already known
    • We hear about ethics in campaigns…
    • What about research?
    • What are the concerns?
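The familiar "plus or minus 3 points" is the 95% margin of error for a sample proportion. A hedged sketch, assuming simple random sampling (real polls often require design corrections):

```python
# 95% margin of error for a poll proportion under simple random sampling.
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Half-width of the 95% confidence interval for proportion p with n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

moe = margin_of_error(0.50, 1000)   # worst case is p = 0.5
# roughly 0.031, i.e. about +/- 3 percentage points
```

Note the diminishing returns: quadrupling the sample size only halves the margin of error, which is why most national polls stop around 1,000 to 1,500 respondents.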
Ethics and Measurement
  • There is a trade-off between what we might want to do, and what we can do.
  • Ethical considerations are paramount here.
  • Studies horribly conceived:
    • The Tuskegee Syphilis Experiment
  • National Research Act (1974)
  • Belmont Report (1979)
  • What about social sciences?
Observation and Ethics
  • Observation is potentially intrusive.
  • Observation has many serious pitfalls
    • Privacy concerns (think about Internet monitoring and I.P. tracking)
    • Safety concerns (in a variety of forms)
    • Mental or physical health issues
  • Institutional Review Boards
    • What are they?
    • Should you, as a potential student researcher, be concerned? (Yes)
Milgram Experiment (1963)
  • “Obedience to Authority,” S. Milgram
  • Nature and intent of study
  • Findings (quote from the 1974 article, “The Perils of Obedience”):
    • The legal and philosophic aspects of obedience are of enormous importance, but they say very little about how most people behave in concrete situations. I set up a simple experiment at Yale University to test how much pain an ordinary citizen would inflict on another person simply because he was ordered to by an experimental scientist. Stark authority was pitted against the subjects' [participants'] strongest moral imperatives against hurting others, and, with the subjects' [participants'] ears ringing with the screams of the victims, authority won more often than not. The extreme willingness of adults to go to almost any lengths on the command of an authority constitutes the chief finding of the study and the fact most urgently demanding explanation.
  • Ethical Issues

Image from Wikimedia Commons

Stanford Prison Experiments
  • P. Zimbardo, 1971
  • A Brief Overview
    • Go to for more details (warning: content may be disturbing/offensive to some).
  • Ethical Issues
  • These two examples are obviously extreme cases.
Other Ethical Issues
  • Inaccurate/False Measurement
    • Not always detectable or reported
    • Original Hawthorne Study
  • Conflicts of Interest
    • Can you objectively study a drug’s effectiveness if you’re funded by the pharmaceutical company that makes the drug?
  • Researcher Bias
    • In politics, this can become a real issue.
  • Proper Attribution
    • Give credit where credit is due.
  • The choice of research question itself can raise ethical issues.