Identifying Measures and Measurement Strategy

  1. Identifying Measures and Measurement Strategy Research Methods for Public Administrators Dr. Gail Johnson Dr. Johnson, www.researchdemystified.org

  2. Steps in the Research Process Planning 1. Determining Your Questions 2. Identifying Your Measures and Measurement Strategy 3. Selecting a Research Design 4. Developing Your Data Collection Strategy 5. Identifying Your Analysis Strategy 6. Reviewing and Testing Your Plan

  3. Developing the Measurement Strategy • What specifically do you want to know? • How will you know it? • What data do you need?

  4. Common Measures • Frequencies, percentages • Means, medians, modes • Dollars • Percent change • Rates, ratios • Comparisons
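The common measures above are all simple arithmetic. A minimal sketch with made-up numbers (the salary figures, budget amounts, and birth counts below are purely illustrative):

```python
from statistics import mean, median, mode

salaries = [42_000, 45_000, 45_000, 51_000, 67_000]  # hypothetical survey data

avg = mean(salaries)       # mean: 50,000
mid = median(salaries)     # median: 45,000
typical = mode(salaries)   # mode: 45,000 (most frequent value)

# Percent change between two time periods
budget_2020, budget_2021 = 1_250_000, 1_400_000
pct_change = (budget_2021 - budget_2020) / budget_2020 * 100  # ≈ 12.0

# Rate: e.g., infant deaths per 1,000 live births
rate_per_1000 = 7 / 3_500 * 1_000  # ≈ 2.0
```

Note that the mean is pulled upward by the one high salary while the median is not, which is why both are usually reported.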

  5. Measuring Concepts • Defining a concept so that it can be counted. • Some are easier to quantify than others: • Easier: household income, average weight, highest level of education • Harder: quality, power, success, social class, intelligence, teaching methods, poverty

  6. Developing a Measurement Strategy • Conceptual definition: • Key terms • Boundaries • Who, time frame, geographic locations • Operational definition: • How it will be measured in numbers

  7. Defining Your Key Terms • What teaching methods are best for adults? • What do we mean by teaching methods? • What do we mean by best? • What do we mean by adults?

  8. Defining Your Terms • Defining your terms means obtaining agreement from the stakeholders about the nature of the question. • It also means translating vague words into specific meanings.

  9. Defining Your Key Terms Is the content and delivery of the MPA program appropriate to meet the demands of the 21st century? • What do we mean by content? • What do we mean by delivery? • What do we mean by the demands of the 21st century? • What do we mean by appropriate?

  10. Defining Your Key Terms Would everyone share your definitions? • Consider: who are the customers and stakeholders? • Would they have different definitions of content, delivery and demands?

  11. Operational Definition • Operationalizing your terms: • The operations that translate a concept, idea, or construct into a measurable phenomenon • The trick is to make concepts concrete

  12. Equilibrium Theory Tested • If there is a relationship between financial news stories and market activity, then we should see greater changes in market activity immediately following a major financial story. • The Dow and NASDAQ provide daily reports of activity, including shares traded and monetary value. Measuring “financial news” might be a little trickier, but presumably researchers could come up with some ways to operationalize it.
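One way to sketch this test, once “major financial story” has been operationalized: compare the average size of daily market moves on days that follow a major story with all other days. The daily changes and story flags below are invented solely to illustrate the comparison:

```python
# Hypothetical data: (percent change in the index, did a major
# financial story run the previous day?) -- illustrative numbers only.
days = [
    (1.8, True), (0.2, False), (2.3, True), (0.4, False),
    (1.5, True), (0.3, False), (0.1, False), (2.0, True),
]

after_story = [abs(chg) for chg, story in days if story]
other_days = [abs(chg) for chg, story in days if not story]

mean_after = sum(after_story) / len(after_story)  # ≈ 1.9
mean_other = sum(other_days) / len(other_days)    # ≈ 0.25

# The theory predicts greater market activity after a major story:
supports_theory = mean_after > mean_other
```

A real test would need a defensible operational definition of “major story” and a statistical comparison rather than a raw difference in means, but the structure of the measurement is the same.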

  13. Setting Boundaries: Setting the Scope of the Research • MPA Program: • All MPA programs? • All those in the U.S.? • Just the MPA program you are attending? • Do we really mean the 21st century? • Or do we really mean the next decade?

  14. Setting Boundaries • Another example: if we want to measure public service motivation • All employees of government at every level, or just those at the federal level? • Do we want a narrow geographic focus? In DC, big cities, small towns? • All ages? • Time frame: all employees, employees with at least 2 years of experience, or 10 years of experience? • Alternatively, do we want to look at those planning to enter the public service as compared to those planning to become business managers?

  15. Unit of Analysis: Another Way to Define Scope • The measures should match the concepts. • The jargon term is “unit of analysis”: • What factors are associated with national infant mortality rates? • Measures: national data • Mean income of countries, proportion of population with medical insurance, ratio of doctors to population, average distance between hospitals

  16. Unit of Analysis • In contrast: • What maternal characteristics are associated with infant mortality in New York City? • The unit of analysis is the mothers • Measures: medical history, pre-natal care, income, and insurance coverage of the mothers.

  17. Balance • Best versus Doable • Some questions are easier to answer • Some questions are controversial and measures will be scrutinized • Does global warming exist?

  18. Measures: Degrees of Difficulty • Did the teachers college reach its goal of having women be 1/3 of the students? • Measure: proportion • Data needed: gender breakdown of students • Difficulty? Not hard if they collected gender information

  19. Measures: Degrees of Difficulty • Did the college achieve its mix of theory-based courses and applied courses? • Operational definition: which one? • Proportion of courses, in-class hours, out-of-class hours (such as internships or apprenticeships), or credit hours? • While you have to make some choices here, this still can be measured quantitatively. • Difficulty: this is harder because the college would have to accurately collect more data • How do they define “theory-based” and “applied” courses?

  20. Measures: Degrees of Difficulty • Did the curriculum mix make a difference? • How do we define the concepts of “curriculum mix” and “make a difference”? • Without a clear conceptual definition of these terms, it is hard to develop operational definitions. • Defining “making a difference” might be more challenging than defining “curriculum mix.” • What’s your best guess at an operational definition?

  21. Measures: Degrees of Difficulty • If you are measuring a management training program: • It is easiest to measure the satisfaction of participants with the training at the end of the training. • It is harder to measure whether the participants actually applied the training to their work. • It is hardest to determine the return on investment (ROI) of the management training.

  22. Possible Measures for the Management Training Program • Measure participant satisfaction • Use a survey at the end of the program • Measure application of training • Perhaps observe managers’ behavior after training • Perhaps survey staff to see if they see a difference in managers’ behavior specific to what was covered in the training program • Measure ROI • Challenge: calculating the dollars saved as a result of the training skills that were applied
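Once the dollars saved have been estimated, the ROI arithmetic itself is straightforward. A sketch using the standard ROI formula, with purely hypothetical figures:

```python
def training_roi(dollars_saved, program_cost):
    """ROI as a percentage: net benefit relative to cost."""
    return (dollars_saved - program_cost) / program_cost * 100

# Hypothetical figures: $180,000 estimated savings from skills the
# managers applied, against $120,000 spent on the training program.
roi = training_roi(180_000, 120_000)  # 50.0 (percent)
```

The formula is the easy part; as the slide notes, the hard measurement problem is producing a defensible estimate of `dollars_saved` in the first place.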

  23. Is the MPA Program Successful? • First: decide on a conceptual definition of success: • Perhaps: the proportion of people who complete the program. • Then operationalize the conceptual definition: • Perhaps: at least 90% of entering students complete the program within 5 years of starting.

  24. Is the MPA Program Successful? • But there might be other definitions of success. • How would you define success?

  25. Key Measurement Issues: Valid, Reliable, Accurate • Are measures valid? • Do the measures measure what counts? • Not everything that can be counted actually matters • Are measures reliable? • Are measures precise? • Are measures accurate?

  26. Validity • A valid measure is a good measure of the concept • Reported crime is not the same thing as actual crime • The number of books in a library is not the same as measuring the quality of the library • The number of faculty publications is not the same as measuring teaching effectiveness • Key question: Does the measure actually measure what the researchers think it is measuring?

  27. Measuring What Counts • What are the most relevant and important measures? • Trap: measuring what is easy • Joke: searching for your lost keys under a lamppost because that is where the light is.

  28. Valid Measures? • How would you find out if graduates of the MPA program perform better than those who did not graduate? • One possible measure of performance can be obtained by asking graduates to rate their performance. • What are the advantages and disadvantages?

  29. Reliability • Stability of measures • They consistently measure the same thing in repeated tests • Use a rigid tape measure, not an elastic one

  30. Reliability • Measured in exactly the same way • Enables you to make comparisons • Examples: • Infant mortality rates • Poverty rates

  31. Constant Dollars: Standardizing Money for Comparison • Money may be worth less over time • Need to standardize money • Budgets over time: “constant dollars” are adjusted for inflation • Use a deflator calculator: http://cost.jsc.nasa.gov/inflateGDP.html • Another calculator (very cool): http://www.measuringworth.com/uscompare/

  32. Current vs Constant Dollars • Current dollars are not adjusted for inflation • Sometimes called nominal dollars • Constant dollars are adjusted for inflation; they reflect changes in purchasing power or the value of the dollar over time • Sometimes called real dollars • $5,000 in 1980 would be worth $10,752 in 2008 dollars • A car costing $8,000 in 1980 would cost $17,550 in 2009 dollars
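The conversion the calculators above perform is a single ratio: multiply the current-dollar amount by the price index for the target year and divide by the index for the base year. A sketch with hypothetical index values (real CPI or GDP deflator figures come from BLS or BEA, not from this example):

```python
def to_constant_dollars(amount, index_base_year, index_target_year):
    """Convert a current-dollar amount into the target year's dollars
    using a price index (e.g., CPI or a GDP deflator)."""
    return amount * index_target_year / index_base_year

# Hypothetical index values: 50.0 for the base year, 107.5 for the
# target year -- chosen only to make the arithmetic easy to follow.
converted = to_constant_dollars(5_000, 50.0, 107.5)  # 10750.0
```

With these made-up indices, $5,000 becomes $10,750 in target-year dollars, in the same spirit as the slide's 1980-to-2008 example.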

  33. Measuring Poverty in the U.S. • Operational Definition of Poverty • The “poverty line” was established in 1963 • It was based on the cost of purchasing a food plan for emergency situations • A family of 4 would spend $1,033 per year on that food plan • The poverty line was set at 3 times that amount • A family of 4 earning less than $3,100 per year would fall below the poverty line.

  34. How is Poverty Measured? • The poverty line has been adjusted to keep pace with inflation, but otherwise no other changes were made. • In 2008, a family of 4 earning less than $21,200 in the lower 48 states would be below the poverty line
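The operational definition on these two slides reduces to a formula: threshold = 3 × (annual cost of the food plan), then adjusted for inflation, and a family is "in poverty" if its income falls below the threshold. A sketch using the figures from the slides (the $19,500 test income is hypothetical):

```python
FOOD_PLAN_1963 = 1_033                  # annual cost, family of four
poverty_line_1963 = 3 * FOOD_PLAN_1963  # 3,099 -- roughly the $3,100 cited

def below_poverty_line(family_income, threshold):
    """The operational definition: income strictly below the threshold."""
    return family_income < threshold

# 2008 threshold for a family of four in the lower 48, from the slides.
flag = below_poverty_line(19_500, 21_200)  # True
```

Everything the later slides criticize (ignoring food stamps, regional housing costs, family composition) is a critique of this one formula's validity, not of its reliability.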

  35. Is the Poverty Measure Valid? • Critics of all political orientations say “no” • But their reasons vary: • It does not include other federal programs, like food stamps, free lunch programs, housing subsidies, etc. • It does not reflect the real costs of housing, transportation, and child care in many parts of the country • It does not take into account people who grow their own food • It does not take into account family composition: teenagers are likely to eat more than toddlers

  36. Change the Poverty Measure? • Debate has been raging since the mid-1990s. • Possible impacts of change: • If it is adjusted upwards to reflect the costs of housing in certain areas of the country, a greater proportion of people will fall below the poverty line. • If it is adjusted to take into account other subsidy programs, the percent falling below poverty is likely to be less.

  37. Change the Poverty Measure? • From a research perspective, a change would make measuring poverty over time difficult because the operational definition would be different • It would lose comparability • Changing the measuring stick means that the measure is unreliable

  38. Poverty Measure Paradox • We have a measure which is not considered to be a valid measure of poverty. • But this measure, flawed though it is, is reliable: it is measured in the same way every year (adjusted for inflation).

  39. Poverty Measure Paradox • We can track whether the poverty rate has gone up or down over time. • If the measure is changed to make it a more valid measure of poverty, it will lose its comparability. • What would you do?

  40. Accuracy • Have the measures been gathered accurately? • Have they been coded correctly? • Have they been entered into the computer accurately?

  41. Using Other People’s Data • When using data collected by someone else, make sure you know precisely what they measured and how they measured it. • You need to know their operational definitions • You need to know what they did to assure the reliability and accuracy of their data and database • You need to know what problems they encountered and the limitations of their data.

  42. Takeaway Lesson • Be wary of numbers which are not likely to be known: • Estimates of costs of teenaged births • Estimates of lives saved because of availability of guns • Estimates of the costs of illegal drug use • Estimates of proportion of carbon emissions from cattle ranches • Estimates of future savings in Medicare spending

  43. Takeaway Lesson • Be wary of numbers that change because of changes in reporting • The impact of community policing might initially appear to be more crime rather than less crime because: • People may feel more comfortable reporting crimes than they did in the past • Remember: an increase in reported crime is not necessarily an increase in actual crime.

  44. Takeaway Lessons • Good researchers always: • Tell you exactly how they developed their measures. • Interpret the data within the context of its limitations. • Sophisticated users should ask: • How were the measures defined and actually measured? • Are the measures biased in some way to get a particular result?

  45. Creative Commons • This PowerPoint is meant to be used and shared with attribution • Please provide feedback • If you make changes, please share freely and send me a copy of the changes: • Johnsong62@gmail.com • Visit www.creativecommons.org for more information
