
A Comparison of Survey Reports Obtained via Standard Questionnaire and Event History Calendar

This study compares survey reports obtained through standard questionnaire and event history calendar methods for the SIPP survey. Preliminary results show high agreement between the two methods for various characteristics and government programs. The study aims to assess data quality and explore any potential differences between the methods.


Presentation Transcript


  1. A Comparison of Survey Reports Obtained via Standard Questionnaire and Event History Calendar • Jeff Moore, Jason Fields, Joanne Pascale, Gary Benedetto, Martha Stinson, and Anna Chan • U.S. Census Bureau • American Association for Public Opinion Research • May 14-17, 2009

  2. Overview • Background: • - SIPP; SIPP “re-engineering” • - event history calendar (EHC) methods • Goals and Design of the SIPP-EHC Field Test • Preliminary Results • Summary / Conclusions / Next Steps

  3. SIPP • Survey of Income and Program Participation • - income/wealth/poverty in the U.S.; program participation dynamics/effects • - interviewer-administered; longitudinal • - panel length = 3-4 years • Key Design Feature: • - 3 interviews/year, 4-month reference period

  4. SIPP Re-Engineering • Implement Improvements to SIPP • - reduce costs • - reduce R burden • - improve processing system • - modernize instrument • - expand/enhance use of admin records • Key Design Change: • - annual interview, 12-month reference pd., event history calendar methods

  5. EHC Interviewing • Human Memory • - structured/organized • - links and associations • EHC Exploits Memory Structure • - links between to-be-recalled events • EHC Encourages Active Assistance to Rs • - flexible approach to help elicit an autobiographical “story”

  6. Evaluations of EHC Methods • Many EHC vs. “Q-List” Comparisons • - various methods • - in general: positive data quality results • BUT, Important Research Gaps • - data quality for need-based programs? • - extended reference period?

  7. Field Test Goals & Design • Basic Goal: • Can an annual EHC interview collect data of comparable quality to standard SIPP? • Basic Design: • EHC re-interview of SIPP sample households

  8. Design Details (1) • Sample: • SIPP 2004 panel interview cases • - reported on CY-2007 in waves 10-12 • EHC re-interview in 2008, about CY-2007

  9. Design Details (2) • SIPP Sample Cases in Two Sites • - Illinois (all) • - Texas (4 metro areas) • N = 1,096 Wave 10-11-12 Addresses • (cooperating wave 11 households) • IL: 487 • TX: 609

  10. Design Details (3) • EHC Questionnaire • - paper-and-pencil • - 12-month, CY-2007 reference period • - subset of SIPP topics (“domains”) • - month-level detail • Sample of Addresses, Not People • - post-interview clerical match to SIPP

  11. Design Details (4) • 1,096 initial sample addresses • Outcomes: • - 935 household interviews (91%) • - 1,922 individual EHC interviews (99%) • - 1,658 EHC Rs matched to SIPP (86%) • FINAL ANALYSIS SAMPLE: 1,620

  12. Primary Evaluation • Compare SIPP and EHC Survey Reports • - same people • - same time period • - same characteristics • Differences Suggest Data Quality Effects • (later: use administrative records for a more definitive data quality assessment)
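
The slides do not show the analysis files, but the comparison described here amounts to pairing each matched respondent's month-level SIPP and EHC reports for the same characteristic. A minimal sketch of that pairing, assuming hypothetical input files and column names (person_id, month, characteristic, reported), might look like this:

```python
# Illustrative sketch only -- the field test's actual files and variable names
# are not given in the slides. Assume month-level 0/1 indicators for the
# 1,620 matched respondents, one row per person x month x characteristic.
import pandas as pd

sipp = pd.read_csv("sipp_cy2007_monthly.csv")  # hypothetical: person_id, month, characteristic, reported
ehc = pd.read_csv("ehc_cy2007_monthly.csv")    # hypothetical: same layout, from the EHC re-interview

# Pair the two reports for the same person, month, and characteristic
paired = sipp.merge(
    ehc,
    on=["person_id", "month", "characteristic"],
    suffixes=("_sipp", "_ehc"),
)
paired["agree"] = paired["reported_sipp"] == paired["reported_ehc"]
paired.to_csv("paired_reports.csv", index=False)
```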

  13. Main Research Questions • Are responses to Qs about government programs and other characteristics affected by interview method (SIPP vs. EHC)? • Does the effect of interview method vary across calendar months (especially early in the year vs. late in the year)?
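
One way to operationalize these two questions is a model with a method indicator, calendar-month indicators, and their interaction: the method term captures an overall SIPP-vs.-EHC difference, and the interaction terms let that difference vary by month. The presentation does not specify the authors' model, so the following is only an illustrative sketch with hypothetical file and variable names; note that a plain logit ignores the within-person correlation that a GEE or cluster-robust errors would handle.

```python
# Illustrative sketch of the kind of test the research questions imply; the
# authors' actual analysis is not described in the slides. "foodstamps_long.csv"
# is a hypothetical file with one row per matched person x month x method.
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import chi2

long_df = pd.read_csv("foodstamps_long.csv")  # person_id, method ("SIPP"/"EHC"), month (1-12), reported (0/1)

# Full model: method main effect plus method x month interaction
full = smf.logit("reported ~ C(method) * C(month)", data=long_df).fit(disp=False)

# Reduced model: no interaction (method effect constant across months)
reduced = smf.logit("reported ~ C(method) + C(month)", data=long_df).fit(disp=False)

# Likelihood-ratio test: does the method difference vary by calendar month?
lr_stat = 2 * (full.llf - reduced.llf)
p_value = chi2.sf(lr_stat, df=11)  # 11 interaction terms for 12 months
print(full.summary())
print(f"Method x month interaction: LR = {lr_stat:.2f}, p = {p_value:.3f}")

# Note: person-months are treated as independent here; clustering by
# person_id would respect the repeated measures on the same respondent.
```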

  14. Preliminary Results • 3 Government “Welfare” Programs: • Food Stamps • Supplemental Security Income (SSI) • Women, Infants, and Children (WIC) • 4 Other Characteristics: • Medicare • Social Security • employment • school enrollment

  15. Results in Context • Almost All SIPP and EHC Reports Agree • - all characteristics, all months • - in general: 97-98% likelihood that a respondent’s SIPP and EHC reports will agree • - worst case (employment): 92-94% • Disagreements are RARE EVENTS
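
For reference, the agreement figures quoted above are simply the share of matched person-months on which the SIPP and EHC reports coincide. A minimal sketch, again with hypothetical file and column names (reusing the paired file from the earlier sketch):

```python
# Minimal sketch of the agreement metric above, using the hypothetical
# paired file built earlier (SIPP and EHC reports side by side).
import pandas as pd

paired = pd.read_csv("paired_reports.csv")  # reported_sipp, reported_ehc, characteristic, month
paired["agree"] = paired["reported_sipp"] == paired["reported_ehc"]

# Share of matched person-months on which the two methods agree
by_characteristic = paired.groupby("characteristic")["agree"].mean()
print(by_characteristic)  # slides report roughly 0.97-0.98 overall, ~0.92-0.94 for employment
```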

  16. Results Summary • 3 Patterns: • 1. EHC = SIPP All Year: equivalent data quality

  17. SSI -- % Participation for Each Month of CY2007 According to the SIPP and EHC Reports • Analysis Summary: • - no “main effect” for method • - no significant method difference in any month

  18. WIC (Illinois Only) -- % Participation for Each Month of CY2007 According to the SIPP and EHC Reports • Analysis Summary: • - no “main effect” for method • - no significant method difference in any month

  19. Results Summary • 3 Patterns: • 1. EHC = SIPP All Year: SSI; WIC (IL) • 2. EHC < SIPP All Year: reduced EHC data quality, but not due to longer recall period

  20. MEDICARE -- % Covered in Each Month of CY2007 According to the SIPP and EHC Reports • Analysis Summary: • - significant “main effect” for method • - method difference (SIPP > EHC) is constant across months

  21. SOCIAL SECURITY -- % Covered in Each Month of CY2007 According to the SIPP and EHC Reports • Analysis Summary: • - significant “main effect” for method • - method difference (SIPP > EHC) is constant across months

  22. WIC (Texas Only) -- % Participation for Each Month of CY2007 According to the SIPP and EHC Reports • Analysis Summary: • - significant “main effect” for method • - method difference (SIPP > EHC) is constant across months

  23. FOOD STAMPS (Illinois Only) -- % Participation for Each Month of CY2007 According to the SIPP and EHC Reports • Analysis Summary: • - significant “main effect” for method • - method difference (SIPP > EHC) is essentially constant across months

  24. Results Summary • 3 Patterns: • 1. EHC = SIPP All Year: SSI; WIC (IL) • 2. EHC < SIPP All Year: Medicare; Social Security; WIC (TX); Food Stamps (IL) • 3. EHC < SIPP, Early in the Year Only: EHC data quality may suffer due to longer recall period

  25. FOOD STAMPS (Texas Only) -- % Participation for Each Month of CY2007 According to the SIPP and EHC Reports • Analysis Summary: • - no significant “main effect” for method • - BUT significant variation by month: • -- JAN-MAY: SIPP > EHC • -- later months: no difference (reversal?)

  26. EMPLOYMENT -- % Working for Each Month of CY2007 According to the SIPP and EHC Reports • Analysis Summary: • - significant “main effect” for method (SIPP > EHC) • - BUT significant variation by month: • -- JAN-AUG (SEP): SIPP > EHC • -- later months: no difference

  27. SCHOOL ENROLLMENT -- % Enrolled in Each Month of CY2007 According to the SIPP and EHC Reports • Analysis Summary: • - no significant “main effect” for method • - BUT significant variation by month: • -- JAN-APR: SIPP > EHC • -- JUN-JUL: SIPP < EHC • -- AUG-DEC: no difference

  28. Field Test Overall Summary • Successful “Proof of Concept” • Overwhelming Finding: SIPP-EHC Agreement • Valuable Lessons to Inform Next Test: • - larger, broader sample • - “correct” timing of field period • - automated questionnaire • Specific Data Comparisons are Instructive

  29. Results Implications • Pattern 1. EHC = SIPP All Year: SSI; WIC (IL) • No evident problems; no reason for concern about data quality in a 12-month EHC interview

  30. Results Implications • Pattern 2. EHC < SIPP All Year: Medicare; Social Security; WIC (TX); Food Stamps (IL) • Problems with data quality in the EHC treatment, but probably not due to recall length: • - less effective screening questions (no D.I.; fewer probes; no local labels) • - different definitions • Likely fixes in CAPI

  31. Results Implications • Pattern 3. EHC < SIPP, Early in the Year Only: Food Stamps (TX); employment; school enrollment • Most cause for concern; longer recall period may cause reduced data quality in the earlier months of the year • Additional research: • - why these characteristics? • - understand Field Test time lag effects
