
The SIPP Event History Calendar Field Test: Analysis Plans and Preliminary Report

Jeff Moore, Statistical Research Division, U.S. Census Bureau
Jason Fields, Housing and Household Economic Statistics Division, U.S. Census Bureau
ASA/SRM SIPP Working Group Meeting, September 16, 2008


Presentation Transcript


  1. The SIPP Event History Calendar Field Test: Analysis Plans and Preliminary Report • Jeff Moore • Statistical Research Division, U.S. Census Bureau • Jason Fields • Housing and Household Economic Statistics Division, U.S. Census Bureau • ASA/SRM SIPP Working Group Meeting • September 16, 2008

  2. Overview • Background: • - SIPP “re-engineering” • - event history calendar (EHC) methods • Goals and Design of the EHC Field Test • Evaluation Plans • Preliminary Results [not yet available]

  3. SIPP Re-Engineering • Implement Improvements to SIPP • - reduce costs • - reduce burden • - improve processing system • - modernize instrument • - expand/enhance use of admin records • Key Design Change: Annual Interviewing

  4. EHC Interviewing (1) • Human Memory • - structured/organized • - links and associations • EHC Exploits Memory Structure • - links between to-be-recalled events • - coherence, consistency, sequence • EHC Encourages Active Assistance to Rs

  5. EHC Interviewing (2) • Evaluation: EHC vs. Q-List Comparisons • - various methods • - in general: positive data quality results • BUT, Important Research Gaps • - data quality for need-based programs? • - extended reference period?

  6. Field Test Goals & Design • Basic Goal: Can an EHC interview collect data of comparable (or better) quality than standard SIPP? • - month-level data • - one 12-month ref pd interview vs. three 4-month ref pd interviews • - especially for need-based programs • Basic Design: EHC re-interview of SIPP sample HHs

  7. Design Details (1) • Main Sample: • SIPP Wave 10-11-12 Interview Cases • - reported on CY-2007 via SIPP [Fig. 1] • Supplemental Sample: • SIPP Wave 8 Sample Cut Cases • - dropped from SIPP in 2006; “unprimed” • EHC Re-Interview in 2008, about CY-2007

  8. Design Details (2) • Two Sites • - Illinois (all) • - Texas (4 metro areas) • N = 1,945 Addresses • - cooperating HHs in SIPP • Sample Distribution: [table not transcribed]

  9. Design Details (3) • Administrative Records • (for some characteristics, and with R approval) • - Medicare • - Social Security retirement, disability • - SSI • - TANF • - Food Stamps • - [Medicaid?]

  10. Design Details (4) • EHC Questionnaire [handout] • - paper-and-pencil • - 12-month, CY-2007 reference period • - selected SIPP topics (“domains”) • - start with landmark events • - within domains, anchor on “now” • - month-level (at least) detail • Sample of Addresses, Not People • - post-interview clerical match to SIPP

  11. Design Details (5) • $40 Incentive, Non-Contingent • Same Response Rules as SIPP • - EHC interview for all adults (15+) • - self-response preferred (proxy permitted) • Field Staff: Census Bureau FRs • - most with some interview experience • - ~1/3 with SIPP experience • - 3-day training on EHC methods

  12. Design Details (6a) • Field Period: Mid-April thru Late June 2008 • Outcomes: • - 1,627 HH interviews • - 3,318 individual EHC interviews • - 2,747 EHC Rs matched to SIPP

  13. Design Details (6b) • [slide content not transcribed]

  14. Evaluation Plans (1) • Compare SIPP and EHC Survey Reports • - same people • - same time period • - same characteristics • Data Quality Comparison using Admin Records • (later) • Evaluation of “Priming” Bias

  15. Evaluation Plans (2) • Other Evaluations • - R debriefing form • - FR “case report” debriefing form • - FR debriefing focus groups • - interview observations • Focus on EHC Interview Process

  16. Compare SIPP/EHC Reports (1a) • 2x2 Consistency Table for “Participation” • (Employed? Enrolled? Insured? etc.) • - for each characteristic • - for each month of CY-2007 • - unweighted / unedited data

  17. Compare SIPP/EHC Reports (1b) • b=c → equivalent data quality • (high if (b+c)/N ≈ 0; low if (b+c)/N is large) • b>c → EHC “underreporting” (rel. to SIPP) • b<c → SIPP “underreporting” (rel. to EHC)
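The 2x2 table itself did not survive the transcript, so the sketch below assumes the conventional layout, with b = SIPP-yes/EHC-no and c = SIPP-no/EHC-yes, which is consistent with the slide's reading of b>c as EHC underreporting. The record layout ("sipp"/"ehc" lists of 12 monthly booleans per matched respondent) is a hypothetical stand-in for the field test's actual matched files.

```python
# Minimal sketch of the slide-16/17 monthly consistency tabulation.
def consistency_table(records, month):
    """Cross-tabulate SIPP vs. EHC participation reports for one month."""
    a = b = c = d = 0
    for r in records:
        s, e = r["sipp"][month], r["ehc"][month]
        if s and e:
            a += 1          # yes/yes
        elif s:
            b += 1          # SIPP yes, EHC no
        elif e:
            c += 1          # SIPP no, EHC yes
        else:
            d += 1          # no/no
    return a, b, c, d

# Made-up illustration data: two matched respondents.
records = [
    {"sipp": [True] * 12, "ehc": [True] * 6 + [False] * 6},
    {"sipp": [False] * 12, "ehc": [False] * 12},
]
for month in range(12):
    a, b, c, d = consistency_table(records, month)
    n = a + b + c + d
    print(f"month {month + 1}: b={b}, c={c}, (b+c)/N = {(b + c) / n:.2f}")
```

Running this month by month gives exactly the patterns slide 18 asks about: whether b>c (or b<c) holds for most months, and whether early and late months of CY-2007 differ.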

  18. Compare SIPP/EHC Reports (1c) • Patterns of Consistency/Inconsistency • - b>c for most months? b<c? mixed? • - early months vs. late months?

  19. Compare SIPP/EHC Reports (2a) • Total Reported Months of “Participation” • - by Qtr / combined Qtrs / whole year

  20. Compare SIPP/EHC Reports (2b) • Patterns of Off-Diag Clustering Across Time • - above for most Qtrs? below? mixed? • - early Qtrs vs. late Qtrs?

  21. Compare SIPP/EHC Reports (2c) • Patterns of Off-Diag Clustering Across Time • - above for most Qtrs? below? mixed? • - early Qtrs vs. late Qtrs? • # Reporting At Least 1 Month of Participation
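A sketch of the quarterly comparison in slides 19-21, reusing the hypothetical record layout from the previous sketch. "Above" and "below" here are the off-diagonal cells of an EHC-by-SIPP cross-tab of total reported months of participation.

```python
def quarterly_months(flags, quarter):
    """Total reported participation months in one CY-2007 quarter (0-3)."""
    return sum(flags[quarter * 3:(quarter + 1) * 3])

def off_diagonal(records, quarter):
    """Tally respondents whose EHC quarterly total exceeds, trails,
    or equals their SIPP quarterly total."""
    above = below = equal = 0
    for r in records:
        e = quarterly_months(r["ehc"], quarter)
        s = quarterly_months(r["sipp"], quarter)
        if e > s:
            above += 1      # EHC reports more months than SIPP
        elif e < s:
            below += 1
        else:
            equal += 1
    return above, below, equal

# Made-up illustration data.
records = [
    {"sipp": [True] * 12, "ehc": [True] * 6 + [False] * 6},
    {"sipp": [True] * 12, "ehc": [False] * 12},
]
for q in range(4):
    print(f"Q{q + 1} (above, below, equal):", off_diagonal(records, q))
```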

  22. Compare SIPP/EHC Reports (3) • Other “Participation” Comparisons: • - ANY need-based program participation? (by month / Qtr / combined Qtrs / year) • or • - ANY health insurance coverage [etc.] • - alignment/sequencing across domains (e.g., moves & jobs, employment & health insurance, etc.)

  23. Compare SIPP/EHC Reports (4a) • Month-to-Month Transitions (yes→no; no→yes) • SIPP’s Staggered Interview Design: • - each month-pair is a “seam” for ¼ sample • - each month-pair is off-seam for ¾ sample • Compare Reporting of Transitions

  24. Compare SIPP/EHC Reports (4b) • Seam Bias: • - too much Δ across interview “seams” • - too little Δ within a single interview • EHC Δ rates below SIPP’s (seam), and above SIPP’s (off-seam) → Improved Quality
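A sketch of the slide-23/24 transition counts. Each record carries a hypothetical "seams" set: month indexes m such that the (m, m+1) pair crosses one of that respondent's SIPP interview seams; which pairs those are depends on rotation group, per the staggered design.

```python
def transition_rates(records, key):
    """Rate of yes->no / no->yes changes at seam vs. off-seam month pairs."""
    seam_chg = seam_n = off_chg = off_n = 0
    for r in records:
        flags = r[key]
        for m in range(len(flags) - 1):
            changed = flags[m] != flags[m + 1]
            if m in r["seams"]:
                seam_n += 1
                seam_chg += changed
            else:
                off_n += 1
                off_chg += changed
    return (seam_chg / seam_n if seam_n else float("nan"),
            off_chg / off_n if off_n else float("nan"))

# Made-up illustration: SIPP report changes status exactly at a seam,
# EHC report changes one month later.
records = [{"sipp": [True] * 4 + [False] * 8,
            "ehc":  [True] * 5 + [False] * 7,
            "seams": {3, 7}}]
print("SIPP (seam, off-seam):", transition_rates(records, "sipp"))
print("EHC  (seam, off-seam):", transition_rates(records, "ehc"))
```

Under the slide's criterion, EHC Δ rates below SIPP's at seam pairs and above SIPP's at off-seam pairs would indicate improved quality.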

  25. Compare SIPP/EHC Reports (5a) • Income Amount Report Comparisons • - unemployment benefits • - disability income ($) • - workers’ comp • - Social Security ($) • - Medicare Part B deduction ($) • - TANF ($) • - Food Stamps ($) • - SSI ($) • ($)=admin records

  26. Compare SIPP/EHC Reports (5b) • $$ Comparison is Less Straightforward • Continuous $$ Variable → • - arbitrary definition(s) of “agreement” • - disagreements are directional • Limited to “Yes/Yes” Cases

  27. Compare SIPP/EHC Reports (5c) • $$ Reporting Comparisons • - mean amount (EHC; SIPP; difference) • - levels of correspondence • (e.g., ±5%; ±5-10%; ±10-25%; • ±25-50%; >±50%) • - direction of differences • ($EHC > $SIPP; $EHC=$SIPP (±1%); • $EHC < $SIPP) • - timing of amount changes
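A sketch of the slide-27 correspondence bands for "yes/yes" dollar reports. The cut-points come straight from the slide; the paired (EHC, SIPP) amounts are made-up illustrations.

```python
def band(ehc_amt, sipp_amt):
    """Classify the relative EHC-vs-SIPP difference into the slide's bands."""
    if sipp_amt == 0:
        return "undefined (SIPP amount is zero)"
    pct = abs(ehc_amt - sipp_amt) / sipp_amt * 100
    for cut, label in [(5, "within ±5%"), (10, "±5-10%"),
                       (25, "±10-25%"), (50, "±25-50%")]:
        if pct <= cut:
            return label
    return "beyond ±50%"

# Made-up amount pairs, plus the direction of the difference.
for ehc, sipp in [(1020.0, 1000.0), (800.0, 1000.0), (1600.0, 1000.0)]:
    direction = ">" if ehc > sipp else "<" if ehc < sipp else "="
    print(f"EHC ${ehc:,.0f} {direction} SIPP ${sipp:,.0f}: {band(ehc, sipp)}")
```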

  28. Assessment of “Priming” (6a) • W-10-11-12 Rs Provide CY-2007 Data Twice • - first SIPP, then EHC • Are Their EHC Reports Biased? • - e.g., more accurate EHC response • - could bias field test interpretation • Control Group: W-8 Sample Cut • - last SIPP response in Jun-Sep 2006 • - “unprimed” re: CY-2007 (not SIPP content)

  29. Assessment of “Priming” (6b) • Compare Distributions for Key Characteristics • - e.g., monthly “participation” reports • - weighted (sub-sampling; attrition) • Similarity of Profiles → Extent/Nature of Priming Bias • Admin Records for Some Characteristics • - meaning of distribution differences • - may also reveal hidden quality diffs
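A sketch of the slide-29 priming check: weighted monthly participation rates for the primed (Wave 10-11-12) vs. unprimed (Wave 8 cut) samples. The "group" and "weight" fields, like the rest of the layout, are hypothetical stand-ins for the field test's actual weighted files.

```python
def weighted_rate(records, month):
    """Weighted share reporting participation in the given month."""
    den = sum(r["weight"] for r in records)
    num = sum(r["weight"] for r in records if r["ehc"][month])
    return num / den if den else float("nan")

# Made-up illustration sample.
sample = [
    {"group": "primed",   "weight": 1.0, "ehc": [True] * 12},
    {"group": "primed",   "weight": 2.0, "ehc": [False] * 12},
    {"group": "unprimed", "weight": 1.5, "ehc": [True] * 6 + [False] * 6},
]
primed   = [r for r in sample if r["group"] == "primed"]
unprimed = [r for r in sample if r["group"] == "unprimed"]
for m in range(12):
    print(f"month {m + 1:2d}: primed {weighted_rate(primed, m):.2f}  "
          f"unprimed {weighted_rate(unprimed, m):.2f}")
```

Similar monthly profiles across the two groups would suggest little priming bias; where profiles diverge, the admin records can help say which group is closer to the truth.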

  30. Guidance, Questions, Advice… • Questions? • Thoughts/Comments...? • - on the evaluation approach? • - about additional analyses? • - about how to weigh evidence from the field test in deciding whether or not to adopt a 12-month EHC?

  31. Thank you very much! • jeffrey.c.moore@census.gov • 301-763-4975
