
Overview of 2010 EHC-CAPI Field Test and Objectives


Presentation Transcript


  1. Overview of 2010 EHC-CAPI Field Test and Objectives
     • Jason Fields
     • Housing and Household Economic Statistics Division
     • U.S. Census Bureau
     • Presentation to the ASA/SRM SIPP Working Group
     • November 17, 2009

  2. “Re-SIPP” Development
     • Following successful completion of the EHC Paper Field Test
     • Develop the 2010 plan to test an electronic EHC instrument
     • Broad involvement across the Census Bureau: DID, FLD, TMO, DSD, HHES, DSMD, SRD

  3. Primary Goals of 2010 Test
     (1) Strong evidence of comparable data quality
         - How well do the calendar-year 2009 data from the 2010 EHC-CAPI Field Test match data from the 2008 SIPP panel? (A minimal comparison sketch follows this slide.)
         - Especially for income transfer programs
     (2) Strong evidence to guide development and refinement before implementation in 2013 as the production SIPP instrument

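A minimal sketch of the kind of comparison slide 3 describes: compute monthly program-receipt rates from the EHC-CAPI test's calendar-year 2009 data and from the 2008 SIPP panel's overlapping reference months, then difference them. The column names and toy records are assumptions for illustration, not the actual Census Bureau file layout; a production comparison would use survey weights and standard errors.

```python
# Hypothetical comparison of monthly SNAP-receipt rates between the
# 2010 EHC-CAPI Field Test (12-month recall of calendar year 2009)
# and the 2008 SIPP panel. Columns and values are illustrative only.
import pandas as pd

# Toy person-month records: snap = 1 if receipt was reported that month.
ehc = pd.DataFrame({"month": [1, 1, 2, 2], "snap": [1, 0, 1, 1]})
sipp = pd.DataFrame({"month": [1, 1, 2, 2], "snap": [1, 1, 0, 1]})

comparison = pd.DataFrame({
    "ehc_capi": ehc.groupby("month")["snap"].mean(),
    "sipp_2008": sipp.groupby("month")["snap"].mean(),
})
comparison["difference"] = comparison["ehc_capi"] - comparison["sipp_2008"]
print(comparison)
```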

  4. Basic Design Features (1)
     • 8,000 sample addresses
         - could have been larger!
         - enough sample and budget to support research and field activities
     • “High Poverty” sample stratum
         - to evaluate how well income transfer program data are collected (an illustrative allocation sketch follows this slide)
     • State-based design
         - likely (possible?) access to administrative records

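To make the high-poverty stratum concrete, here is an illustrative two-stratum allocation with design weights. The 8,000-address total comes from the slide; the frame counts and the 40% oversampling share are assumptions, not the test's actual design parameters.

```python
# Illustrative two-stratum allocation with a high-poverty oversample.
# The 8,000-address total is from the slide; frame sizes and the 40%
# allocation share are assumed for illustration only.
TOTAL_SAMPLE = 8_000
frame = {"high_poverty": 200_000, "other": 1_800_000}  # assumed address counts

# Oversample the high-poverty stratum so transfer-program recipients
# appear often enough to evaluate program-data quality.
allocation = {"high_poverty": int(TOTAL_SAMPLE * 0.40),
              "other": int(TOTAL_SAMPLE * 0.60)}

# Base (design) weights undo the oversampling at estimation time.
for stratum, n in allocation.items():
    print(f"{stratum}: n={n}, base weight={frame[stratum] / n:.1f}")
```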

  5. Basic Design Features (2)
     • Field period: early January to mid-March 2010
         - collect data about calendar year 2009
     • Field Representative training in Dec/Jan
         - goal: minimize # of FRs with post-training “down-time”
         - evaluation and improvement of training
     • Use FRs with a wide range of experience
     • Expand RO involvement

  6. Research Agenda
     1. Quantify likely cost savings
     2. Test the data processing system
     3. Evaluate data quality
     4. Evaluate “field support” materials
     5. Evaluate FR training
     6. Identify & document instrument “bugs”
     7. Identify “interview process” issues
     8. Identify usability issues (esp. EHC)
     HOW CAN WE IMPROVE FOR 2013?

  7. Special Methods: 1. Quantify likely cost savings
     • new cost code(s) established
     • timing interview length
     • trade-off between 12-month recall (one interview per year) and 3 interviews per year (a back-of-the-envelope cost sketch follows this slide)

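A back-of-the-envelope sketch of the trade-off the slide names: one longer 12-month-recall interview per year versus three shorter interviews per year. Every figure below (per-contact cost, the interview-length factor) is a placeholder, not a measured field-test cost; the new cost codes and interview-timing data would supply the real inputs.

```python
# Placeholder arithmetic for the recall-length vs. contact-frequency
# trade-off. All figures are assumptions, not measured field costs.
SAMPLE_HOUSEHOLDS = 8_000   # sample size from the field-test design
COST_PER_CONTACT = 150.0    # assumed cost of one in-person visit
EHC_LENGTH_FACTOR = 1.5     # assumed: the annual EHC interview runs longer

traditional = 3 * COST_PER_CONTACT * SAMPLE_HOUSEHOLDS           # 3 visits/year
ehc_capi = 1 * COST_PER_CONTACT * EHC_LENGTH_FACTOR * SAMPLE_HOUSEHOLDS

print(f"traditional design: ${traditional:,.0f} per year")
print(f"EHC-CAPI design:    ${ehc_capi:,.0f} per year")
print(f"implied savings:    ${traditional - ehc_capi:,.0f} per year")
```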

  8. Special Methods: 2. Test the data processing system
     • The data collected in this test will be used to develop and test a new data processing system.

  9. Special Methods: 3. Evaluate data quality
     • administrative records (a linkage sketch follows this slide)

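A hypothetical sketch of the administrative-records check: link survey person-months to an agency extract on a shared key, then cross-tabulate reported against recorded receipt. The identifiers, fields, and values are invented for illustration; actual linkage would use protected keys and approved agency extracts.

```python
# Hypothetical survey-to-administrative-records agreement check.
# Keys, fields, and values are invented for illustration.
import pandas as pd

survey = pd.DataFrame({"pid": [1, 1, 2, 2], "month": [1, 2, 1, 2],
                       "reported_snap": [1, 1, 0, 1]})
admin = pd.DataFrame({"pid": [1, 1, 2, 2], "month": [1, 2, 1, 2],
                      "admin_snap": [1, 0, 0, 1]})

linked = survey.merge(admin, on=["pid", "month"])
# Rows = survey report, columns = administrative record.
print(pd.crosstab(linked["reported_snap"], linked["admin_snap"]))
```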