
Medicaid Underreporting in the CPS: Results from a Record Check Study

Joanne Pascale

Marc Roemer

Dean Resnick

US Census Bureau

DC-AAPOR

August 21, 2007



Medicaid Undercount

  • Records show higher Medicaid enrollment levels than survey estimates (~10-30%)

  • Undercount affects many different surveys of health insurance

  • Error sources other than under-reporting also contribute

  • Under-reporting is the largest contributor to the undercount



Current Population Survey

  • Focus is on under-reporting in CPS

    • Produces the most widely cited estimates on health insurance and the uninsured

    • Other surveys gauge estimates against CPS; mimic CPS design

  • CPS = monthly survey on labor force and poverty; health insurance questions asked in annual supplement


CPS Health Insurance Questions: ‘Type by Type’ Structure

  • Job-based

  • Directly-purchased

  • Someone outside HH

  • Medicare

  • Medicaid

  • SCHIP

  • Military

  • Other



CPS Health Insurance Questions: Calendar Year Reference Period

  • Survey is conducted in March

  • Questions ask about coverage during previous calendar year

  • “At any time during 2000, was anyone in this household covered by [plan type]?”



CPS Health Insurance Questions: Household-level Design

  • Multi-person household:

    • “At any time during 2000 was anyone in this household covered by [plan type]?”

    • [if yes] “Who was that?”

  • Single-person household:

    • “At any time during 2000 were you covered by [plan type]?”



CPS Cognitive Testing

  • Three main sources of misreporting:

    • Type-by-type structure: Rs ‘pre-report’ and try to ‘fit’ coverage in earliest question

    • 12-month reference period: some respondents focus on current coverage or ‘spell’

    • Household size and complexity



More on HH Size and Complexity

  • Rs forgot about certain HH members

  • Rs did not know enough detail about other HH members’ plan type

  • Neither problem was related to ‘closeness’ between R and referent; both affected housemates and distant relatives, but also parents, siblings, and live-in partners



Shared Coverage Hypothesis

  • Health insurance administered in ‘units’

    • Private and military coverage: nuclear family

    • Medicaid and ~SCHIP: parent and children

    • Medicare: individual

  • Any given HH may have a mix of units

  • E.g.: R on his union plan; mother on Medicare; sister and child on Medicaid; live-in partner and her child on her job plan

  • R may be able to report more accurately for HH members who are in the same unit (i.e., share the same coverage type)



Methods

  • Linked CPS survey and ‘MSIS’ record data for year 2000

  • Analysis Dataset: CPS sample members…

    • known to be on Medicaid according to records

    • for whom a direct response to ‘Medicaid’ was reported in CPS (not edited or imputed)

      • Several items fed into ‘Medicaid’ indicator (Medicaid, SCHIP, other government plan, other)

    • n = 19,345

  • Dependent var = whether Medicaid was reported for the known enrollees
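The linkage and filtering steps above can be sketched with a toy merge; all column names and values here are hypothetical stand-ins, since the real CPS and MSIS extracts use different layouts and a protected linkage identifier:

```python
import pandas as pd

# Hypothetical mini-extracts: "pik" stands in for the linkage identifier
cps = pd.DataFrame({
    "pik": [1, 2, 3, 4],                      # person-level survey records
    "medicaid_reported": [1, 0, 0, 1],        # Medicaid indicator from CPS items
    "response_flag": ["direct", "direct", "imputed", "direct"],
})
msis = pd.DataFrame({
    "pik": [1, 2, 3],                         # persons enrolled per MSIS records
    "enrolled_2000": [1, 1, 1],
})

# Keep CPS sample members known to be on Medicaid according to records
linked = cps.merge(msis, on="pik", how="inner")

# ...and restrict to direct (not edited or imputed) CPS responses
analysis = linked[linked["response_flag"] == "direct"]

# Dependent variable: was Medicaid reported for the known enrollee?
under_report_rate = 1 - analysis["medicaid_reported"].mean()
print(len(analysis), under_report_rate)
```

In the study itself this filtering yields n = 19,345 known enrollees with direct responses.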



Shared Coverage Variable

  • Referent (person reported on) is R (self-report):

    • in single-person HH

    • in multi-person HH

  • Referent is not R (proxy report):

    • both are on the same Medicaid case

    • both are on Medicaid (different cases)

    • referent is on Medicaid; R is not



Logistic Regression Model

  • Dependent var = Medicaid status reported in CPS

  • Independent vars:

  • HH composition

    • Shared coverage var

    • Another HH member had Medicaid w/in year

  • Recency and intensity of coverage

    • Most recent month referent enrolled

    • Proportion of days covered from January till last month enrolled

    • Referent covered in survey month

  • Referent received Medicaid services w/in year

  • Demographics

    • Sex of R

    • Age and race/ethnicity of referent
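As a sketch of the model setup, here is a minimal logistic regression fit by Newton-Raphson on synthetic data. The predictor names and simulated effect sizes are illustrative stand-ins only, not the study's variables or estimates:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Synthetic stand-ins for three of the paper's predictors
recency = rng.integers(0, 12, n)      # months since last enrolled
intensity = rng.uniform(0, 1, n)      # proportion of days covered
services = rng.integers(0, 2, n)      # received Medicaid services w/in year
X = np.column_stack([np.ones(n), recency, intensity, services])

# Simulate "Medicaid reported in CPS": more salient coverage -> more reporting
true_beta = np.array([-0.5, -0.15, 1.5, 0.8])
p = 1 / (1 + np.exp(-X @ true_beta))
y = (rng.uniform(size=n) < p).astype(float)

# Fit by Newton-Raphson (iteratively reweighted least squares)
beta = np.zeros(4)
for _ in range(25):
    mu = 1 / (1 + np.exp(-X @ beta))       # fitted probabilities
    grad = X.T @ (y - mu)                  # score
    hess = X.T @ (X * (mu * (1 - mu))[:, None])  # observed information
    beta += np.linalg.solve(hess, grad)

print(np.round(beta, 1))  # should land near true_beta
```

The study additionally ranks predictors by their contribution to model fit, which this sketch does not attempt.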



Results: Overview of Linked Dataset

  • Of 173,967 CPS hh members, 19,345 (11.1%) had Medicaid according to records

  • Medicaid was reported in CPS for only 12,351 (7.1%) hh members

  • => 36.2% under-reporting
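The headline percentages follow directly from these counts; a quick arithmetic check:

```python
# Reproducing the headline rates from the linked dataset
total_hh_members = 173_967
on_medicaid_per_records = 19_345
reported_in_cps = 12_351

print(round(100 * on_medicaid_per_records / total_hh_members, 1))  # 11.1
print(round(100 * reported_in_cps / total_hh_members, 1))          # 7.1

under_reporting = 1 - reported_in_cps / on_medicaid_per_records
print(round(100 * under_reporting, 1))                             # 36.2
```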



Results: Overall Regression

  • Model is highly significant in explaining misreporting

  • The effect of each variable is statistically significant and clearly discernible

  • Ranked each of the 9 independent vars according to its importance to the model



Ranking of Independent Vars

  1. Most recent month enrolled

  2. Proportion of days covered from January

  3. Received Medicaid services w/in year

  4. Race/ethnicity of referent

  5. Sex of respondent

  6. Another HH member had coverage w/in year

  7. Age of referent

  8. Covered in survey month

  9. Shared coverage var



Categorization of Independent Vars

  • Recency and intensity of coverage

    • Most recent month enrolled

    • Proportion of days covered from January till last month enrolled

    • Covered in survey month

  • Receipt of Medicaid services

    • Received services with/in year

  • Demographics

    • Race/ethnicity of referent (white non-Hispanic)

    • Sex of R

    • Age of referent

  • HH composition

    • Another HH member had coverage w/in year

    • Shared coverage var


Results: Shared Coverage Var

Expected ranking (most to least accurately reported):

A. Self-report in single-person HH

B. Self-report in multi-person HH

C. Proxy report, same case

D. Proxy report, different case

E. Proxy report; R does not have Medicaid

Actual ranking: A, C, D/B, D/B, E



Summary

  • Recency, intensity of coverage

  • Receipt of Medicaid services

  • Shared coverage

  • All contribute to the saliency of Medicaid to the respondent, which could translate to more accurate reporting

  • Rs in multi-person HHs forget to report their own coverage



Conclusions

  1. Key components of wording are problematic:

    • “At any time during calendar year…”

    • “…was anyone in this household covered…”

    • Explore questionnaire design alternatives

  2. Reporting accuracy goes up if R and referent both have Medicaid

    • Explore questionnaire designs to exploit this

    • See if results apply to other coverage types



Thoughts on Next Steps

  1. Reference period:

    • start with questions about current status

    • ask when that coverage began

    • ‘walk’ back in time to beginning of calendar year

  2. Other HH members and shared coverage:

    • start with R’s coverage

    • for each plan type reported, ask if other HH members are also covered

    • continue asking about other HH members by name



THANK YOU!!

  • [email protected]

  • [email protected]

  • [email protected]



Finding low-income telephone households and people who do not have health insurance using auxiliary sample frame information for a random digit dial survey

Tim Triplett, The Urban Institute

David Dutwin, ICR

Sharon Long, The Urban Institute

DC-AAPOR Seminar

August 21, 2007



Presentation Overview

Purpose: Obtain representative samples of adults without health insurance and adults in low-income (less than 300 percent of the federal poverty level (FPL)) and medium-income (between 300 and 500 percent FPL) families, while still being able to produce reliable estimates for the overall population.

Strategy: Telephone exchanges within Massachusetts were sorted in descending order by concentration of estimated household income. These exchanges were divided into three strata, and we oversampled the low- and middle-income strata.

Results: Oversampling the low- and medium-income strata did increase the number of interviews completed with adults without health insurance, as well as with adults living at or below 300 percent FPL.



About the Study

  • Telephone survey conducted in Massachusetts

  • Collect baseline data prior to implementation of the Massachusetts universal health care coverage plan

  • Started on October 16, 2006, ended on January 7, 2007

  • 3,010 interviews with adults 18 to 64

  • Key sub groups were low and middle income households and uninsured adults

  • Overall response rate: 49% (AAPOR RR3 formula)



Sample design features

  • RDD list+2 exchanges stratified by income and grouped into high, middle, and low income strata

  • Over-sampled the low-income stratum (n=1,381)

  • A separate screening sample was used to increase the sample of uninsured (n=704)

  • More aggressive over-sampling of the low-income stratum in the screening sample

  • One adult interviewed per household

  • In households with both insured and uninsured adults, the uninsured adults had a higher chance of selection

  • No cell phone exchanges were sampled
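The exchange-level stratification and oversampling can be sketched as follows; the exchange labels, income values, and release counts are all made up for illustration:

```python
import random

random.seed(7)

# Hypothetical exchange-level data: (exchange, estimated household income)
exchanges = [(f"617-{i:03d}", random.randint(25_000, 150_000)) for i in range(300)]

# Sort descending by estimated income and cut into three equal strata
exchanges.sort(key=lambda e: e[1], reverse=True)
third = len(exchanges) // 3
strata = {
    "high": exchanges[:third],
    "middle": exchanges[third:2 * third],
    "low": exchanges[2 * third:],
}

# Release sample at a 3:2:1 (low:middle:high) ratio, as in the main study
release_ratio = {"low": 3, "middle": 2, "high": 1}
base = 20  # exchanges released per ratio unit (illustrative)
sample = {s: random.sample(members, release_ratio[s] * base)
          for s, members in strata.items()}
print({s: len(v) for s, v in sample.items()})
```

The screener sample simply applies a steeper ratio (5:3:1) to the same strata.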



Percentage of uninsured and low-income adults by income strata



Alternate sampling strategies that could yield enough uninsured respondents without increasing survey costs

  • None – no oversampling of strata – simply increase the number of screening interviews

  • OS (2:2:1, 3:2:1) – release twice as much sample from the low and middle income strata in the main study and three times as much in the screener survey

  • OS (3:2:1, 5:3:1) – the strategy we used

  • OS (5:3:1, 5:3:1) – same for main and screener

  • OS (5:3:1, 8:4:1) – heavy oversample in screener



Simulation of sample sizes resulting from the various oversampling strategies


Why not go for the largest sample?

  • Design effects will increase as the sample becomes more clustered

  • Larger design effects mean smaller effective sample sizes

  • So when comparing different sampling strategies, you need to compare effective sample sizes

  • We can only calculate the design effect (and effective sample size) for the sample strategy we actually employed

  • Isolating the increase in the design effect due to the oversampling allows us to estimate the design effect for the other strategies
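The effective-sample-size logic reduces to n_eff = n / deff. A small sketch with illustrative numbers (not the study's actual values) shows why a heavier oversample can buy fewer effective cases:

```python
# Effective sample size under a design effect: n_eff = n / deff
def effective_n(n, deff):
    return n / deff

# More oversampling -> larger nominal n, but larger deff too
# (strategy names and numbers below are illustrative only)
strategies = {
    "no oversample":    (3000, 1.0),
    "moderate (3:2:1)": (3400, 1.25),
    "heavy (8:4:1)":    (3800, 1.7),
}
for name, (n, deff) in strategies.items():
    print(name, round(effective_n(n, deff)))
```

With these toy numbers the heavy oversample yields the largest nominal sample but the smallest effective one, which is exactly the trade-off the simulations weigh.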



Average Design Effects



Simulation of effective sample sizes under various oversampling rules taking into consideration design effects



Conclusions

  • Oversampling using exchange level information worked well

  • Higher oversampling rate for the screener sample may not have been the best strategy

  • Exchanges still cluster enough to use auxiliary information

  • Except for the design we actually used, these are simulated estimates



Sampling in the next round

  • Consider increasing (slightly) the oversampling rate for the main sample and decreasing (slightly) the rate for the screener sample or use the same rate

  • Need to sample cell phone exchanges

  • Health Insurance coverage likely to be higher

  • Conduct Portuguese interviews



Thank You

The survey was funded by the Blue Cross Blue Shield Foundation of Massachusetts, The Commonwealth Fund, and the Robert Wood Johnson Foundation. The analysis of the survey design was funded by the Urban Institute’s Statistical Methods Group.



Switching From Retrospective to Current Year Data Collection in the Medical Expenditure Panel Survey-Insurance Component (MEPS-IC)

Anne T. Kearney

U.S. Census Bureau

John P. Sommers

Agency for Healthcare Research and Quality



Important Terms

  • Retrospective Design: collects data for the year prior to the collection period

  • Current Year Design: collects data in effect at the time of collection

  • Survey Year: the year of data being collected in the field

  • Single Unit Establishment vs. Multi-Unit Establishment



Outline

  • Background on MEPS-IC

  • Why Switch to Current?/Barriers to Switching

  • Impact on Frame and Reweighting Methodology

  • Details of Current Year Trial Methods

  • Results

  • Summary


Background on MEPS-IC: General

  • Annual establishment survey that provides estimates of insurance availability and costs

  • Sample of 42,000 private establishments

  • National and state-level estimates

  • Retrospective design


Background on MEPS-IC: Timing Example

  • Suppose a retrospective design for survey year 2002:

    • Create frame/sample in March 2003 using 2001 data from the business register (BR)

    • Create SU birth frame with 2002 data from BR

    • In the field from roughly July-December 2003

    • Reweighting in March-April 2004 using 2002 data from the BR

    • Estimation and publication in May-June 2004



Why Switch to a Current Year Design?

  • Estimates published about 1 year sooner

  • Some establishments report current data already; current data is at their fingertips

  • Most survey estimates are conducive to current year design

  • Better coverage of businesses that closed after the survey year and before the field operation

  • Some data users in favor of going current



Barriers to Switching to a Current Year Design

  • One year older data for frame building

  • One year older data for reweighting

    • These could make our estimates substantially different, which we believe would mean worse

  • Other data users believe retrospective design is better for collecting certain items



Impact on Frame

Example: Let’s use the 2002 survey year again:


Impact on Reweighting: Nonresponse Adjustment

  • We use an iterative raking procedure

  • We do the NR adjustment using 3 sets of cells:

    • Sector Groups

    • SU/MU

    • State by Size Group
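A toy version of such an iterative raking adjustment (iterative proportional fitting): weights are repeatedly scaled so weighted totals hit known margins for each cell system in turn. For brevity this sketch uses two made-up cell systems and margin totals standing in for the survey's actual sector, SU/MU, and state-by-size cells:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
sector = rng.integers(0, 3, n)   # 3 sector groups (illustrative)
su_mu = rng.integers(0, 2, n)    # single-unit vs multi-unit
w = np.ones(n)                   # starting weights

# Target margins (illustrative frame totals; both sum to 500)
sector_targets = np.array([260.0, 140.0, 100.0])
su_mu_targets = np.array([300.0, 200.0])

for _ in range(50):  # rake until both margin systems are satisfied
    for g in range(3):
        mask = sector == g
        w[mask] *= sector_targets[g] / w[mask].sum()
    for g in range(2):
        mask = su_mu == g
        w[mask] *= su_mu_targets[g] / w[mask].sum()

print(round(w.sum(), 1))  # 500.0: weights now honor both margin systems
```

Because the two margin systems are consistent, the alternating adjustments converge; the same loop extends naturally to a third set of cells.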


Impact on Reweighting: Poststratification

  • We use an iterative raking procedure using 2 sets of cells:

    • State by Size Group and SU/MU

  • Under the retrospective design for the 2002 survey:



Details of Trial Methods

  • One issue for frame:

    • What to do with the births

  • One issue for nonresponse adjustment:

    • What employment data to use for cell assignments

  • Three issues for poststratification:

    • What employment data to use for cell assignments

    • What employment data to use for total employment

    • What payroll data to use to create the list of establishments for total employment


Details of Trial Methods: 2002 Survey

Results: Definitions

  • National level estimates

  • Estimates by firm size

    • Establishments categorized by their firm employment


Results: Survey Year 2002

* Indicates significant difference


Results: Survey Year 2003

* Indicates significant difference


Results: Survey Year 2004

* Indicates significant difference


Results: Survey Year 2005

* Indicates significant difference


Results: Survey Year 2002

* Indicates significant difference


Results: Survey Year 2003

* Indicates significant difference


Results: Survey Year 2004

* Indicates significant difference


Results: Survey Year 2005

* Indicates significant difference


Governments Sample: Need Survey Year Data

  • For the Governments Sample, we need to wait until survey year data is available:

    • we don’t collect employment from government units to use for our published employment estimates – we use data from the governments frame



Summary

  • Many positives to going current – timing in particular

  • Possible frame and reweighting problems, but prior year data are a good substitute

  • Tested 4 trial methods and found:

    • Estimates of premiums looked good and rates looked reasonable

    • Establishment and employment estimates differ, but these are not the most important estimates



Summary (cont.)

  • We are planning to switch to a current year design for survey year 2008 using a methodology similar to Method 5.

  • For the Governments Sample, we need to wait until survey year data is available:

    • we don’t collect government unit employment to use for employment totals



[email protected]

[email protected]


DC-AAPOR Discussant Notes

AAPOR/ICES Encore: Issues in Health Insurance

David Kashihara

Agency for Healthcare Research and Quality (AHRQ)

August 21, 2007



Issues in Health Insurance

  • Topic is at the forefront of American consciousness

  • Surveys of health are vital to both policy-makers and researchers

  • Improving these surveys should result in better policies and improved research


Medicaid Under-reporting: Pascale, Roemer & Resnick

  • The Problem:

    • Significant amount of Medicaid misreporting

      • 36.2% in the linked data set

    • Undercount probably present in other surveys


Medicaid Under-reporting: Pascale, Roemer & Resnick

  • Linking CPS records to MSIS:

    • Truth: MSIS records

    • Non-Truths?

      • MSIS “no” but CPS “yes” (over-reports)

      • Non-matching records (multiple state claims)

      • Duplicates – were removed in this study

      • How many? Impact?


Medicaid Under-reporting: Pascale, Roemer & Resnick

  • The Solution

  • Good use of survey methodology

    • Cognitive testing

    • Methods

    • Analysis

  • Confirmed the logical

    • Recency, intensity: salience plays big part

  • Found the not-so-logical

    • Rs in multi-person HHs sometimes forget to report their own coverage


Medicaid Under-reporting: Pascale, Roemer & Resnick

  • Question:

    • If the MSIS is the Truth, how good is the truth?

  • Important result:

    • Findings can hopefully help other surveys of health identify, reduce or adjust for this misreporting


Low Income, No Insurance HHs: Triplett, Dutwin & Long

  • Lack of health insurance in U.S. a hot topic

    • 13.7% of the U.S. non-institutionalized population under 65 (MEPS, 2004)

  • Low income & no insurance are related


Low Income, No Insurance HHs: Triplett, Dutwin & Long

  • Medical Expenditure Panel Survey (MEPS)

    • U.S., non-institutionalized, < 65 population

    • % of persons lacking health insurance: Jan. – Dec. 2004 by income level


Low Income, No Insurance HHs: Triplett, Dutwin & Long

  • More info about stratification of exchanges based on income

    • What was used to determine income level?

    • How accurate is this?

    • Are the clusters homogeneous? (yes)

  • No cell phone exchanges sampled

    • Cell only population

    • Increase or decrease # of uninsured?

      • My guess: increase # uninsured

      • ages 18–24 years: highest uninsured group under 65 years (22.5%)


Low Income, No Insurance HHs: Triplett, Dutwin & Long

  • Good use of design effects

    • The measure provides information that is not always intuitive to untrained audiences

    • Some may always assume that more oversampling is better

    • Let statistics work for you


Low Income, No Insurance HHs: Triplett, Dutwin & Long

  • If possible, try other factors that affect insurance coverage

    • Age

    • Race/Ethnicity


Low Income, No Insurance HHs: Triplett, Dutwin & Long

  • Medical Expenditure Panel Survey (MEPS)

    • U.S., non-institutionalized, < 65 population

    • % of persons lacking health insurance: Jan. – Dec. 2004 by age group


Low Income, No Insurance HHs: Triplett, Dutwin & Long

  • Medical Expenditure Panel Survey (MEPS)

    • U.S., non-institutionalized, < 65 population

    • % of persons lacking health insurance: Jan. – Dec. 2004 by race/ethnicity


Retrospective to Current Year Design: Kearney & Sommers

  • Decisions, Decisions, Decisions

    • How close is good enough?

    • Weighted pros & cons list

    • Administrative barriers


Retrospective to Current Year Design: Kearney & Sommers

  • Good list of pros & cons

  • On balance:

    • Different data users prefer different designs

    • Best design to please the most data users?

    • Best design for accurate estimates?

    • What is most important?

      • What the users want


Retrospective to Current Year Design: Kearney & Sommers

  • How good is the Gold Standard (GS)?

    • “Survey-Year Data”

    • Reason it’s a GS

    • GS may have flaws

    • Sometimes methodology changes correct or cancel biases

    • GS is nice to have, but many surveys don’t have this luxury and still produce excellent estimates


Retrospective to Current Year Design: Kearney & Sommers

  • Well devised study

    • Trials useful to tease out sources of problems

    • Results look promising – a convincing argument to move forward

  • Impact of the “minor” estimates?

    • Found to be different


Retrospective to Current Year Design: Kearney & Sommers

  • Transition to new design – any contingency plans?

    • In case new design doesn’t work well in reality

    • Concurrent samples (old & new methods)

      • Draw 2nd sample (old method) when items become available

    • Estimate bias between methods

    • Not cost effective or efficient



Issues in Health Insurance

  • Three very good studies

  • Methods & findings could be applied to other surveys

  • We should be constantly improving surveys & making them more useful

