APS REVIEW & online NES

Yana Litovsky

Jonathan Francis Carmona

2013 Online NES
  • 31 teams used the online NES.
    • 3 teams used additional regional online NES surveys.
    • 5 teams partially collected data with the online NES.
    • 3 teams requested the online NES, but did not use it.
    • GEM created online surveys in 13 languages.
    • All teams should use the online NES in 2014.
      • We will send samples to all teams who did not use online NES in 2013.
GEM 2013 Participation

70 economies 240,000 individual interviews

2013 Lateness Overview

APS PROPOSAL

APS DATA

2013 APS Proposals
  • National Teams consulted with GEM Data before finalizing their proposal (selection of vendor; partial survey reports)
  • Survey Report personalized for team; includes previous year’s Data Quality Report
  • More teams approved with conditions (send interim data, keep track of quotas, check sampling distribution)
  • Using new team of Data Quality Analysts (InnovAccer)
2013 APS Proposals: issues

1. Late proposal submission

  • And no communication from team

2. Missing information

  • Not all requested methodology data provided
  • Survey Vendor Proposal not submitted
  • Translated APS not submitted

3. Incorrect information

  • Outdated population data
  • Incorrect education categories
  • Proposed survey methods change after proposal approved
2013 APS Proposals: issues

4. Methodology issues

  • Data quality from previous years not addressed
  • Insufficient number of callbacks/contact attempts indicated
  • Biased methodology proposed
    • Overly strict quota-use
    • Over-reliance on fixed line
    • Not including all geographic areas
2013 APS Proposals: approval

2013: 33% (23 out of 70) proposals approved without any revisions

2012: 55% (38 out of 69) proposals approved without any revisions

2011: 73% (40 out of 55) proposals approved without any revisions

- Proposal quality is down only very slightly; the falling approval rate mostly reflects:

  • Data Quality standards increase each year
  • Teams held to higher standards as they have more experience
  • National context changes
  • More thorough investigation of proposals
2013 APS Pilot/Interim Datasets
  • In 2013, 30% (21 out of 70) countries sent pilot or interim data (most were interim datasets)
    • In 2012, 56% (39 out of 69) countries
    • In 2011, 29% (16 out of 55) countries
  • Review of pilot test added to Survey Report. Teams confirmed that mistakes made during pilot testing would be avoided during the rest of APS data collection.
  • 1 team’s pilot data not accepted due to problems
  • 2 teams did not collect required pilot
2013 APS Datasets: issues
  • 54 teams were asked to revise data due to formatting, recording, or data collection issues (53 teams were asked in 2012)
    • Missing data
    • Miscoded values
    • Incorrect education categories in survey report
    • Weights not submitted right away
    • Calculated weights not representative
    • Duplicate ID values
    • Income and education values not defined
    • Income ranges poorly constructed
    • Callback data not collected

IT IS SIMPLE & CRITICAL TO CHECK & FIX THESE ERRORS IN DATA
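Most of the issues listed above are mechanical enough to catch with an automated check before the dataset is submitted. A minimal sketch in Python, using invented rows and illustrative field names (`id`, `age`, `weight`) rather than the actual GEM codebook:

```python
# Hypothetical APS rows: field names and values are illustrative only.
rows = [
    {"id": 1, "age": 34,  "weight": 1.1},
    {"id": 2, "age": 129, "weight": 0.9},   # miscoded age value
    {"id": 2, "age": 45,  "weight": None},  # duplicate ID, missing weight
]

issues = []

# Duplicate ID values
ids = [r["id"] for r in rows]
if len(ids) != len(set(ids)):
    issues.append("duplicate ID values")

# Miscoded values (assumed valid adult age range for the sketch)
if any(not (18 <= r["age"] <= 99) for r in rows):
    issues.append("age outside the 18-99 APS range")

# Missing weights
if any(r["weight"] is None for r in rows):
    issues.append("missing weights")
```

Running checks like these on every interim delivery is exactly the kind of monitoring the slide calls for, and costs far less than resampling after the fact.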

2013 APS Datasets: issues

2013

  • 50 (out of 70) teams had no skip logic errors
  • 25 (out of 70) teams had no APS data format errors

2012

  • 47 (out of 69) teams had no skip logic errors
  • 20 (out of 69) teams had no APS data format errors

2011

  • 36 (out of 55) teams had no skip logic errors
  • 21 (out of 55) teams had no APS data format errors

2010

  • 33 (out of 60) teams had no skip logic errors
  • 20 (out of 60) teams had no APS data format errors
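A skip logic error means a respondent answered a question their earlier answers should have routed them past, or is missing an answer they should have been asked. A minimal sketch of such a check, with a hypothetical screening question and follow-up (not GEM's actual variable names):

```python
def skip_logic_errors(rows):
    """Flag rows that violate one assumed skip rule: the follow-up
    applies only when the screening question was answered 'yes'."""
    errors = []
    for i, r in enumerate(rows):
        if r["owns_business"] == "no" and r["business_age"] is not None:
            errors.append((i, "answered a question that should have been skipped"))
        if r["owns_business"] == "yes" and r["business_age"] is None:
            errors.append((i, "missing answer to a required follow-up"))
    return errors

# Invented rows for illustration
rows = [
    {"owns_business": "yes", "business_age": 3},     # consistent
    {"owns_business": "no",  "business_age": 5},     # skip violated
    {"owns_business": "yes", "business_age": None},  # required answer missing
]
```

A real check would iterate over every skip rule in the questionnaire, but each rule reduces to a pairwise test like the one above.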
2013 APS Samples
  • In 2013, 32 teams were flagged as having an unrepresentative sample
  • In 2012, 29 teams were flagged as having an unrepresentative sample
  • Most required further investigation to determine severity
  • Failure to follow the approved methodology was the most common reason
  • 4 teams were required to collect additional data
  • Representativeness used to be checked later in the cycle, and problems were often not investigated further; it is now checked as soon as APS data are submitted

MANY UNREPRESENTATIVE FINAL SAMPLES COULD HAVE BEEN AVOIDED IF THE DATA HAD BEEN MONITORED

2013 Weights

2013: 67 (out of 70) teams provided weights, and 5 had to be revised

2012: 40 (out of 69) teams provided weights, and 3 had to be revised

2011: 37 (out of 55) teams provided weights, and 3 had to be revised

2010: 39 (out of 60) teams provided weights, and 20 had to be revised

- Teams that did not provide weights were sent calculation formulas by GEM
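The underlying calculation is simple post-stratification: each stratum's weight is its known population share divided by its share of the sample. A sketch with invented strata and numbers (not GEM's actual formulas or census figures):

```python
# Known population shares per stratum (made up for the sketch)
population_share = {"male_18_34": 0.25, "female_18_34": 0.25,
                    "male_35_64": 0.25, "female_35_64": 0.25}

# Achieved sample counts per stratum (also invented)
sample_counts = {"male_18_34": 300, "female_18_34": 200,
                 "male_35_64": 250, "female_35_64": 250}

n = sum(sample_counts.values())

# Weight = population share / sample share; over-represented strata
# get weights below 1, under-represented strata above 1.
weights = {cell: population_share[cell] / (sample_counts[cell] / n)
           for cell in sample_counts}
```

A quick sanity check on submitted weights is that the weighted cell counts reproduce the population shares and sum back to the sample size; weights that fail this are exactly the "calculated weights not representative" issue flagged earlier.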

Popular Team-Added Demographic Questions
  • Language
  • Marital status
  • Ethnicity, Religion
  • Number of children
  • Profession
  • Details about house
  • Ownership (cars, computers, credit cards, etc.)
Continuing improvements in 2014
  • Data Quality
    • Will continue to stress interim dataset submission and monitoring
    • Data Quality controls continue to be very strict
  • Deadlines will be strictly enforced
    • Data Team will not wait to release APS results
    • Teams that submit too late for the Data Team to process and review their results will not be included in the Global Report
  • Emphasis on core APS variables

COMMUNICATION WITH GEM DATA TEAM IS KEY

GEM IS ONLY AS GOOD AS ITS DATA

EACH TEAM’S LATENESS AFFECTS THE ENTIRE GEM CYCLE


Survey Futures

Jeff Seaman

GEM APS Requirements
  • Unbiased, representative sample of the adult population, of sufficient size to provide reliable estimates of the GEM measures
  • GEM does NOT mandate a particular sampling method or data collection technique
  • National teams (and their survey vendors) determine the solution for their unique situation
  • GEM Data Team must review and approve
The Issues
  • It is becoming harder (and more expensive) to conduct GEM APS surveys
    • Lower fixed line telephone coverage
    • “Survey fatigue”
    • More mobile population
  • Increased reliance on face-to-face interviews
  • Increased need for mobile telephone sampling
Background
  • 2007: What is the quality of GEM data?
    • No quality measures
  • Monitor and ensure quality:
    • Structured RFP process
    • Quality test of submitted APS Data
    • Feedback to teams
  • Scale the processes
    • Grow from 42 teams to 70+ (same staffing level)
The Evolution
  • Then: Improve the quality of GEM APS data without large increases to the cost.
  • Now: Reduce the cost of collecting GEM APS data while maintaining our commitment to quality.
How do we reduce costs?
  • Better tools (tablets and smartphones)
  • Use online
  • Make questionnaire shorter
  • Change initial attempts and callbacks
  • Alternative callback data collection
  • Sampling changes
Face-to-face surveys
  • Increasing proportion of APS data collection
  • Highest error rates of all survey types:
    • Skip logic errors
    • Respondent selection errors
    • Incorrectly coded variables
Tablet Requirements
  • Supports APS questionnaire (question types, skip logic, data validation, etc.)
  • Easy to add or modify questions.
  • Support for multiple languages
  • Free or low cost
  • Runs on readily-available devices (tablets/smartphones)
Conduct interview
  • GEM testing of Open Data Kit application in Malawi, Chile and South Africa.
  • Shows great promise for reducing entry errors.
  • Main cost saving is in reduced need to fix errors or resample.
  • Many other options exist
    • World Bank investigation
    • GEM national teams
Select the Respondent
  • Greatest cause of error; several teams were required to resample
Packaged Tablet Solution?
  • Majority of new teams entering GEM:
    • Will require face-to-face sampling
    • Have limited budgets
    • Have little experience conducting scientific sampling
    • Limited survey vendor alternatives
  • Provide a packaged solution
    • Free or low cost software, programmed for basic GEM APS
    • Run on low cost hardware
    • Documentation and training (video)
Online Data Collection
  • Far less costly
  • Approved for two teams in 2013
  • Quality of the email list is critical
  • Sample list requires testing and approval
  • Not an option for most teams – no suitable sample
  • GEM will do cost-sharing with teams wishing to experiment
Make questionnaire shorter
  • Most important for telephone surveys
  • Core APS has 48 questions, but only 12 are asked of everyone
  • Virtually all teams are adding additional modules
Shorter survey
  • Review additional modules for length
  • New optional question sets
    • Across all modules
    • Most important questions only
    • Better ability for global special topic reports
    • Overall shorter questionnaire
Attempts and callbacks
  • GEM has used the 3/5 rule for years
  • Move from the “one size fits all” model
  • Requires that vendor record the number of attempts and the number of callbacks for every respondent
  • GEM can then test sensitivity of results to number of attempts and number of callbacks
  • Two teams approved for reduced callbacks in 2013
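Once the number of attempts is recorded per respondent, the sensitivity test is straightforward: recompute the key rates as if data collection had stopped after fewer attempts, and compare. A sketch with invented data (the field names and the entrepreneurship flag are illustrative, not GEM's actual variables):

```python
# Invented respondent records: number of contact attempts needed,
# and whether the respondent qualified as an entrepreneur.
respondents = [
    {"attempts": 1, "entrepreneur": True},
    {"attempts": 1, "entrepreneur": False},
    {"attempts": 2, "entrepreneur": False},
    {"attempts": 3, "entrepreneur": True},
    {"attempts": 5, "entrepreneur": False},
]

def rate_within(max_attempts):
    """Entrepreneurship rate among respondents reached
    within max_attempts contact attempts."""
    kept = [r for r in respondents if r["attempts"] <= max_attempts]
    return sum(r["entrepreneur"] for r in kept) / len(kept)

# Compare the rate under the full attempt rule with a reduced rule.
full, reduced = rate_within(5), rate_within(2)
```

If `full` and `reduced` are close across teams, a cheaper callback rule can be approved without biasing the estimates; if they diverge, hard-to-reach respondents differ systematically and the stricter rule stands.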
Alternative Callbacks
  • Face-to-face leave a mail back questionnaire
  • Possible use of online version
Sampling changes
  • GEM APS sample requirements are designed to provide an estimate of critical rates (e.g., TEA).
  • NOT designed for detailed examination of characteristics of entrepreneurs
  • Most respondents only used for the denominator.
  • National sample:
    • Base sample of 2000
    • Oversample of entrepreneurs only: use the block 1 and 2 screening questions, and interview only those who qualify
    • Works only for national samples – not regional
    • Works only for oversamples
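The screen-then-interview idea above reduces to a simple filter: administer only the screening questions, and continue the full interview just for respondents who qualify. A sketch with an assumed qualifying rule and illustrative field names (not the actual block 1 and 2 items):

```python
def qualifies(answers):
    """Assumed screening rule for the sketch: any 'yes'
    on the screener items qualifies the respondent."""
    return any(answers.values())

# Invented screener responses
screened = [
    {"starting_business": False, "owns_business": False},
    {"starting_business": True,  "owns_business": False},
    {"starting_business": False, "owns_business": True},
]

# Only qualifying respondents receive the full interview.
to_interview = [s for s in screened if qualifies(s)]
```

The non-qualifying majority still contribute to the denominator of the headline rates, which is why this works for a national oversample but cannot replace the base sample itself.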
Questions - Comments
  • data@gemconsortium.org