
Improving the Design of UK Business Surveys




  1. Improving the Design of UK Business Surveys Gareth James Methodology Directorate UK Office for National Statistics

  2. Overview of presentation • Background to recent redesigns at ONS • Outline of previous and new designs • Drivers for project • Approach taken and work done • Implementation and results in 2010

  3. Short-term surveys • A set of surveys covering different economic sectors: • services (MIDSS – Monthly Inquiry into the Distribution and Services Sector) • production (MPI – Monthly Production Inquiry) • retail (RSI – Retail Sales Inquiry) • Similar aims and systems: • collect similar variables (turnover & employment) • but the surveys were developed independently • and differ in detail

  4. Short-term surveys: before • MIDSS & Gaps: • separate survey for employment-only • quarterly (Q) employment question to a sub-sample • MPI: • no separate survey for employment-only • monthly (M) employment question to the whole sample • RSI: • quarterly employment question to the whole sample

  5. Short-term surveys: after • QBS (Quarterly Business Survey): • separate survey for employment-only • MBS (Monthly Business Survey): • quarterly employment question to a sub-sample • RSI: • quarterly employment question to a sub-sample • design now looks like the MBS

  6. Drivers • Change in NACE (the European industrial classification) implies a change in design • Opportunity taken to review & redesign the surveys: all parts of the Statistical Value Chain • Aim to combine surveys, improve design, standardise and streamline processes • Other projects in the office: • workforce jobs review • approach to editing • data collection methods

  7. Redesign principles • Remove unnecessary differences; standardise where possible • One new name adopted for all surveys • reduces potential confusion among respondents • Frequencies standardised • monthly for turnover; quarterly for employment • Sub-sampling for employment • reduces burden • Continue with the same processing systems • No change to total sample size

  8. Redesign: sample design & estimation • Scope of survey • assess user needs • in-scope / out-of-scope? • industry groups for sampling / publication? • Number of industry strata reduced: • turnover & employment: 330 → 180 • employment-only: 40 → 30 • Employment size bands, and estimator type: • best choice made for each industry (see the estimator sketch below)
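As a hedged illustration of the "estimator type" choice, the sketch below compares the two estimators most often weighed against each other in business surveys: simple expansion versus a ratio estimator that uses a register auxiliary such as employment. The function names and every figure are hypothetical, not ONS values.

```python
# Sketch: expansion vs. ratio estimation for one stratum.
# All names and figures below are hypothetical illustrations.

def expansion_estimate(y_sample, N):
    """Expansion (number-raised) estimate: N/n times the sample total."""
    return N / len(y_sample) * sum(y_sample)

def ratio_estimate(y_sample, x_sample, X_total):
    """Ratio estimate: known register total X_total scaled by the y/x ratio."""
    return X_total * sum(y_sample) / sum(x_sample)

y = [120.0, 300.0, 45.0]   # sampled turnover (illustrative units)
x = [10.0, 25.0, 4.0]      # sampled register employment
print(expansion_estimate(y, N=200))          # 31000.0
print(ratio_estimate(y, x, X_total=2400.0))  # ~28615.4
```

The "best choice" per industry would then rest on which estimator gives the lower variance given how strongly turnover correlates with the register auxiliary in that industry.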

  9. Redesign: sample allocation • Overall sample size constraint • Neyman allocation principle (see the sketch below) • CV (coefficient of variation) targets for publication groups • Estimation of the stratum variances Sh² • weighted • robust • modelled?
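A minimal sketch of the Neyman principle under a fixed overall sample size: sample is allocated across strata in proportion to Nh × Sh. The stratum counts, standard deviations, and function name are invented for illustration; in practice the CV targets and the weighted, robust, or modelled variance estimates mentioned above would feed into these inputs.

```python
# Sketch of Neyman allocation: allocate a fixed total sample n_total
# across strata in proportion to N_h * S_h.
# All figures below are hypothetical, not ONS values.

def neyman_allocation(N, S, n_total):
    """Return per-stratum sample sizes proportional to N_h * S_h."""
    weights = [N_h * S_h for N_h, S_h in zip(N, S)]
    total = sum(weights)
    return [round(n_total * w / total) for w in weights]

N = [5000, 2000, 800]     # stratum population counts (illustrative)
S = [12.0, 40.0, 150.0]   # estimated stratum standard deviations of turnover
print(neyman_allocation(N, S, n_total=600))   # [138, 185, 277]
```

Note how the smallest stratum receives the largest allocation: its units are far more variable, which is exactly the effect Neyman allocation is designed to capture.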

  10. Redesign: data collection • Review of questions and questionnaires: • ensure appropriate questions are asked • reduce the number of questionnaire types • fits with the Telephone Data Entry project • Cognitive testing for ‘new’ industries

  11. Redesign: editing and imputation rules • Consistent rules introduced across all industries • same approach to non-response • same method for imputation (see the sketch below) • Testing undertaken to determine the optimum methods and thresholds
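The slides do not name the method chosen, so the sketch below shows ratio imputation, one approach commonly used in business surveys: a non-responder's value is imputed as its previous return scaled by the aggregate growth of responders in the same stratum. All names and figures are invented.

```python
# Sketch of ratio imputation for survey non-response (a common method;
# the slides do not confirm it is the exact ONS choice).
# A missing current value is the unit's previous value scaled by the
# responders' aggregate growth ratio. All figures are invented.

def ratio_impute(previous, current):
    """Fill None entries in `current` using the responders' growth ratio."""
    pairs = [(p, c) for p, c in zip(previous, current) if c is not None]
    ratio = sum(c for _, c in pairs) / sum(p for p, _ in pairs)
    return [c if c is not None else round(ratio * p, 1)
            for p, c in zip(previous, current)]

prev = [100.0, 250.0, 80.0, 40.0]
curr = [110.0, 270.0, None, None]   # two non-responders this period
print(ratio_impute(prev, curr))     # [110.0, 270.0, 86.9, 43.4]
```

The "thresholds" tested would govern matters such as when an imputed or reported value is flagged for editing rather than accepted automatically.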

  12. Backcasting • Historical estimates required on the new NACE basis • Mix of: • domain estimation, calibrated to the new NACE groups, for recent/current periods • conversion matrices for earlier periods (see the sketch below) • Linking applied to join the sections • New seasonal adjustment models
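To make the conversion-matrix step concrete, here is a hedged sketch in which estimates on the old classification are spread across new NACE groups using a matrix of assumed proportions. The matrix and totals are invented; in practice such proportions might be estimated from a period in which units are coded to both classifications.

```python
# Illustrative conversion matrix for backcasting: old-classification
# estimates are mapped onto new NACE groups using assumed proportions
# (rows: old industries; columns: new groups; each row sums to 1).
# All entries and totals are invented for illustration.

old_estimates = [500.0, 300.0]   # totals on the old classification

conversion = [
    [0.7, 0.3],   # 70% of old industry 1 maps to new group 1, 30% to group 2
    [0.1, 0.9],   # 10% of old industry 2 maps to new group 1, 90% to group 2
]

new_estimates = [
    sum(old_estimates[i] * conversion[i][j] for i in range(len(old_estimates)))
    for j in range(len(conversion[0]))
]
print(new_estimates)   # [380.0, 420.0]
```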

  13.–15. Backcasting - example (three chart slides; the example figures are not reproduced in this transcript)

  16. Implementation in practice (1) • Long project: central project management • working groups • early consultation with stakeholders • resources and constraints • New systems required: • training and support • change in working practices

  17. Implementation in practice (2) • Checking quality of results • aggregates first • investigate anomalies: • correct population definition? • all inputs present? • change in assigned classifications? • Unforeseen issues • data collection • changed classifications

  18. Implementation in practice (3) • Customer support • communication • changes to series / website • Pragmatic approach taken • Aim to review in a year’s time

  19. Conclusions • Took the opportunity of the change in NACE to improve the survey designs • Redesign work ran in tandem with other projects: telephone data entry, editing review, workforce jobs • Compromises required: not everything can be achieved • Judgement required: there is not always one obviously ‘right’ solution • Many successes; survey quality maintained or improved
