
The Planck Early Release Compact Source Catalog - Status



  1. The Planck Early Release Compact Source Catalog - Status
  Bill Reach, Gene Kopan, and Tim Pearson

  2. Early Release Compact Source Catalog (ERCSC)
  What it is:
  • A list of compact bright sources: galaxies, quasars, AGN; SZ clusters; compact interstellar clouds; HII regions and molecular clouds; others
  • A catalog using both HFI and LFI data
  • Returned to DPCs within 6 months of the first “full” sky coverage
  • Released to the public 3 months after being provided to the DPCs
  • Intended for rapid follow-up of “interesting” sources while Herschel (and other systems) are still available
  What it is not:
  • It is not a real-time detection system, not even near-real-time: not appropriate for follow-up of flaring sources or of new solar system objects
  • It is not the final Planck catalog: calibration will not be finalized, completeness levels will not be guaranteed, and there may be false detections
  • It is not a polarization catalog: polarization-sensitive bolometers will be summed

  3. ERCSC Specifications • Requirements • Source lists, one for each frequency (9 altogether) • No Requirements on band-merging, uncertainties, or in confused areas (e.g. galactic plane), best reasonable efforts will be made • No requirements for polarization, but ERCSC can be run on Stokes Q,U maps on request from DPCs • The DPCs will release the catalogs 3 months after receipt • Goals • 90% reliability of compact source identifications (over cutoff) • Expected Performance • Flux density cutoff: SNR 10 or better • Flux density accuracy: better than 30% • Positional accuracy: better than FWHM/5 (1 sigma radial)
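To make the positional-accuracy goal (better than FWHM/5, 1-sigma radial) concrete, a quick calculation for a few beam widths; the FWHM values below are illustrative placeholders, not official instrument numbers:

```python
# Illustrative check of the ERCSC positional-accuracy goal (FWHM/5, 1-sigma radial).
# The beam FWHM values below are assumptions for illustration only.
beam_fwhm_arcmin = {30: 33.0, 100: 10.0, 857: 5.0}  # GHz -> FWHM in arcmin (assumed)

def positional_accuracy_goal(fwhm_arcmin):
    """1-sigma radial positional accuracy goal: better than FWHM/5."""
    return fwhm_arcmin / 5.0

for freq, fwhm in sorted(beam_fwhm_arcmin.items()):
    print(f"{freq} GHz: positional accuracy goal < {positional_accuracy_goal(fwhm):.1f} arcmin")
```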

  4. ERCSC Dependencies • Required Inputs from HFI and LFI DPCs • calibrated, cleaned data with pointing reconstruction for LFI and HFI within 4 weeks of receipt on ground (e.g. weekly deliveries) • Detector calibration (instrument model) for both LFI and HFI • ERCSC to be produced from calibrated data available two months before catalog due date (i.e. calibration available at S1+4 months) • HFI/LFI L2 pipeline codes (and installation help) for preprocessing (NERSC/IPAC) • Realistic simulated data for development and testing – a substantial fraction of a full-sky survey is required, at all frequencies and including as much realism as possible, before launch for integration testing • USDPC provides to HFI and LFI DPCs • source lists within 6 months of completion of first full sky survey coverage • code that produced source list, within six months of delivery of source list • USPDC will participate in DPC and Planck end-to-end test schedules until launch. Intermediate code deliveries will be made then. There is no requirement that the ERCSC code will run outside of the USPDC environment. • The USPDC will make no formal code deliveries between launch and delivery of the catalog

  5. ERCSC Processing (flow diagram)
  • LFI DPC / HFI DPC → FTP → IPAC Planck Data Archive
  • LFI / HFI L2 preprocessing (L2 products): high-pass filter (optional), assign pointing to samples, destripe (Springtide), assign samples to pixels, coadd into maps
  • USDPC destriping / mapmaking → ERCSC source detection / extraction → ERCSC QA / catalog production → ERCSC

  6. ERCSC Operations • ERCSC generated from 1st sky coverage • Receive data…weekly major updates • Make new maps…monthly map updates • Must be able to make map from 7 months of data in << 1 month • Make new source list…monthly update • Software upgrades • Allow for at least 2 complete reprocessings (software versions) before delivery • Versions: P1 = prelaunch pipeline • P2 = first post-launch update, 1-3 months after getting 1st survey data • P3 = final update, operational version to generate product delivered back to Planck project

  7. ERCSC Processing steps • Input calibrated, position-tagged TOD • will use PIOremote and L2 pipeline for HFI • Will use calibrated timeline in exchange format for LFI (under negotiation) • Clean glitches & instrument signatures • Destripe to match scans (Springtide) • Generate maps per detector • signal, sample variance, counts • Unified Mapmaking Code same for both HFI and LFI • Filter maps with symmetric ‘matched’ compact source filters • Detect and extract sources from per band maps • Band merge (best efforts) • Quality Analysis • Output catalog
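The per-detector map products listed above (signal, sample variance, counts) amount to binning calibrated samples into pixels. A minimal numpy sketch of that step, assuming samples are already calibrated and assigned pixel indices; `bin_samples_to_map` is a hypothetical helper, not the unified mapmaking code:

```python
import numpy as np

def bin_samples_to_map(pix, signal, npix):
    """Coadd calibrated samples into a per-detector map: per-pixel mean
    signal, sample variance, and hit counts.  A simplified sketch of the
    'assign samples to pixels / coadd into maps' steps."""
    counts = np.bincount(pix, minlength=npix)                 # hits per pixel
    sums = np.bincount(pix, weights=signal, minlength=npix)   # sum of signal
    sq = np.bincount(pix, weights=signal ** 2, minlength=npix)
    with np.errstate(invalid="ignore", divide="ignore"):
        mean = np.where(counts > 0, sums / np.maximum(counts, 1), np.nan)
        # Unbiased sample variance; undefined (NaN) for pixels with < 2 hits.
        var = np.where(counts > 1,
                       (sq - counts * mean ** 2) / np.maximum(counts - 1, 1),
                       np.nan)
    return mean, var, counts
```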

  8. ERCSC Development Tasks • HFI L2 Preprocessing code • Port to local machines • Evaluate/ augment • USDPC Mapmaking/ GCP/ M3 code • Port to local machines • Evaluate/ augment • Point source detection/ extraction algorithms • Evaluate/ trade off • Implement • Bandmerging code • Quality assessment tools • Pipeline & software integration • End-to-end testing • In conjunction with major Planck tests, ingest data at IPAC and attempt to run those parts of ERCSC software that exist

  9. Near-Term Plans FY 06 • Obtain all-frequency simulated images with point sources • Simulations underway by three groups • JPL/ADG (soon) all frequency, single “super” bolometer, 1 sky coverage TOD • 30 GHz done, 217 GHz part done • HFI/IAP 40-day TOD simulations for ERCSC and data transfer test • In conjunction with DM test • LFI TBD for data transfer test • Evaluate point source detection/ extraction algorithms (Pearson) • Get HFI/LFI L2 (DM) Preprocessing codes running locally • Test data ingestion using data from HFI/LFI transfer tests • Get USPDC Unified Mapping/ GCP/ M3 code running locally • Generate maps for detection/ extraction testing

  10. ERCSC Milestones
  • Inter-related milestones related to data transfer and handling are in a separate schedule.
  5/06 Obtain Level S simulations, all frequencies
  6/06 Install PISTOU, kst
  5/06 Obtain predicted source catalog for all frequencies
  2/07 Predict ERCSC contents (number, type of sources; simulated source lists)
  7/07 Run Level S data through L2 testbed
  12/06 Select detect/extract option
  4/07 Build detect/extract modules
  7/07 Integrate detect/extract with IPAC implementations of L2 pipelines
  12/07 Run detect/extract on simulated data
  3/08 Design/develop Catalog generator
  5/08 Integrate Catalog generator with US DPC pipeline
  7/08 Run Level S data through Catalog generator
  7/07 Design/develop QA tool
  4/08 Design/modify/develop Bandmerger for QA
  5/08 Bandmerge on simulated source lists
  5/08 Develop ability to associate extractions with prior catalogs
  9/08 Test ability to associate extracted sources with prior sources (C&R)

  11. Source Extraction
  Tim Pearson

  12. ERCSC Source Extraction (1/3) • Locate sources in single-band maps (detection) • Filter maps and search for peaks • Matched filters (with assumptions about noise statistics) • “Robust” filters (e.g., Mexican hat) • Bayesian methods • Estimate source parameters (extraction) • Position, flux density • Polarization (Q, U) for bright sources • Angular size (resolved sources): best effort • Band merge (best efforts) • Cross-correlation of single-band catalogs • Parameter estimation using simultaneous fits to several bands • Spectral index • SZ cluster detection (?)
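The filter-and-peak-search step above can be sketched in a few lines: convolve a single-band map with a Gaussian "matched" filter under a white-noise assumption, then keep pixels above an SNR cut. This is a simplified illustration with hypothetical function names, not the ERCSC production algorithm:

```python
import numpy as np

def gaussian_kernel(shape, fwhm_pix):
    """Map-sized 2-D Gaussian beam model, peak at (shape//2, shape//2)."""
    sigma = fwhm_pix / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    y, x = np.indices(shape)
    cy, cx = shape[0] // 2, shape[1] // 2
    return np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2.0 * sigma ** 2))

def matched_filter_detect(mapdata, fwhm_pix=3.0, snr_cut=10.0):
    """Filter a map with a Gaussian matched filter (white-noise assumption)
    and return (row, col) pixels exceeding snr_cut.
    A simplified sketch, not the ERCSC production algorithm."""
    kern = gaussian_kernel(mapdata.shape, fwhm_pix)
    kern /= np.sqrt((kern ** 2).sum())   # unit L2 norm: white noise stays unit variance
    # Circular convolution via FFT; ifftshift moves the kernel peak to (0, 0).
    filt = np.real(np.fft.ifft2(np.fft.fft2(mapdata) *
                                np.fft.fft2(np.fft.ifftshift(kern))))
    snr = filt / np.std(filt)            # crude global SNR estimate
    return list(zip(*np.nonzero(snr > snr_cut)))
```

A production version would estimate the noise locally (or from the map power spectrum) rather than from a single global standard deviation.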

  13. ERCSC Source Extraction (2/3) • Make use of published algorithms • Development of new algorithms may occur but is not necessary to meet goals • Use code developed elsewhere when possible • Write new code as necessary • Compare algorithms using simulated sky maps and WMAP data • Compare extracted source parameters with input • As a function of source flux density • In regions with different foreground contamination • Estimate reliability and completeness • Reliability: fraction of sources found that are real • Completeness: fraction of real sources that are found • Estimate parameter accuracy • Positional accuracy • Photometric accuracy • Goal: choose one algorithm that meets requirements by mid 2007 • Balance criteria of reliability, completeness, accuracy, run time • Implement full pipeline and test with simulated data • Further development of algorithms prior to launch
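The reliability and completeness definitions above can be sketched as a positional cross-match between an extracted catalog and the simulation's input ("truth") catalog. The helper names are hypothetical, the match is planar and greedy, and a production comparison would use proper spherical geometry and confusion handling:

```python
def match_catalogs(extracted, truth, radius):
    """Greedy positional cross-match: each extracted (x, y) source matches
    at most one truth source within `radius`.  Returns the match count."""
    unmatched = list(truth)
    n_matched = 0
    for (xe, ye) in extracted:
        for i, (xt, yt) in enumerate(unmatched):
            if (xe - xt) ** 2 + (ye - yt) ** 2 <= radius ** 2:
                unmatched.pop(i)     # each truth source is used once
                n_matched += 1
                break
    return n_matched

def reliability_completeness(extracted, truth, radius=1.0):
    """Reliability = matched / extracted; completeness = matched / truth."""
    n = match_catalogs(extracted, truth, radius)
    reliability = n / len(extracted) if extracted else 1.0
    completeness = n / len(truth) if truth else 1.0
    return reliability, completeness
```

In practice these would be tabulated as a function of flux density and of foreground region, as the slide describes.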

  14. ERCSC Source Extraction (3/3) Proposed Algorithms • Matched filter • Tegmark & de Oliveira-Costa 1998; Vio et al. • Mexican hat wavelet • Cayón et al. 2000; Vielva et al. 2001, 2003 • Neyman-Pearson detector and biparametric scale adaptive filter • López-Caniego et al. 2005 • Adaptive top-hat filter • Chiang et al. 2002 • Bayesian approach • Hobson & McLachlan 2003 • Savage & Oliver 2005 • MOPEX • Makovoz & Marleau 2005 • SExtractor • Bertin & Arnouts 1996 (blended sources) • etc.

  15. US Planck Data Center Archive at IPAC
  Bill Reach

  16. IPAC Planck Team • Bill Reach • IPAC team lead, Data center design • Gene Kopan • ERCSC system architect • Tim Pearson • Source detection and extraction algorithms • Brendan Crill • HFI instrument, lab testing, telemetry • Ben Rusholme (June 2006) • Preprocessing pipelines, QA • Graca Rocha (TBD) • Algorithms, maps, simulations • 1 more scientist being recruited • Code developer being recruited

  17. IPAC Planck Data Archive • Service for the US portion of the Planck team • Retrieve data from DPCs • IPAC is single point of contact for mission data • Start with simulations and laboratory test data before launch • Start with PV phase for flight data • Continue through end of mission • Maintain data securely • Support ERCSC development and operations • Distribute to US team members in accord with Planck policies • Generate calibrated data from raw, upon instrument model update • Service for the US community • Prepare a mission archive capable of supporting NASA researchers into the future • Same products and documentation as available from ESA, reorganized for our servers but not rewritten • Limited US community support (public webpage, email helpdesk) • Deliver entire archive (“in a box”) to Infrared Science Archive (IRSA) for “eternity”

  18. IPAC Planck Archive: Operations • Import data from HFI and LFI DPCs in their native format • Includes detector data, pointing data, and instrument models • Start during PV phase • At least weekly transfers • Import data products from HFI and LFI DPCs • Calibrated time-ordered data • skymaps • Preprocess data using L2 (modified) pipelines at IPAC • Limited processing capability at IPAC, at level 2 (L2) • No processing for US team after 2011 • Algorithm development support for US team ideas that need to feed back to DPCs • Export preprocessed data to NERSC for mapmaking • Support secure transfer to NERSC • Consultation support for installing DPC software at NERSC

  19. IPAC Planck Processing (flow diagram)
  • Secure storage; access to US co-Is as per DPC policies
  • LFI DPC / HFI DPC → FTP (LFIraw and L2; PIOremote and L2) → IPAC Planck Data Archive
  • LFI / HFI L2 preprocessing: calibrate (apply instrument model), deglitch (flagging), high-pass filter (optional), assign pointing to samples, destripe (Springtide), assign samples to pixels, coadd into maps
  • USDPC destriping / mapmaking → ERCSC source detection / extraction → ERCSC QA / catalog production → ERCSC

  20. IPAC/Planck Computing

  21. IPAC/Planck Computing: Phase 1 (FY06), Phase 2 (FY07)

  22. Data Archive: Milestones
  6/06 Install PIOlib
  8/06 Compile individual L2 modules
  7/07 Integrate L2 modules into testbed
  8/06 Negotiate data transfer format with HFI DPC
  8/06 Negotiate data transfer format with LFI DPC
  6/06 Obtain simulated data from HFI DPC
  6/07 Obtain flightlike data from HFI DPC
  9/06 Obtain sample data from LFI DPC
  9/07 Obtain flightlike data from LFI DPC
  9/06 Develop data security plan
  12/07 Design prototype archive interface for US co-Is
  4/07 Integrate L2 modules into US PDC pipeline
  9/07 Run L2 pipeline on testbed with HFI data
  9/07 Run LFI-L2 pipeline on testbed with LFI data
  10/07 Run HFI and LFI data through unified mapmaking code
  6/06 Install 20 TB disk, Linux servers (8 nodes)
  6/07 Install 40 TB disk, Linux servers (24 nodes)

  23. Planck/IPAC current Schedule
