PLATO Data Center: Purpose and Structure
Laurent Gizon (PDPM)
Hamed Moradi (PDC Project Office)
The PDC is in charge of the validation, calibration, and analysis of the PLATO observations. It delivers the final PLATO science Data Products.
SGS Structure
Mission Operations Center (MOC, flight-critical)
Science Operations Center (SOC, mission-critical)
PLATO Data Center (PDC, science-critical)
Science Preparatory Activities (scientific specification of software)
PLATO Ground Segment
Operations Ground Segment
Science Ground Segment
Mission Operations Centre
Science Operations Centre
PLATO Data Center
PLATO Science Preparation Management
Ground Station Network
Telemetry baseline: 109 Gb/day uncompressed (8.7 Mb/s compressed, during 3.5 hr each day)
Level 0: Depacketized light curves, centroid curves, and selected imagettes (~1600), for each telescope (32+2)
Level 1: Analysis of imagettes to validate and optimize performance of on-board treatment. Implementation of on-ground instrumental corrections, such as CCD corrections and jitter corrections. Then computation of average light curves and centroid curves for each star (science-ready).
Level 2: PLATO science Data Products (next table). Final DP is a list of confirmed planetary systems, fully characterized by the transit curves, the stellar seismic parameters, and the follow-up observations. High scientific added value.
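As a quick back-of-the-envelope consistency check of the telemetry baseline quoted above (a sketch using only the numbers stated on the slides, not an official link budget):

```python
# Check that the daily downlink window can carry the stated data volume.
# The 8.7 Mb/s rate and 3.5 h/day contact time are the values from the slides.
RATE_MBPS = 8.7   # compressed downlink rate, megabits per second
WINDOW_H = 3.5    # daily ground-station contact, hours

downlink_gb_per_day = RATE_MBPS * 1e6 * WINDOW_H * 3600 / 1e9  # gigabits/day
print(f"{downlink_gb_per_day:.0f} Gb/day")  # ~110 Gb/day, consistent with 109 Gb/day
```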
Essential information for the success of the mission: input catalog, follow-up observations, etc.
Support for on-board processing, on-ground calibration, and scientific data analysis
Stellar properties: effective temperature, absolute luminosity, radius [Gaia], chemical abundances, v sin i, activity, properties specific to multiple stars.
Follow-up observations to confirm planets (at several wavelengths when possible)
Other relevant complementary observations: high-resolution spectra, astrometry, imaging, spectro-polarimetry, etc.
The ancillary data are in support of the processing activities and are accessed by the PDC via the main database.
Ancillary observations
PDC Architecture
main database & data access
L. Gizon, MPS
R. Burston, MPS
I. Pardowitz, MPS
N. Walton, IoA
SOC includes a processing center for the validation and calibration of the data
ESA overall coordination (oversight) of science data releases, data access
PDC designs and implements software to be run at the SOC
R. Samadi, LESIA
T. Appourchaux, IAS
PDC WBS
1 Central Database
5 Data Processing Centers
WP32 Data Processing Algorithms (Talk by Samadi)
WP35 Ancillary Data Management (Talk by Deleuil)
WP36 Exoplanet Analysis System (Talk by Walton)
WP37 Stellar Analysis System (Talk by Appourchaux)
System architecture, archives, data base, system management
Data flow design and management, export system, network
Simulation of data stream
Write and implement core-processing software that will run at the SOC
Requires a good understanding of system interfaces with SOC and operational procedures
For phase A, study jitter correction to prove feasibility
Regular meetings with ESA to specify interfaces PDC-SOC
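To illustrate why the phase-A jitter feasibility study is tractable, here is one standard decorrelation approach, offered purely as an illustration and not as the PLATO baseline (the actual correction uses the PSF model, stellar catalog, and distortion matrix): fit and remove the part of the light curve that correlates linearly with the measured centroid offsets.

```python
import numpy as np

# Illustrative jitter decorrelation (NOT the PLATO flight algorithm):
# remove the best-fit linear dependence of the flux on centroid offsets (dx, dy).
def decorrelate_jitter(flux, dx, dy):
    """Return flux with the linear (dx, dy) dependence removed, mean preserved."""
    A = np.column_stack([np.ones_like(dx), dx, dy])    # design matrix
    coeffs, *_ = np.linalg.lstsq(A, flux, rcond=None)  # least-squares fit
    return flux - A @ coeffs + np.mean(flux)

# Synthetic example: a flat light curve contaminated by pointing jitter.
rng = np.random.default_rng(0)
dx, dy = rng.normal(0, 0.1, 1000), rng.normal(0, 0.1, 1000)
flux = 1.0 + 0.5 * dx - 0.3 * dy + rng.normal(0, 0.01, 1000)
clean = decorrelate_jitter(flux, dx, dy)
print(np.std(flux), np.std(clean))  # scatter drops once the jitter terms are removed
```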
End Feb 2011 5th PDC Meeting in K-Lindau. Identify final problems. Invite PSPM Leaders.
WP Leaders deliver reports to LG by March 2011
PDC document delivered to PCL in May 2011
June 2011: Decision on PLATO selection
End June 2011, Phase B1 meeting.
December 2011: End phase B1
November 2018: Launch of PLATO
3+2+1 years in space
Several releases of DPs during and after space mission
PDC must remain operational up to ~3 yrs after the end of the space mission in order to confirm last planets.
Validate onboard software:
Check onboard processing using ground copy of onboard software and the imagettes of ~1600 stars
Validate distortion matrix model, 2D sky background model, PSF model fits
Validate computation of masks and windows
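The validation idea above can be sketched as follows; the photometry routine here is a hypothetical stand-in for the ground copy of the onboard algorithm, and the mask handling is illustrative only:

```python
import numpy as np

# Sketch: re-run a ground copy of the onboard photometry on downlinked
# imagettes and flag cadences where it disagrees with the onboard light curve.
def aperture_photometry(imagette, mask):
    """Sum the pixels selected by the onboard-style binary mask."""
    return float(np.sum(imagette * mask))

def validate_onboard(imagettes, masks, onboard_fluxes, rtol=1e-3):
    """Return a boolean array flagging cadences where ground and onboard differ."""
    ground = np.array([aperture_photometry(im, m) for im, m in zip(imagettes, masks)])
    return ~np.isclose(ground, onboard_fluxes, rtol=rtol)

rng = np.random.default_rng(1)
imagettes = rng.poisson(100, size=(5, 6, 6)).astype(float)
masks = np.ones((5, 6, 6))
onboard = np.array([im.sum() for im in imagettes])
onboard[2] *= 1.01                                  # inject a 1% discrepancy
flags = validate_onboard(imagettes, masks, onboard)
print(flags)                                        # only cadence 2 is flagged
```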
Validate onboard setup:
Fine-tune the onboard software algorithms, for example choosing the number of parameters needed to describe the PSF, especially during configuration mode.
Monitor health of each telescope and assess quality of the data
Correction for jitter. Performed independently for each telescope; requires PSF knowledge, stellar catalog, and distortion matrix.
Integration time correction, sampling time correction
Statistical analysis over the 40 telescopes to identify cosmic ray hits, hot pixels, and possibly deficient telescopes
Average light curves and centroid curves over all telescopes (weighted average).
Compute error based on scatter
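A minimal sketch of the combination steps above (outlier rejection across telescopes, weighted averaging, error from scatter), assuming a simple MAD-based clip and inverse-variance weights; this is illustrative, not the flight pipeline:

```python
import numpy as np

# Combine per-telescope light curves: reject outliers (cosmic rays, hot pixels)
# with a robust MAD clip at each cadence, then inverse-variance weighted mean,
# with the error estimated from the surviving scatter.
def combine_telescopes(fluxes, variances, nsigma=5.0):
    """fluxes, variances: arrays of shape (n_telescopes, n_cadences)."""
    med = np.median(fluxes, axis=0)
    mad = 1.4826 * np.median(np.abs(fluxes - med), axis=0)   # robust sigma
    good = np.abs(fluxes - med) < nsigma * np.maximum(mad, 1e-12)
    w = np.where(good, 1.0 / variances, 0.0)                 # inverse-variance weights
    mean = np.sum(w * fluxes, axis=0) / np.sum(w, axis=0)
    n = good.sum(axis=0)
    err = np.sqrt(np.sum(w * (fluxes - mean) ** 2, axis=0)
                  / np.sum(w, axis=0) / np.maximum(n - 1, 1))
    return mean, err

rng = np.random.default_rng(2)
f = 1.0 + rng.normal(0, 0.01, size=(40, 100))   # 40 telescopes, 100 cadences
f[7, 50] += 1.0                                  # simulated cosmic-ray hit
mean, err = combine_telescopes(f, np.full_like(f, 1e-4))
```

The hit at cadence 50 lies far outside the 5-sigma MAD band, so it is excluded from the weighted mean rather than biasing it.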
The ~1600 stars for which imagettes are available receive a more sophisticated treatment: PSF fits to improve the photometry (contamination from neighboring sources taken into account). Imagettes are downloaded for all stars for which a serious planetary candidate has been identified.
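The PSF-fit photometry with neighbor contamination can be sketched as a linear least-squares problem; the Gaussian PSF used here is a placeholder (the real pipeline uses the calibrated PSF model):

```python
import numpy as np

# Sketch: fit the fluxes of a target and a contaminating neighbor on an
# imagette by solving a linear least-squares problem over unit-flux PSFs.
def gaussian_psf(shape, x0, y0, sigma=1.2):
    """Unit-flux Gaussian PSF, a stand-in for the calibrated PSF model."""
    y, x = np.indices(shape)
    psf = np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2))
    return psf / psf.sum()

def fit_fluxes(imagette, psfs):
    """Solve for the flux of each source by linear least squares."""
    A = np.column_stack([p.ravel() for p in psfs])
    fluxes, *_ = np.linalg.lstsq(A, imagette.ravel(), rcond=None)
    return fluxes

shape = (11, 11)
target = gaussian_psf(shape, 5.0, 5.0)
neighbor = gaussian_psf(shape, 8.0, 4.0)
imagette = 1000.0 * target + 400.0 * neighbor   # contaminated scene
fluxes = fit_fluxes(imagette, [target, neighbor])
print(fluxes)  # ≈ [1000, 400]: contamination separated from the target flux
```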
Long term detrending probably moved to PDC
Telemetry rate: 109 Gb/day uncompressed
Over a 6 yr mission: 30 TB uncompressed
The volume of archived L0, L1 and HK data is expected to be 10-50 times this amount (reformatting and calibration history), i.e. 300-1500 TB
The volume of the science data products is likely to be negligible in comparison (although the complexity of the data may be high).
Ancillary data base: basic stellar observations and parameters, spectra, Gaia specific obs, etc. How big?
The overall data volume should not exceed a few PB, which is not problematic.
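A back-of-the-envelope check of the archive sizing above, using only the figures quoted on the slides:

```python
# Archive volume estimate from the quoted telemetry rate (values from slides).
GB_PER_DAY = 109 / 8           # 109 gigabits/day -> gigabytes/day
days = 6 * 365                 # nominal 6-year mission
raw_tb = GB_PER_DAY * days / 1000
print(f"raw: {raw_tb:.0f} TB")                          # ~30 TB uncompressed
print(f"archive: {10 * raw_tb:.0f}-{50 * raw_tb:.0f} TB")  # roughly the quoted 300-1500 TB
```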