

  1. The Palomar Transient Factory or Adventures in High Fidelity Rapid Turnaround Data Processing at IPAC. Jason Surace. Russ Laher, Frank Masci, Wei Mi (did the IPAC work); Branimir Sesar, Eran Ofek, David Levitan (students & post-docs); Vandana Desai, Carl Grillmair, Steve Groom, Eugean Hacopians, George Helou, Ed Jackson, Lisa Storrie-Lombardi, Lin Yan (IPAC Team); Eric Bellm (Project Scientist), Shri Kulkarni (PI)

  2. What was/is PTF/iPTF? • PTF is a robotic synoptic sky survey system designed to study transient (time-domain) phenomena. • Surveys 1000-3000 square degrees a night, predominantly at R-band to a depth of 20.5. • Primarily aimed at supernova science. • But also can study variable stars, exoplanets, asteroids, etc. • And produces an imaging sky survey like SDSS over larger area. • PTF ran 4 years on-sky starting in 2009, now “iPTF” for another 3. Early foray into the next big theme in astronomy. • Total budget ~$3M.

  3. Former CFHT 12k Camera -> PTF Camera Eliminated nitrogen dewar; camera now mechanically cryo-cooled. New field flattener, etc. 7.8 square degree active area. Surace 2014

  4. The Venerable 48-inch Telescope

  5. PTF camera installed in late 2008; Operations started 2009 Fully robotic operation. Automatically opens, takes calibrations, science data, and adapts to weather closures. Human intervention used to guide science programs.

  6. Infrared Processing and Analysis Center IPAC is NASA’s multi-mission science center and data archive center for IR/submm astronomy. Specifically, we handle processing, archiving, and/or control for numerous missions including: IRAS, ISO, Spitzer, GALEX, Herschel, Planck, and WISE, as well as 2MASS, KI, and PTI. Also the seat of the Spitzer Science Center, NExSci, NED, NStED, and IRSA. Approximately 150 employees in two buildings on the CIT campus.

  7. R-band Holdings 1292 nights, 3.1 million images 47 billion source apparitions (epochal detections)

  8. g-band Holdings 241 nights, 500 thousand images

  9. H-alpha Holdings 99 nights, 125 thousand images

  10. [Dataflow diagram] P48 → ingest at IPAC. Components: NERSC image subtraction and transient detection/RB pipeline (Caltech/Cahill); IPAC realtime image subtraction pipeline → transient candidates; moving object pipeline → SSOs; photometric pipeline → epochal images and catalogs; reference pipeline → reference images and reference catalogs; lightcurve pipeline → lightcurves.

  11. IPAC Infrastructure • Data transmission from Palomar via microwave link to SDSC. • ~1TB of data every 4-5 days. • 24 drones with 240 cores. Mixed Sun and Dell blade units running RHEL. • Roughly 0.5 PB spinning disk in Nexsan storage units. • Associated network equipment. • Database and file servers. • Archive servers. • Tape backup. IPAC Morrisroe Computer Center

  12. Cluster/Parallelization Architecture • PTF data are observed on a fixed system of spatial tiles on the sky. Vastly simplifies data organization and processing. PTF fields and CCD combinations are the basic unit to parallelize processing over multiple cluster nodes. Each node processes a CCD at a time. • “Virtual Pipeline Operator” on a master control node oversees job coordination and staging. • Multi-tiered local scratch disk, “sandbox” (working area) and archive disk structure; inherited architecture from previous projects driven by issues with very large file counts and I/O heavy processes. • Disk system shared with archive for budget constraint issues.

  13. Software Structure • Individual modules written predominantly in C, but also Fortran, Python, MATLAB, and IDL. • Connected with a Perl wrapper infrastructure into discrete pipelines. • Postgres database used for tracking dataflow, data quality, etc. A relational database is not used in the operations system for catalog storage; it isn't needed, and flat-file access is more efficient. • Heavy use of community software: SExtractor, SWarp, SCAMP, astrometry.net, DAOPHOT, HOTPANTS. Cheaper not to re-invent the wheel. • Software replaced as needed by new code development. • Highly agile development program: unknown and changing science requirements, small team, and no separate development system due to budget constraints! • Continuous refinement process. There's a trap with big data development on a new instrument.
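A minimal sketch of the wrapper idea: discrete stages chained by a controller that records dataflow status in a tracking database. The stage names and table schema are invented, and SQLite stands in for the Postgres tracking database (the real wrappers are Perl).

```python
# Each stage is, in the real system, an external module (C, Fortran, etc.);
# the wrapper's job is sequencing and bookkeeping, not the science itself.
import sqlite3

STAGES = ["calibrate", "astrometry", "extract"]  # invented stage names


def run_pipeline(image_id, db):
    for stage in STAGES:
        # A real wrapper would invoke the module as an external executable,
        # e.g. subprocess.run([stage, image_id], check=True)
        db.execute("INSERT INTO tracking VALUES (?, ?, ?)",
                   (image_id, stage, "ok"))
    db.commit()


db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE tracking (image_id TEXT, stage TEXT, status TEXT)")
run_pipeline("ptf_20140101_001", db)
```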

  14. Realtime Pipeline • Realtime – data are processed as received, with turnaround in 20 minutes. Needed for same-night followup. • Astrometrically and photometrically calibrated. • Image subtraction against a reference image library constructed from all the data to date. In-house software. • “Streak detection” for fast-moving objects; the moving object pipeline constructs solar system object tracklets. • Transient candidate detection and extraction via PSF fitting and aperture extraction. • Machine learning “scores” the candidates. • Image subtractions and candidate catalogs are pushed to an external gateway where they are picked up by the solar system, ToO, and extragalactic marshals.

  15. Realtime Image Subtraction and Transient Detection Originally the community “HOTPANTS” package, now replaced with a more sophisticated in-house image subtraction algorithm.
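A toy illustration of the difference-imaging idea above: subtract the reference from a new epochal image so the static sky cancels and transients remain. A real subtraction (whether HOTPANTS or the in-house replacement) also matches the PSF and photometric scaling between the two images, which is omitted here; all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(42)
reference = rng.normal(100.0, 1.0, size=(64, 64))          # deep static sky
science = reference + rng.normal(0.0, 1.0, size=(64, 64))  # new epoch
science[30, 40] += 50.0                                    # injected transient

# Subtract the reference; constant sky structure cancels exactly here
diff = science - reference

# Flag pixels well above the difference-image noise as transient candidates
sigma = np.std(diff)
candidates = np.argwhere(diff > 5 * sigma)
```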

  16. Photometric Pipeline • This pipeline processes data in the traditional manner. • Starts up at the end of the night, after all the data have been received. • Calibration is derived from the entire night's worth of data. Specifically, the bias and flat-fields are derived from the data themselves. • Photometric calibration is derived from extracted photometry of all sources, fitting color, extinction, time, and large-scale spatial variations against the SDSS. Typically reaches an accuracy of a few percent. • Astrometric calibration is done individually at the CCD level, against a combined SDSS and UCAC4 catalog. Typically good to 0.15”. • Outputs from this pipeline are calibrated single-CCD FITS images and single-CCD catalog FITS binary tables (both aperture and PSF-fit). These are archived through IRSA. Available 1-3 days after observation.
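The photometric solution can be illustrated with a least-squares fit of zero-point, color, and extinction terms against a simulated SDSS comparison. The coefficients and scatter below are invented, and the real fit also includes the time and large-scale spatial terms mentioned above.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
color = rng.uniform(0.0, 1.5, n)     # SDSS g - r colors of matched stars
airmass = rng.uniform(1.0, 2.0, n)
zp_true, c_true, k_true = 27.0, 0.08, -0.15   # invented true coefficients

# Simulated (instrumental - SDSS) magnitude differences with 2% scatter,
# roughly the "few %" accuracy quoted on the slide
dmag = zp_true + c_true * color + k_true * airmass + rng.normal(0, 0.02, n)

# Linear least squares for [zero-point, color term, extinction coefficient]
A = np.column_stack([np.ones(n), color, airmass])
zp, c, k = np.linalg.lstsq(A, dmag, rcond=None)[0]
```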

  17. Photometric Pipeline Output Single R-band thumbnail image of Arp 220, 8 arcminutes across. Aperture-extraction catalog (SExtractor-based) overlaid. All observations and detections of everything are saved in the archive. Products are a reduced image, bit-encoded data quality mask, and catalogs. All products are FITS.

  18. Reference Image Pipeline • Once enough individual observations accumulate, the “reference image” pipeline is triggered. • This pipeline coadds the existing data, after selecting “best frames”, e.g. best seeing, photometric conditions, astrometry, etc. • Coaddition is done based on CCD id, PTF tile, and filter. • These images are the reference of the static sky, at a level deeper than the individual observations. • “Reference Catalogs” are extracted from these images. • This concept is important, because these are both the underlying basis of the image subtractions, and also the basis of the light-curve pipeline. • Like PTF coverage, the depth of these is variable, but is currently 5 < n < 50. • Resulting products are FITS images and FITS binary tables.
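The best-frame selection and coaddition can be sketched as below. Selection here is on seeing alone (the real pipeline also cuts on photometric conditions and astrometry), and all array shapes and values are invented; the point is that the coadd's noise drops roughly as 1/sqrt(n), making the reference deeper than any single epoch.

```python
import numpy as np

rng = np.random.default_rng(1)
frames = rng.normal(0.0, 1.0, size=(40, 32, 32))  # 40 epochal images (noise only)
seeing = rng.uniform(1.5, 4.0, size=40)           # per-frame FWHM in arcsec

# Keep the 34 best-seeing frames (cf. the "Stack of 34" example slide),
# then coadd them into the static-sky reference
best = np.argsort(seeing)[:34]
reference = np.mean(frames[best], axis=0)
```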

  19. Reference Images Single Image 60 sec @R Field 5257, Chip 7, Stack of 34

  20. Deep Sky Coadds aka “Reference Images” * Results not typical. Near Galactic Center.

  21. Deep Coadds

  22. Light Curve Pipeline • Each night, all detected sources from the photometric pipeline are matched against the reference catalog (better than a generic catalog-matching approach). • All sources ever seen for a given CCD, PTF tile, and filter combination are loaded and analyzed. • Least variable sources used as anchors for the calibration. • Image-by-image correction factors computed for that image as a whole and stored as a lookup table. • Application of these secondary correction factors improves overall relative calibration to near-millimag levels for bright sources (that part is important). • Triggers less frequently (planned weekly updates). • Highest level of our products.
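The relative-calibration step above can be sketched as follows; the array shapes, scatter levels, and anchor-selection cut are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n_img, n_star = 50, 200
true_mag = rng.uniform(14.0, 18.0, n_star)
zp_err = rng.normal(0.0, 0.05, n_img)   # per-image calibration error
mags = (true_mag[None, :] + zp_err[:, None]
        + rng.normal(0.0, 0.01, (n_img, n_star)))

# Anchors: the least-variable stars (smallest scatter across all images)
scatter = np.std(mags, axis=0)
anchors = np.argsort(scatter)[:50]

# One correction factor per image, computed from the anchors' residuals and
# applied to the image as a whole (stored as a lookup table in the real system)
mean_mag = np.mean(mags[:, anchors], axis=0)
correction = np.median(mags[:, anchors] - mean_mag[None, :], axis=1)
calibrated = mags - correction[:, None]
```

Applying the per-image corrections removes the shared zero-point error, so the per-star scatter tightens toward the intrinsic photometric noise.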

  23. Binary star light curves taken from PTF processed images in Orion. From Van Eyken

  24. Example Light Curves Something a little different: these are relatively faint asteroid light curves, from Chang et al. 2014.

  25. PTF Archive at IRSA Data products can be searched and retrieved via sophisticated GUI tools and also through an application program interface that allows integration of the archive into other, 3rd party software.

  26. PTF Archive at IRSA IRSA is looking to hire a UI software developer; see the Caltech website https://jobs.caltech.edu/postings/2254 or ask Steve Groom at this meeting.

  27. PTF “Marshals” • PTF “Science Marshals” sit on top of the data archive. • Marshals are like interactive science twikis. • Marshals are predominantly written by science users for their science collaborations, with coordinated interaction between them and the ops/archive system. • The ops system produces science products (e.g. data), the archive produces access to science products, the marshals help turn the science products into science results (e.g. papers). • They can be used to classify data, listen for alerts, lay down new observations for robotic followup, coordinate collaborators, etc.

  28. iPTF Extragalactic Marshal

  29. iPTF Extragalactic Marshal

  30. NEA “Streaker” Marshal

  31. NEA “Streaker” Marshal

  32. GRB Target of Opportunity (ToO) Marshal GRBs and (should they ever be detected) gravitational waves can only be localized to tens to a few hundred square degrees. PTF and ZTF can survey these areas in tens of minutes as targets of opportunity to localize fading electromagnetic counterparts. The marshal receives alerts from Fermi and Swift, automatically lays down proposed ToO observations, and alerts a user by phone to activate the followup. iPTF ToO Marshal iPhone App

  33. Zwicky Transient Facility More or less what PTF was, but an order of magnitude more of it. ZTF was awarded full funding through NSF-MSIP (Mid-Scale Innovation Program). ZTF now a roughly 50:50 public:private partnership. Total Budget ~$17M

  34. Wafer-Scale CCDs e2v CCD231-C6 6k x 6k form factor with 15 micron pixels. A little under 4 inches on a side. Focal plane readout time <10 seconds! 16 CCDs, 4 readouts each. And they are cheap. 30-second cadence means 1.2 GB raw data every 45 seconds. ~16x current data rate from PTF. 5 CCDs in-hand, remaining 11 now ordered.
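The quoted ~1.2 GB per exposure is consistent with a back-of-the-envelope check, assuming 16-bit raw pixels and an exact 6144-pixel format (both assumptions; the slide says only "6k x 6k"):

```python
n_ccds = 16
pixels_per_ccd = 6144 * 6144   # "6k x 6k" format; exact dimensions assumed
bytes_per_pixel = 2            # assuming 16-bit raw pixels
frame_bytes = n_ccds * pixels_per_ccd * bytes_per_pixel
frame_gb = frame_bytes / 1e9   # ~1.2 GB per exposure, matching the slide
```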

  35. ZTF camera FOV is 50 square degrees. Largest camera on >1m telescope by area in the world. Or, to make a little clearer, here’s Orion. The white box is the ZTF imaging area. The moon is in the upper right corner of the white box.

  36. And to Process All This? IPAC is the data processing and archive center for all aspects of ZTF. Continuous raw data flow of 30 MB/s. 0.5-1 PB/yr of data products. Drone farm of 128 computers. Replication of proven PTF design in subunits similar to the PTF data load (camera quadrants).

  37. Transient Science Summer Schools

  38. Schedule • Early 2014 – PTF data for selected high cadence fields (M81, Beehive, Orion, Kepler, Stripe 82, Cass-A). • 2015 – Complete PTF archive release. • 2016 – Rolling releases of the iPTF archive, including deep reference images and light curves. • 2017 – ZTF first light (Jan), commissioning of camera, building of new reference images. • 2018 – First ZTF data release (images, catalogs, light curves, transient candidates). • 2019 – Release of transient alerts. • 2020 – NSF funded period ends. Project continues with private partners.

  39. http://ptf.caltech.edu
