
Data Pipeline (ORAC-DR)

A large network of telescopes such as LCOGT's, used for a very diverse set of scientific goals, raises unique challenges that are not present in a single-purpose survey or a traditional common-user facility. The large number of instruments and the volume of data they will generate mean that LCOGT, as the data originator, is in the best position to understand the data and to reduce them optimally. On the other hand, the wide variety of scientific programs running on the network, and their diverse data-reduction needs, make it almost impossible for a generalized pipeline to be optimal for every potential science case.

The aims of the pipeline are to do the best we can for the bulk of potential users and to make pipeline products that are of the most general use. At the same time, we aim to avoid controversial data-reduction steps that could be problematic for end users of the products, and to avoid attempting to do the end users' science for them. In addition, the pipeline emphasizes recording the processing steps performed, the parameters used, and the software versions employed; this record is vital for the traceability of the reduced data and for documenting their provenance.

The pipeline is built using the generalized infrastructure of ORAC-DR (ORAC-DR website). It is entirely data-driven and requires no user input. Processing is controlled by modular recipes that define the steps necessary to reduce the data. Each recipe is a list of data reduction steps to perform on each frame or group of frames. These individual steps are known as primitives, and each primitive performs one astronomically significant step such as dark subtraction or source catalog production. As most of the data reduction steps are common across classes of instruments, a small set of primitives is sufficient for the majority of the processing needed in the LCOGT pipeline.
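
As a rough illustration of the recipe/primitive structure, the sketch below treats a recipe as an ordered list of primitives, each applied in turn to a frame. This is a hypothetical Python analogue only: the actual ORAC-DR recipes and primitives are Perl components, and the names used here (subtract_dark, make_source_catalog, run_recipe) are illustrative.

    # Hypothetical Python analogue of the recipe/primitive structure;
    # the real ORAC-DR recipes and primitives are Perl components.

    def subtract_dark(frame):
        # One astronomically significant step: dark subtraction (illustrative).
        frame["data"] = frame["data"] - frame["dark"]
        return frame

    def make_source_catalog(frame):
        # Another primitive: source catalog production (illustrative stub).
        frame["catalog"] = []  # detection and extraction would happen here
        return frame

    # A recipe is an ordered list of primitives applied to each frame.
    REDUCE_SCIENCE = [subtract_dark, make_source_catalog]

    def run_recipe(recipe, frame):
        for primitive in recipe:
            frame = primitive(frame)
        return frame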

The main recipes currently in use handle the combining of raw bias, dark and flat frames into master calibration frames and the processing of regular science frames to produce the BCD (Basic Calibrated Data) products. For these science frames we perform the following operations (see the sketch after this list):

  • Bad-pixel masking
  • Bias subtraction
  • Dark subtraction
  • Flat field correction
  • Astrometric solution
  • Source catalog production
  • Zeropoint determination
  • Per-object airmass and barycentric time correction computation
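
As a minimal sketch of how the instrumental corrections chain together, assuming NumPy arrays for the frame and calibration data and a dark frame normalized per second of exposure, the first four steps might look like the following; the function name and arguments are hypothetical, not the pipeline's actual interface.

    import numpy as np

    def reduce_science_frame(data, bad_pixel_mask, bias, dark_per_sec, flat, exptime):
        # Hypothetical sketch of the instrumental corrections listed above.
        data = np.where(bad_pixel_mask, np.nan, data)   # bad-pixel masking
        data = data - bias                              # bias subtraction
        data = data - dark_per_sec * exptime            # dark subtraction
        data = data / flat                              # flat-field correction
        # Astrometric solution, source catalog production, zeropoint
        # determination, and per-object airmass / barycentric time
        # corrections would follow on the corrected frame.
        return data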


We perform bad-pixel masking, bias and dark subtraction, and flat-field correction in the normal manner, using the ORAC-DR calibration infrastructure to select the nearest (in time) calibration frame that satisfies constraints such as binning and filter. For the astrometric solution, we use autoastrom against the UCAC3 catalog (for 1.0 m and 0.4 m data; UCAC3 website) or the Tycho-2 catalog (for Context Camera data; Tycho-2 description).
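
The nearest-in-time selection can be sketched as follows; this is an assumed, simplified stand-in for the ORAC-DR calibration rules, and the dictionary keys (obs_time, binning, filter) are illustrative.

    def pick_calibration(frames, obs_time, binning, filter_name=None):
        # Pick the calibration frame closest in time that satisfies the
        # constraints (illustrative; the real rules live in ORAC-DR's
        # calibration infrastructure).
        candidates = [f for f in frames
                      if f["binning"] == binning
                      and (filter_name is None or f["filter"] == filter_name)]
        if not candidates:
            raise ValueError("no calibration frame satisfies the constraints")
        return min(candidates, key=lambda f: abs(f["obs_time"] - obs_time))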

Source catalogs are produced using the SExtractor software to perform object detection, source extraction and aperture photometry. In total, we output 49 parameters for each detected source: positional information with estimated errors, information on the shape and extent, and the measured flux and flux error in four fixed apertures (1”, 3”, 5” & 7”) and two variable apertures.
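
To illustrate the detection and fixed-aperture measurements, the sketch below uses the Python sep bindings as an analogue of the SExtractor step; the detection threshold, file name, and pixel scale are assumptions, and the real pipeline runs SExtractor itself with its own configuration.

    import numpy as np
    import sep
    from astropy.io import fits

    # Illustrative analogue of the detection and fixed-aperture photometry;
    # the threshold and pixel scale below are assumed, not the pipeline's values.
    data = fits.getdata("science_frame.fits").astype(np.float32)
    bkg = sep.Background(data)
    data_sub = data - bkg.back()

    # Object detection at an assumed 1.5-sigma threshold above the background.
    objects = sep.extract(data_sub, 1.5, err=bkg.globalrms)

    pixel_scale = 0.39  # arcsec/pixel (assumed; use the instrument's value)
    for diameter in (1.0, 3.0, 5.0, 7.0):  # the four fixed apertures, in arcsec
        radius_pix = 0.5 * diameter / pixel_scale
        flux, fluxerr, flag = sep.sum_circle(data_sub, objects["x"],
                                             objects["y"], radius_pix,
                                             err=bkg.globalrms)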
