The HFI DPC

From Planck PLA Wiki

---------
The first stage of HFI data processing is performed on-board in order to generate the telemetry, as described in the [[HFI_detection_chain#Data_compression | Data compression]] section. On the ground, the HFI DPC is organized into different "Levels": 1, 2, 3, 4 and "S". In brief, during operations, L1 feeds the database, resulting in time-ordered information (TOI) objects. L2 is the core of the processing, which turns TOIs into clean, calibrated sky maps. L3 transforms these maps at specific frequencies into more scientific products, such as catalogues, maps and spectra of astrophysical components. L3 can rely on simulations provided by Level S (LS), while L4 refers to delivering the DPC products to ESA. This processing relies on dedicated software and hardware infrastructures developed pre-launch.
<br>
 
<span style="font-size:150%">'''Contents of this chapter'''</span>
 
 
 
<br>
 
; [[The_HFI_DPC#overview|Overview]]
 
 
 
; [[Pre-processing| Pre-processing]]: ''[[Pre-processing#Overview|Overview]] • [[Pre-processing#Telemetry_data|Telemetry data]] • [[Pre-processing#Pointing_data|Pointing data]] • [[Pre-processing#Orbit_data|Orbit data]] • [[Pre-processing#Time_correlation_data|Time correlation data]]''
 
; [[TOI_processing| TOI processing]]: ''[[TOI_processing#Overview|Overview]] • [[TOI_processing#Input_TOI|Input TOI]] • [[TOI_processing#General_Pipeline_Structure|General Pipeline Structure]] • [[TOI_processing#Output_TOIs_and_products|Output TOIs and products]] • [[TOI_processing#Examples_of_clean_TOIs|Examples of clean TOIs]] • [[TOI_processing#Trends_in_the_output_processing_variables|Trends in the output processing variables]] • [[TOI_processing#Flag_description|Flag description]] ''
 
; [[Pointing&Beams| Pointing&Beams]]: ''[[Pointing&Beams#Detector_Pointing|Detector Pointing]] • [[Pointing&Beams#Scanning_Beams|Scanning Beams]] • [[Pointing&Beams#Effective_Beams|Effective Beams]]''
 
; [[Map-making| Map-Making and photometric calibration]]: ''[[Map-making#Introduction|Introduction]] • [[Map-making#Photometric_calibration|Photometric calibration]] • [[Map-making#Building_of_Maps|Building of Maps]] • [[Map-making#Noise_properties|Noise properties]] •  [[Map-making#Zodi_correction|Zodi correction]] • [[Map-making#Far_Sidelobe_Correction|Far Sidelobe Correction]] • [[Map-making#CO_Correction|CO Correction]] • [[Map-making#Map_validation|Map validation]]''
 
; [[HFI-UcCC| Spectral response]]: TBW
 
; [[HFI-Validation| Internal overall validation]]: ''[[HFI-Validation#Expected_systematics_and_tests_(bottom-up approach)|Expected systematics and tests (bottom-up approach)]] • [[HFI-Validation#Generic_approach_to_systematics|Generic approach to systematics]] • [[HFI-Validation#HFI_simulations|HFI simulations]] • [[HFI-Validation#Simulations_versus_data|Simulations versus data]] • [[HFI-Validation#Systematics_Impact_Estimates|Systematics Impact Estimates]]''
 
; [[PowerSpectra| Power spectra]]: TBW
 
; [[Summary| Summary of HFI data characteristics]]
 
 
 
<br>
 
 
 
---------
 
 
 
<br>
 
 
 
<span id="overview" style="font-size:200%">'''Overview'''</span>
 
 
 
 
 
 
The data processing applied for the ''"Early Planck results"'' series of publications was described in {{PlanckPapers|planck2011-1-7}}. {{PlanckPapers|planck2013-p03}} and its co-papers provide the reference for the processing done for the 2013 data release.
  
 
<span style="font-size:150%">'''Level 1: building the reference database during flight operations''' </span>
 
 
(L1): consists in receiving the telemetry and ancillary data files and ingesting them into the DPC database. This involves decompressing, in some cases changing data formats, and computing the time of individual data samples from the time of the compression slices, but otherwise no processing proper. Other steps are:
* science, housekeeping and ancillary data ingestion
* timing and pointing interpolation
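The timing step above can be sketched as follows. This is a hypothetical illustration, not actual DPC code: it assumes that each compression slice stores the on-board time of its first sample and that samples within a slice are evenly spaced (HFI samples each bolometer at roughly 180 Hz, i.e. a period of about 5.5 ms).

```python
# Hypothetical sketch of the sample-timing step: reconstruct one timestamp per
# science sample from the times of the telemetry compression slices.
# Assumption: each slice stores the time of its first sample, and samples
# within a slice are evenly spaced.

def sample_times(slice_times, samples_per_slice, sampling_period):
    """Return a timestamp (seconds) for every science sample."""
    times = []
    for t0 in slice_times:                  # one entry per compression slice
        for k in range(samples_per_slice):  # samples packed into that slice
            times.append(t0 + k * sampling_period)
    return times

# Toy example: two slices of 4 samples at an HFI-like ~5.5 ms period.
period = 0.00554
ts = sample_times([0.0, 4 * period], samples_per_slice=4, sampling_period=period)
```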
 
 
 
 
This is further described in the [[Pre-processing]] section.
 
  
<span style="font-size:150%">'''Level 2: converting temporal information into clean calibrated maps'''</span>
  
 
(L2): this is where the data are processed from timelines into maps. The main processing steps are:
* Timeline (or Time-Ordered Information = TOI) processing, which includes conversion from ADUs to engineering units (volts), demodulation, deglitching, conversion from engineering to physical units (watts), removal of known systematic effects (non-linearities, 4K lines, jumps, ring flagging), removal of the instrumental signature (time transfer function), and temporal noise estimation. See section [[TOI_processing]].
* Pointing and beam of each detector. See sections [[Detector pointing]] and [[Beams]].
 
* Map-making & photometric calibration: projecting the TOIs onto all-sky maps, etc. See section [[Map-making]].
 
* Characterisation/validation through angular power spectra. See section [[PowerSpectra]].
 
* Overall HFI data validation, through difference tests, comparison to detailed simulations, etc. See section [[HFI-Validation]].
 
* The resulting data characteristics are given in section [[Summary]].
 
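As a much-simplified illustration of the deglitching step in the TOI processing listed above, the sketch below flags samples that deviate strongly from a robust baseline. The real pipeline is considerably more elaborate (it fits glitch templates per bolometer); all names here are hypothetical.

```python
# Hypothetical sketch of TOI deglitching by sigma-clipping against a robust
# (median/MAD) baseline. It flags cosmic-ray hits as outliers; the production
# HFI pipeline instead fits per-bolometer glitch templates.
import statistics

def flag_glitches(toi, threshold=5.0):
    """Return one boolean per sample: True where the sample is an outlier."""
    med = statistics.median(toi)
    mad = statistics.median(abs(x - med) for x in toi)
    sigma = 1.4826 * mad          # MAD -> sigma for Gaussian noise
    return [abs(x - med) > threshold * sigma for x in toi]

# Toy TOI with one cosmic-ray hit at index 4.
toi = [0.1, -0.2, 0.0, 0.15, 25.0, -0.1, 0.05]
flags = flag_glitches(toi)
```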
  
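The projection at the heart of the map-making step above can be caricatured as binning calibrated TOI samples into sky pixels; the production pipeline additionally solves for offsets (destriping) and for the photometric calibration. A minimal sketch with hypothetical names:

```python
# Hypothetical sketch of naive binned map-making: average the calibrated TOI
# samples falling into each sky pixel. Destriping and calibration, which the
# real pipeline performs on top of this projection, are omitted.

def bin_map(pixels, toi, npix):
    """pixels: one sky-pixel index per TOI sample (the pointing)
       toi:    calibrated signal per sample
       npix:   number of sky pixels
       Returns (sky, hits): per-pixel mean signal and per-pixel hit counts."""
    sky = [0.0] * npix
    hits = [0] * npix
    for p, d in zip(pixels, toi):
        sky[p] += d
        hits[p] += 1
    return [s / h if h else 0.0 for s, h in zip(sky, hits)], hits

# Toy scan: four samples over a four-pixel sky; pixel 1 is hit twice.
sky, hits = bin_map([0, 1, 1, 2], [1.0, 2.0, 4.0, 8.0], npix=4)
```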
<span style="font-size:150%">'''Level 3: basic analyses of (Level 2) sky temperature maps'''</span>
  
(L3): this is where the data, in the form of frequency maps, are converted to catalogues and full-sky astrophysical component maps. Much of this is done in common with the LFI DPC, and is further described in the [[HFI/LFI joint data processing | HFI/LFI common sections]].
  
<span style="font-size:150%">'''Level S : a common HFI/LFI simulation software'''</span>
  
Level S is the so-called "Simulation Level" software suite common to both consortia, which, given a sky model (generated by the Planck sky model, <tt>PSM</tt>), detector pointings and beams, generates the power incident on each detector. It can also provide a simplified description of, e.g., the noise. It is further described in the HFI/LFI [[HFI/LFI joint data processing| common section]]. HFI-specific developments (configuration control & MC bench, specific effects like 4K lines, glitches, ADC non-linearity, etc.) are described in [[HFI-Validation | the HFI data validation section]].
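Stripped of beam convolution and of the PSM itself, the core Level S idea reduces to scanning a pixelised sky model along the detector pointing and adding a simplified noise term. The sketch below is a hypothetical illustration under those assumptions, not Level S code:

```python
# Hypothetical sketch of the Level-S scanning step: given a pixelised sky
# model and a detector pointing (one pixel index per sample), produce a
# simulated TOI with an optional white-noise term. Beam convolution is omitted.
import random

def scan_sky(sky, pointing, gain=1.0, noise_sigma=0.0, seed=0):
    """Return one simulated TOI sample per pointing entry."""
    rng = random.Random(seed)
    return [gain * sky[p] + rng.gauss(0.0, noise_sigma) for p in pointing]

# Toy three-pixel sky scanned noiselessly over four samples.
sky_model = [10.0, 20.0, 30.0]
toi = scan_sky(sky_model, [0, 2, 1, 1])
```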
  
<span style="font-size:150%">'''HFI DPC Infrastructures'''</span>
  
The HFI Data Processing Centre can be thought of as a centralized backbone providing hardware and software infrastructures to a relatively large number of geographically distributed groups of developers and other R&D groups in the HFI and LFI core teams. An overview was given in {{PlanckPapers|planck2011-1-7}}. In particular:
* Code and configuration management,
* Data management,
* Instrument model (IMO) database,
* Data flow management,
* Hardware.
  
== References ==
<References />
 
  
[[Image:HFI_logo_H.png]]
 
  
[[Category:HFI data processing|000]]

Latest revision as of 13:50, 23 July 2014
