'''The HFI DPC'''

The first stage of HFI data processing is performed on-board in order to generate the telemetry, as described in the [[HFI_detection_chain#Data_compression | Data compression]] section. On the ground, the HFI DPC is organized into different "Levels": 1, 2, 3, 4, and "S". In brief, during operations, L1 feeds the database, resulting in time-ordered information (TOI) objects. L2 is the core of the processing, turning TOIs into clean, calibrated sky maps. L3 transforms these maps at specific frequencies into higher-level scientific products, such as catalogues, maps, and spectra of astrophysical components. L3 can rely on simulations provided by Level S (LS), while L4 refers to the delivery of the DPC products to ESA. This processing relies on dedicated software and hardware infrastructures developed before launch.
  
The data processing applied for the ''"Early Planck results"'' series of publications was described in {{PlanckPapers|planck2011-1-7}}. {{PlanckPapers|planck2013-p03}} and its companion papers provide the reference for the processing done for the 2013 data release.
  
<span style="font-size:150%">'''Level 1: building the reference database during flight operations''' </span>
(L1): this consists of receiving the telemetry and ancillary data files and ingesting them into the DPC database. It involves decompression and, in some cases, changes of data format, as well as computing the time of individual data samples from the time of the compression slices, but otherwise no processing proper. Other steps are:
* science, housekeeping, and ancillary data ingestion;
* timing and pointing interpolation.
This is further described in the [[Pre-processing]] section.
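The timing computation mentioned above can be illustrated with a short sketch. This is a toy model, not DPC code: the sampling frequency and slice length below are hypothetical placeholders (the real values are fixed by the HFI on-board compression scheme), and it simply assumes each compression slice carries the time of its first sample, so individual sample times follow by extrapolation at the sampling rate.

```python
import numpy as np

# Hypothetical values for illustration only; the real slice length and
# sampling frequency are set by the HFI on-board compression scheme.
F_SAMP = 180.0           # assumed sampling frequency (Hz)
SAMPLES_PER_SLICE = 254  # assumed number of samples per compression slice

def sample_times(slice_times):
    """Expand per-slice timestamps into per-sample times.

    slice_times : times of the first sample of each compression slice (s).
    Returns one time per data sample.
    """
    slice_times = np.asarray(slice_times, dtype=float)
    offsets = np.arange(SAMPLES_PER_SLICE) / F_SAMP  # within-slice offsets
    # Broadcast one row of sample times per slice, then flatten.
    return (slice_times[:, None] + offsets[None, :]).ravel()

# Two contiguous slices: sample times should be evenly spaced throughout.
times = sample_times([0.0, SAMPLES_PER_SLICE / F_SAMP])
```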
<span style="font-size:150%">'''Level 2: converting temporal information into clean calibrated maps'''</span>

(L2): this is where the data are processed from timelines into maps. The main processing steps are:

* timeline (or time-ordered information, TOI) processing, which includes conversion from ADUs to engineering units (volts), demodulation, deglitching, conversion from engineering to physical units (watts), removal of known systematic effects (non-linearities, 4-K lines, jumps, ring flagging), removal of the instrumental signature (time transfer function), and temporal noise estimation (see the [[TOI_processing]] section);
* pointing and beam construction for each detector (see the [[Detector pointing]] and [[Beams]] sections);
* mapmaking and photometric calibration, projecting the TOIs onto all-sky maps (see the [[Map-making]] section);
* characterization and validation through angular power spectra (see the [[PowerSpectra]] section);
* overall HFI data validation, through difference tests, comparison to detailed simulations, etc. (see the [[HFI-Validation]] section).
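Two of the TOI steps above can be sketched in a few lines. This is a deliberately simplified illustration, not the DPC algorithm: the real deglitcher fits glitch templates, and the square-wave modulation model and sigma-clipping threshold here are assumptions made for the example.

```python
import numpy as np

def demodulate(raw, first_parity=+1):
    """Undo a square-wave modulation in which consecutive samples
    alternate in sign (toy model: one sample per modulation state)."""
    parity = first_parity * (-1) ** np.arange(raw.size)
    return raw * parity

def flag_glitches(toi, nsigma=5.0):
    """Flag samples deviating more than nsigma from the median,
    using a robust MAD-based estimate of the noise level."""
    resid = toi - np.median(toi)
    sigma = 1.4826 * np.median(np.abs(resid))  # MAD -> Gaussian sigma
    return np.abs(resid) > nsigma * sigma

rng = np.random.default_rng(0)
signal = 2.0 + 0.01 * rng.standard_normal(1000)  # constant sky + noise
raw = demodulate(signal)   # with +/-1 parity, modulating == demodulating
toi = demodulate(raw)      # recover the unmodulated timeline
toi[123] += 1.0            # inject a cosmic-ray-like glitch
mask = flag_glitches(toi)  # the glitch sample is flagged
```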

The resulting data characteristics are given in the [[Summary]] section.
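The projection of TOIs onto maps can be sketched as a naive binned mapmaker: average all TOI samples falling in each sky pixel. This is only the zeroth-order idea, under stated assumptions; the HFI pipeline uses destriping on HEALPix maps, and the flat pixel indices below stand in for real detector pointing.

```python
import numpy as np

def bin_map(toi, pixels, npix):
    """Naive mapmaking: each map pixel is the average of all TOI samples
    observed in it (pixels = per-sample pixel index)."""
    signal = np.bincount(pixels, weights=toi, minlength=npix)  # sum per pixel
    hits = np.bincount(pixels, minlength=npix)                 # samples per pixel
    sky = np.full(npix, np.nan)   # unobserved pixels stay NaN
    seen = hits > 0
    sky[seen] = signal[seen] / hits[seen]
    return sky, hits

# Toy scan: revisit a few pixels several times with a noiseless signal.
pixels = np.array([0, 1, 2, 1, 0, 3, 3, 3])
toi = np.array([1.0, 2.0, 3.0, 2.0, 1.0, 4.0, 4.0, 4.0])
sky, hits = bin_map(toi, pixels, npix=5)
```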

<span style="font-size:150%">'''Level 3: basic analyses of (Level 2) sky temperature maps'''</span>

(L3): this is where the data, in the form of frequency maps, are converted into catalogues and full-sky astrophysical component maps. Much of this is done in common with the LFI DPC and is further described in the [[HFI/LFI joint data processing | HFI/LFI common sections]].

<span style="font-size:150%">'''Level S: a common HFI/LFI simulation software'''</span>

Level S is the so-called "Simulation Level" software suite, common to both consortia, which, given a sky model (generated by the Planck Sky Model, <tt>PSM</tt>), detector pointings, and beams, generates the power falling on each detector. It can also provide a simplified description of the noise. It is further described in the HFI/LFI [[HFI/LFI joint data processing| common section]]. HFI-specific developments (configuration control and MC bench, and specific effects such as 4-K lines, glitches, and ADC nonlinearity) are described in [[HFI-Validation | the HFI data validation section]].
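The core of such a timeline simulation can be reduced to a few lines: given a pixelized sky model and a pointing sequence, the simulated timeline is the sky value seen at each pointing, plus noise. This is a stripped-down sketch assuming white noise, ignoring beams, bandpasses, and the instrumental effects listed above.

```python
import numpy as np

def simulate_toi(sky_map, pointing_pix, noise_sigma, rng):
    """Scan a pixelized sky model along a pointing sequence and add
    white noise -- a minimal stand-in for a Level S timeline."""
    signal = sky_map[pointing_pix]  # sky power per sample
    noise = noise_sigma * rng.standard_normal(pointing_pix.size)
    return signal + noise

rng = np.random.default_rng(1)
sky_map = np.array([10.0, 20.0, 30.0])  # toy 3-pixel "sky model"
pointing = np.tile([0, 1, 2], 1000)     # repeated scan over the pixels
toi = simulate_toi(sky_map, pointing, noise_sigma=0.5, rng=rng)
# Averaging the timeline per pixel recovers the input sky model.
```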

<span style="font-size:150%">'''HFI DPC Infrastructures'''</span>

The HFI Data Processing Centre can be thought of as a centralized backbone, providing hardware and software infrastructure to a relatively large number of geographically distributed groups of developers and other research groups in the HFI and LFI core teams. An overview was given in {{PlanckPapers|planck2011-1-7}}. Particular tasks include:

* code and configuration management;
* data management;
* the instrument model (IMO) database;
* data flow management;
* hardware.

== References ==
<References />

[[Image:HFI_logo_H.png]]

[[Category:HFI data processing|000]]

''Latest revision as of 01:28, 23 May 2015''
