Last modified: 7 November 2022

URL: https://cxc.cfa.harvard.edu/ciao/caveats/hrc_dtcor.html

HRC Deadtime Corrections with Telemetry Saturation


Abstract

The original DTCOR, and hence EXPOSURE, values were not correct for HRC observations which experienced telemetry saturation. The hrc_dtfstats tool was updated to use a simple mean instead of a variance-weighted mean, which corrects this problem. chandra_repro runs hrc_dtfstats, so data that have been recalibrated have the correct DTCOR and EXPOSURE values. All HRC data in the archive have been reprocessed to correct this; however, users working with older versions of the data should review this caveat.

Of the ~1800 HRC ObsIDs checked (both HRC-I and HRC-S), the change in how DTCOR is calculated alters the value by less than 5% for 93% of ObsIDs and by less than 10% for 96%. Only about 1% of ObsIDs have DTCOR change by more than 50%.
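For data retrieved before the archive reprocessing, the simplest way to pick up the corrected DTCOR and EXPOSURE values is to recalibrate with chandra_repro. A minimal sketch, assuming the data for ObsID 3698 (discussed below) sit in a directory named 3698 and that the reprocessed event file follows the usual hrcfNNNNN_repro_evt2.fits naming convention:

unix% chandra_repro indir=./3698 outdir=./3698/repro
unix% dmkeypar ./3698/repro/hrcf03698_repro_evt2.fits DTCOR echo+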


The HRC deadtime has two components: event processing time and telemetry saturation (Juda and Dobrzycki 1999). The deadtime correction factors are calculated in 2-second increments over the course of an observation using the telemetered event rate, the valid event rate, and the total event rate. These are available in the dtf1 file associated with each observation.
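The per-interval factors can be inspected directly with dmlist; for example, using the ObsID 3698 dtf1 file that is examined below (the file lists, for each interval, the time along with the DTF value and its error):

unix% dmlist hrcf03698_000N003_dtf1.fits cols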

The CIAO tool hrc_dtfstats is used to compute the overall deadtime correction to the exposure time (see the CIAO thread Computing Average HRC Dead Time Corrections). This tool computes the variance-weighted average deadtime factor, DTCOR. DTCOR, and hence EXPOSURE, are not correct for observations which are subject to telemetry saturation.

For example, average count rates for Cas A, which should be a constant source, are highly variable.

HRC-I Cas A Count Rates

[Figure: HRC-I Cas A count rates from the level=2 event files, filtered on the default GTI (but with no additional filtering) and with the exposure time calculated using the hrc_dtfstats DTCOR correction. The count rates vary between 0.8 and 1.05 count/s/arcsec^2, with only one point below 0.9.]

The most egregious point, at ~50 months since 1/1/1999, is for ObsID 3698. The rate in the figure is computed with the DTCOR value from the std_dtfstat1 file, 0.972:

unix% dmlist hrcf03698_000N003_std_dtfstat1.fits"[col dtcor]"  data
---------------------------------------------------------------------
 Data for Table Block DTFSTATS
---------------------------------------------------------------------
 
 ROW    DTCOR
 
   1     0.97192718881085

A simple mean of the DTF, however, is 0.766, which, if used instead, would lead to a rate much more in line with the others.
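Because the count rate scales inversely with DTCOR (DTCOR is the LIVETIME/ONTIME ratio, and EXPOSURE tracks LIVETIME for the HRC), a rough check of the size of the effect is:

   corrected rate ~ old rate x (0.972 / 0.766) ~ old rate x 1.27

which would raise the single point below 0.9 count/s/arcsec^2 to roughly the level of the other observations. (This is a back-of-the-envelope estimate, not a value taken from the reprocessed data.)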

unix%  dmstat hrcf03698_000N003_dtf1.fits"[col dtf]"
 DTF
   min: 0             @:        1
   max: 1             @:        2
  mean: 0.76564416062
  sigma: 0.270005164
   sum: 2081.0208286
  good: 2718
  null: 0

The cause of the discrepancy is the DTF errors, in particular the relative sizes of the errors in the unsaturated versus saturated telemetry cases. The errors during telemetry saturation are ~10 times greater than those during unsaturated intervals. Since hrc_dtfstats uses a variance-weighted average of the DTFs to compute DTCOR, the contribution of DTFs during saturated intervals is much reduced.
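Schematically, writing dtf_i for the deadtime factor in interval i and sigma_i for its error, the two estimators are (a sketch of the calculation, not the tool's exact code):

   DTCOR (variance-weighted) = sum_i( dtf_i / sigma_i^2 ) / sum_i( 1 / sigma_i^2 )
   DTCOR (simple mean)       = ( sum_i dtf_i ) / N

Since sigma_i is roughly 10 times larger during saturated intervals, the weights 1/sigma_i^2 for those intervals are roughly 100 times smaller, so the weighted average is dominated by the unsaturated intervals and overestimates DTCOR.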

The hrc_dtfstats tool was updated in CIAO 4.4 to use a simple mean instead of a variance-weighted mean. Re-running hrc_dtfstats on this dataset in CIAO 4.4 produces a new DTFSTATS file with the expected result:

unix% hrc_dtfstats hrcf03698_000N003_dtf1.fits 3698_dtf_new.fits gtifile="hrcf03698N003_evt2.fits"

unix% dmlist 3698_dtf_new.fits data
 
--------------------------------------------------------------------------------
Data for Table Block DTFSTATS
--------------------------------------------------------------------------------
 
ROW    DTCOR
 
     1     0.75534347952970
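The new DTCOR can then be propagated to the event file. The simplest route is to recalibrate with chandra_repro, as sketched above; alternatively, the DTCOR keyword can be updated by hand with dmhedit, with the LIVETIME and EXPOSURE keywords adjusted consistently (see the Computing Average HRC Dead Time Corrections thread for the full procedure). A sketch of the keyword update, using the value from the output above and the event file named in the gtifile argument:

unix% dmhedit infile=hrcf03698N003_evt2.fits filelist=none operation=add key=DTCOR value=0.75534347952970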