Last modified: 10 December 2024

Hardness Ratio


In X-ray astronomy, a hardness ratio is the equivalent of a photometric color index. It is calculated as a normalized difference of the exposure-corrected counts in two energy bands, A (high energy) and B (low energy). A typical definition is \(HR = \frac{A - B}{A + B}\), but other schemes are also used.

The Chandra Source Catalog release 2 hardness ratios use the typical scheme:

\[ HR_{xy} = \frac{F(x) - F(y)}{F(x) + F(y)} \]

where F(x) and F(y) are the aperture fluxes for a pair of energy bands, x and y, with x the higher-energy band of the pair.
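
As a minimal illustration of this definition (a sketch, not CSC code; the flux values below are hypothetical placeholders), the release 2 form can be evaluated with a short Python function:

```python
def hardness_ratio(f_x, f_y):
    """Release 2 style hardness ratio: (F(x) - F(y)) / (F(x) + F(y)),
    where x is the higher-energy band of the pair."""
    total = f_x + f_y
    if total == 0:
        raise ValueError("Both band fluxes are zero; HR is undefined.")
    return (f_x - f_y) / total

# Hypothetical aperture photon fluxes (photon/cm^2/s):
f_hard = 2.1e-6
f_soft = 5.4e-6
print(hardness_ratio(f_hard, f_soft))  # negative value -> spectrally soft source
```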

The Chandra Source Catalog release 1 hardness ratios instead use the sum of all three wide energy bands in the denominator:

\[ HR_{xy} = \frac{F(x) - F(y)}{F(s) + F(m) + F(h)} \]

where F(x) and F(y) are the aperture fluxes for a pair of energy bands, with x again the higher-energy band, and F(s), F(m), and F(h) are the aperture source photon fluxes in the soft, medium, and hard bands.
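
The release 1 form differs only in the denominator. A sketch of that variant, again with hypothetical flux values:

```python
def hardness_ratio_r1(f_x, f_y, f_s, f_m, f_h):
    """Release 1 style hardness ratio: the numerator uses the band pair
    (x, y); the denominator sums the soft, medium, and hard wide-band
    aperture photon fluxes."""
    total = f_s + f_m + f_h
    if total == 0:
        raise ValueError("Total broad-band flux is zero; HR is undefined.")
    return (f_x - f_y) / total

# Hypothetical soft, medium, and hard aperture photon fluxes:
f_s, f_m, f_h = 5.4e-6, 3.0e-6, 2.1e-6
print(hardness_ratio_r1(f_h, f_m, f_s, f_m, f_h))  # HR for the (hard, medium) pair
```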

For a discussion of different definitions of the hardness ratio and their biases in the low-counts regime, users can consult the paper by Park et al.

The CIAO color_color tool can be used to compare the expected hardness ratios of a spectral model as its parameters are varied.
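
As a rough conceptual illustration of that kind of comparison (this sketch does not use the color_color tool itself, and it ignores the instrument response and absorption), one can tabulate the expected hardness ratio of a simple unabsorbed power-law spectrum as its photon index varies, using hypothetical band boundaries:

```python
import numpy as np

def powerlaw_band_flux(gamma, e_lo, e_hi):
    """Photon flux of an unabsorbed power law N(E) ~ E**(-gamma),
    integrated over [e_lo, e_hi] keV (arbitrary normalization)."""
    if np.isclose(gamma, 1.0):
        return np.log(e_hi / e_lo)
    return (e_hi**(1.0 - gamma) - e_lo**(1.0 - gamma)) / (1.0 - gamma)

# Hypothetical band definitions in keV: soft 0.5-1.2, hard 2.0-7.0
soft_band, hard_band = (0.5, 1.2), (2.0, 7.0)

for gamma in [1.0, 1.5, 2.0, 2.5, 3.0]:
    f_soft = powerlaw_band_flux(gamma, *soft_band)
    f_hard = powerlaw_band_flux(gamma, *hard_band)
    hr = (f_hard - f_soft) / (f_hard + f_soft)
    print(f"gamma={gamma:.1f}  HR={hr:+.2f}")  # steeper spectra give softer (more negative) HR
```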