Last modified: December 2022

URL: https://cxc.cfa.harvard.edu/ciao/ahelp/glvary.html
AHELP for CIAO 4.15

glvary

Context: Tools::Timing

Synopsis

Search for variability using the Gregory-Loredo algorithm.

Syntax

glvary  infile outfile lcfile effile [frac] [seed] [mmax] [mmin] [nbin]
[mintime] [clobber] [verbose]

Description

`glvary' implements the Gregory-Loredo variability test algorithm. It splits the events into a series of binnings with increasing numbers of time bins and looks for significant deviations from a constant count rate.

The input data consist of an event file with good time intervals and a normalized efficiency (effective area) file. Two output files are created: a table of odds ratios and a light curve file which includes +/- 3 sigma curves. The odds ratio file includes the total odds ratio, the corresponding probability of a variable signal, the binning m with the maximum odds ratio and the odds-weighted first moment of m, as well as the characteristic time scales for these two values. The light curve file consists of the binnings weighted by the odds ratios and shows the optimal binning for the curve. The standard deviation is provided for each point on the light curve.

The range of probabilities 0.5 < P < 0.9 (above 0.9 everything is considered variable and below 0.5 everything is considered non-variable) is found to be ambiguous, so an additional criterion is required based on the light curve, its average standard deviation, and the average count rate. The criterion uses the fractions f3 and f5 of the light curve that are within 3 sigma and 5 sigma, respectively, of the average count rate. If f3 > 0.997 AND f5 = 1.0 for cases in the ambiguous interval, the source is considered to be non-variable. The variability index is calculated as shown in Table 1.1; a short sketch of the same logic follows the table.

Table 1.1

Var. Index  Condition                                   Result
0           P <= 0.5                                    Definitely not variable
1           0.5 < P < 2/3 AND f3 > 0.997 AND f5 = 1.0   Considered not variable
2           2/3 <= P < 0.9 AND f3 > 0.997 AND f5 = 1.0  Probably not variable
3           0.5 <= P < 0.6                              May be variable
4           0.6 <= P < 2/3                              Likely to be variable
5           2/3 <= P < 0.9                              Considered variable
6           0.9 <= P AND Odds < 2.0                     Definitely variable
7           2.0 <= Odds < 4.0                           Definitely variable
8           4.0 <= Odds < 10.0                          Definitely variable
9           10.0 <= Odds < 30.0                         Definitely variable
10          30.0 <= Odds                                Definitely variable

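As a reading aid for Table 1.1, the short Python sketch below restates the classification logic. It is not the internal implementation of glvary; the function and argument names are illustrative, and P, the odds ratio, f3, and f5 are the quantities defined above.

def variability_index(p, odds, f3, f5):
    """Variability index of Table 1.1 (illustrative restatement only).

    p    -- total probability of a variable signal
    odds -- total odds ratio
    f3   -- fraction of the light curve within 3 sigma of the average rate
    f5   -- fraction of the light curve within 5 sigma of the average rate
    """
    # Definitely variable: high probability, graded by the odds ratio.
    if p >= 0.9:
        if odds >= 30.0:
            return 10
        if odds >= 10.0:
            return 9
        if odds >= 4.0:
            return 8
        if odds >= 2.0:
            return 7
        return 6

    # Definitely not variable (the P = 0.5 boundary is treated as index 0).
    if p <= 0.5:
        return 0

    # Ambiguous range with a flat light curve: treated as not variable.
    if f3 > 0.997 and f5 == 1.0:
        return 1 if p < 2.0 / 3.0 else 2

    # Ambiguous range without the flat-light-curve condition.
    if p >= 2.0 / 3.0:
        return 5
    if p >= 0.6:
        return 4
    return 3

For example, P = 0.8 with f3 = 0.95 gives index 5 (considered variable), while the same P with f3 = 0.999 and f5 = 1.0 gives index 2 (probably not variable).
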
Examples

Example 1

glvary infile="acis_evt2.fits[sky=region(src.reg),ccd_id=3]"
effile="fracarea.fits[cols time,dtf=fracarea]" outfile=gl_prob.fits
lcfile=lc_prob.fits

Search for variability in a source which is described by the region file "src.reg" and is located on the ACIS-I3 chip (ccd_id=3). An output lightcurve and a table of odds ratios are both created.

The efficiency file, fracarea.fits, was created by running the tool dither_region:

dither_region infile=pcad_asol1.fits \
              region="region(src.reg)" \
              outfile=fracarea.fits

Example 2

glvary infile=acis_evt.fits outfile=gl_out.fits lc=gl_lc.fits
eff="dither.fits[cols time,dtf=psffrac]"

Search the events in 'acis_evt.fits' (TIME column) and output both a probability-weighted light curve and a table of odds. The PSFFRAC column in the 'dither.fits' file (renamed to DTF by the Data Model filter) is used to correct the time bins for intervals when the aperture moved on and off the chip.
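
The same run can be scripted from Python with the ciao_contrib.runtool interface. This is a sketch only; it assumes CIAO and the contributed scripts package are installed and initialized, and it reuses the file names from this example.

from ciao_contrib.runtool import glvary

glvary.punlearn()                 # reset the parameter file to its defaults
glvary(infile="acis_evt.fits",
       outfile="gl_out.fits",
       lcfile="gl_lc.fits",
       effile="dither.fits[cols time,dtf=psffrac]",
       clobber=True)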


Parameters

name     type     ftype   def    min  max  reqd
infile   file     input                    yes
outfile  file     output                   yes
lcfile   file     output                   yes
effile   file     input                    yes
frac     real             1.0              no
seed     integer          1                no
mmax     integer          INDEF            no
mmin     integer          INDEF            no
nbin     integer          0                no
mintime  float            50               no
clobber  boolean  input   no               no
verbose  integer  input   0      0    5    no

Detailed Parameter Descriptions

Parameter=infile (file required filetype=input)

Input event file

The event times and GTI are used to search for time variability. If there is more than one GTI for ACIS data, the first GTI listed is used.

The input file should be filtered on the same region that was used to create the efficiency file.

Parameter=outfile (file required filetype=output)

Output probabilities

A probability is output for each division of the time interval.

Parameter=lcfile (file required filetype=output)

Probability-weighted light curve
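
One quick way to inspect the output is to read it in Python. The sketch below assumes the 'gl_lc.fits' file from Example 2, and the column names (TIME, COUNT_RATE, COUNT_RATE_ERR) are a guess; check them first, for example with "dmlist gl_lc.fits cols".

from astropy.table import Table
import matplotlib.pyplot as plt

lc = Table.read("gl_lc.fits")
print(lc.colnames)                # verify the actual column names first

plt.errorbar(lc["TIME"], lc["COUNT_RATE"],
             yerr=lc["COUNT_RATE_ERR"], fmt=".")
plt.xlabel("TIME (s)")
plt.ylabel("Count rate")
plt.show()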

Parameter=effile (file required filetype=input)

Optional efficiency file.

The file must have TIME and DTF columns, e.g. an HRC dead time factor file (dtf1.fits).

This file may also be created by running the dither_region tool. A Data Model filter is used to specify the TIME and DTF columns in the file. DM syntax allows the user to specify a column with a different name as the DTF information, e.g.

"fracarea.fits[cols time,dtf=fracarea]"

This file should include all efficiency factors. So if using the dither_region output, it should also be multiplied by the DTCOR value to get the correct normalization for the lightcurve, as sketched below.
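
As a sketch of that normalization, the following Python snippet scales the dither_region fractional area by the DTCOR keyword read from the event file header. The file and column names are taken from Example 1 and may differ in your analysis.

from astropy.io import fits

# DTCOR is a standard keyword in the Chandra event file header.
dtcor = fits.getval("acis_evt2.fits", "DTCOR", ext=1)

with fits.open("fracarea.fits") as hdus:
    hdus[1].data["FRACAREA"] *= dtcor      # fold in the dead-time correction
    hdus.writeto("fracarea_dtcor.fits", overwrite=True)

The corrected file can then be passed to glvary as effile="fracarea_dtcor.fits[cols time,dtf=fracarea]".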

Parameter=frac (real not required default=1.0)

Fraction of events to use (1.0 == 100 percent)

Parameter=seed (integer not required default=1)

Random seed used to discard events if "frac" value is less than 1.0

Parameter=mmax (integer not required default=INDEF)

Maximum number of bins to split time range into. INDEF == tool will determine when to stop.

Parameter=mmin (integer not required default=INDEF)

Minimum number of bins to split time range into. INDEF == tool will determine when to stop.

Parameter=nbin (integer not required default=0)

Number of bins in the output lightcurve; 0 == tool will determine the optimal value.

Parameter=mintime (float not required default=50)

Smallest time bin to allow.

Parameter=clobber (boolean not required filetype=input default=no)

Overwrite existing output dataset with same name?

Parameter=verbose (integer not required filetype=input default=0 min=0 max=5)

Verbosity of debugging output; 0 produces the least output and 5 the most.

Bugs

There are no known bugs for this tool.

See Also

contrib
lc_clean, lc_sigma_clip, lightcurves
tools
axbary, deflare, glvary, gti_align, monitor_photom, multi_chip_gti, pfold