update_load_seg_db

Functions

update_load_seg_db.update_loads_db(ifot_loads, dbh=None, test=False, dryrun=False)

Update the load_segments table with the loads from an RDB file.

Parameters:
  • ifot_loads – recarray of iFOT run loads
  • dbh – database handle for the load_segments table
  • test – allow writes of data from before year 2009
  • dryrun – do not write to the database
Return type:

list of new loads
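
A minimal usage sketch (the recarray below is a stand-in for the real get_iFOT_events.pl output; its column names are illustrative assumptions, not the actual schema):

    import numpy as np
    import update_load_seg_db

    # Stand-in for the recarray read from the get_iFOT_events.pl RDB file;
    # the column names here are assumptions
    ifot_loads = np.rec.fromrecords(
        [('JAN0109A', '2009:001:00:00:00.000', '2009:008:00:00:00.000')],
        names=['load_segment', 'datestart', 'datestop'])

    # dryrun=True reports which loads would be inserted without writing
    new_loads = update_load_seg_db.update_loads_db(ifot_loads, dryrun=True)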

update_load_seg_db.update_timelines_db(loads=None, dbh=None, dryrun=False, test=False)

Given a list of load segments, this routine determines the timelines (mission planning weeks, loads, etc.) that cover the loads and inserts new timelines into the aca timelines db table.

In common use, this will just insert new timelines at the end of the table.

In case of scheduled replan, timelines will be updated when load segments are updated.

In case of autonomous safing, individual timelines are shortened by outside processing and act as placeholders. This script will not update shortened timelines until new load segments are seen in iFOT.

Parameters:
  • loads – dict or recarray of loads
  • dbh – database handle for the timelines table
  • dryrun – do not update database
  • test – test mode option
Return type:

None
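
In the common flow this routine is called right after update_loads_db, so new load segments get matching timelines. A hedged sketch (ifot_loads as in the earlier example; the dbh handle setup is assumed and omitted here):

    import update_load_seg_db

    # Insert any new load segments, then extend the timelines to match
    new_loads = update_load_seg_db.update_loads_db(ifot_loads, dbh=dbh)
    if new_loads:
        # Only touch the timelines table when load segments actually changed
        update_load_seg_db.update_timelines_db(loads=new_loads, dbh=dbh)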

update_load_seg_db.weeks_for_load(run_load, dbh=None, test=False)

Determine the timeline intervals that exist for a load segment.

How does this work? weeks_for_load queries the two tables created by parse_cmd_load_get.pl, tl_built_loads and tl_processing, by calling get_built_load() and get_processing(). The tl_built_loads table stores details for every command load from every week, and every version of every week, that has a processing summary in /data/mpcrit1/mplogs/.

The tl_processing table contains the information from the processing summary: was the week a replan? What time range was used when the schedule was created? When was the processing summary created?

weeks_for_load finds the built load that matches the run load, then finds the processing summary that corresponds to that built load. The processing summary entry includes the source directory. In the standard case, this single entry covers the whole duration of the load segment passed as an argument, and a single timeline entry mapping that time range to a directory is created (see the sketch below).
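
In outline, the standard (single-entry) case behaves like this hedged sketch; the 'dir' key and the date field names are assumptions about the actual table columns, not the real code:

    import update_load_seg_db

    def weeks_for_load_sketch(run_load, dbh):
        # Hypothetical restatement of the standard path described above
        built = update_load_seg_db.get_built_load(run_load, dbh=dbh)
        processing = update_load_seg_db.get_processing(built, dbh=dbh)
        # One timeline maps the load segment's full time range to the
        # source directory from the processing summary
        return [{'datestart': run_load['datestart'],
                 'datestop': run_load['datestop'],
                 'dir': processing['dir']}]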

Gotchas:

If the processing summary indicates that the load was part of a Replan/ReOpen (that is, some commands actually came from a different file), then that directory is determined by searching for the processing entry that matches the name of the replan_cmds entry in the processing summary. If the load time range extends outside the processed time range, the replan_cmds source directory is used to create a timeline covering that interval (see the sketch below). If this source directory/file name needs to be manually overridden, see fix_tl_processing.py for a method to insert entries into the tl_processing table before calling this routine.
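
A hedged sketch of the Replan/ReOpen branch just described (all field names, and lexicographic comparison of the date strings, are assumptions):

    def replan_timelines_sketch(run_load, processing, replan_processing):
        # Hypothetical illustration only; field names are assumptions
        timelines = []
        if run_load['datestart'] < processing['processing_tstart']:
            # Commands before the processed range came from the replan_cmds
            # source, so cover that interval with its directory
            timelines.append({'datestart': run_load['datestart'],
                              'datestop': processing['processing_tstart'],
                              'dir': replan_processing['dir']})
        # The rest of the load segment maps to the normal source directory
        timelines.append({'datestart': max(run_load['datestart'],
                                           processing['processing_tstart']),
                          'datestop': run_load['datestop'],
                          'dir': processing['dir']})
        return timelines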

Parameters:
  • run_load – load segment dict
  • dbh – database handle for tl_built_loads and tl_processing
  • test – test mode option to allow the routine to continue on missing history
Return type:

list of dicts; each dict is a ‘timeline’

update_load_seg_db.rdb_to_db_schema(orig_ifot_loads)

Convert the load segment data from a get_iFOT_events.pl rdb table into the schema used by the load segments table

Parameters:
  • orig_ifot_loads – recarray from the get_iFOT_events.pl rdb table
Return type:

recarray
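
A toy illustration of the conversion (the input column names are guesses at the iFOT RDB schema, not the real ones):

    import numpy as np
    import update_load_seg_db

    # Guessed iFOT-style columns; the real RDB schema may differ
    orig_ifot_loads = np.rec.fromrecords(
        [('JAN0109A', '2009:001:00:00:00.000', '2009:008:00:00:00.000')],
        names=['LOADSEG_NAME', 'TSTART', 'TSTOP'])

    db_loads = update_load_seg_db.rdb_to_db_schema(orig_ifot_loads)
    print(db_loads.dtype.names)  # columns of the load_segments schema
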
update_load_seg_db.get_built_load(run_load, dbh=None)

Given an entry from the load_segments table, return the matching entry from the tl_built_loads table.

Parameters:
  • run_load – run load from load_segments table
  • dbh – database handle for tl_built_loads
Return type:

dict of matching built load
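
A hedged sketch of the kind of lookup involved (the SQL, the column name, and the fetchone method on the handle are all assumptions; see the module source for the authoritative query):

    def get_built_load_sketch(run_load, dbh):
        # Hypothetical query; the real match criteria live in the module
        return dbh.fetchone(
            "SELECT * FROM tl_built_loads WHERE load_segment = '%s'"
            % run_load['load_segment'])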

update_load_seg_db.get_processing(built, dbh=None)

Given an entry from the tl_built_loads table, return the entry for the corresponding file from tl_processing.

Parameters:
  • built – tl_built_loads entry/dict
  • dbh – database handle for tl_processing
Return type:

dict of matching tl_processing entry
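
And a matching hedged sketch for the processing-summary lookup (again, the column name and handle API are assumptions):

    def get_processing_sketch(built, dbh):
        # Hypothetical query matching on the built load's source file
        return dbh.fetchone(
            "SELECT * FROM tl_processing WHERE file = '%s'" % built['file'])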
