Interfaces

The primary interfaces for the SAS are with the IOC and the SSC. The IOC interface delivers data in a known format for the Data Processing Facility (DPF) to handle, while the SSC interface allows the transfer of a variety of data types, as well as algorithms.

IOC Interface

There are two components to the IOC interface:

Incoming data

The data received from the IOC will have been corrected for downlink errors; they will then be made available to the DPF. Our initial concept is that the IOC and SAS will share the database that describes the Level 0 and Level 1 data. The IOC will update the database with the availability of new Level 0 data, giving a description of it (command state, etc.) and its location. From the command state, the data format type can be deduced. The automated server will detect the addition of new data and take the appropriate action on it, updating the database as it goes. It is assumed that the IOC and DPF are co-located, so that the data location will be on shared disk.

Consequently, the interfaces required are:

Diagnostics

The diagnostics will take the form of statistics and plots. These diagnostics will be tracked in the database, and viewable via the web.

The design of the operator interface will depend on the direction the IOC takes with its operations software, so little of this interface can be specified yet. The main issue will be how the operators log any variances they see from the DPF diagnostics into their own system.

SSC interface [Seth]

Draft 16 July 2001

The LAT team is required to deliver all mission data to the SSC, both for archiving and in support of the GI program.  Because the Event, Event Summary, and Timeline databases will be dynamic and central to the (post-Level 1) analysis system, and because the LAT team is mandated to transfer its high-level analysis environment to the SSC, the interface for high-level data between the SAS and SSC is planned to be via database mirroring.  This will facilitate duplication of the high-level analysis environment at the SSC.  The same mirroring of databases will also facilitate the planned establishment of additional sites for high-level processing within the LAT team. 

The High-Level Calibration database (from which instrument response functions are extracted) and the Diffuse Emission Model database are also central to the high-level analysis environment, but they are expected to change infrequently. These must also be provided to the SSC; the transfer could be implemented as mirrors, though manual export/import of the databases may suffice.

Other databases used in the LAT analysis environment that should be shared include the Point Source Catalog and the Pulsar Ephemerides.  The Point Source Catalog database used by the LAT team may not be the same as the one provided to the SSC; at a minimum, a schedule of release dates for updates will be established.  For GI analysis, the Point Source Catalog should not be a fluid entity that changes daily.  The Pulsar Ephemerides, likely established in collaboration with radio pulsar timing groups, will be needed for barycenter correction of photon arrival times in pulsar studies.  The Ephemerides must be kept current (e.g., to account for timing glitches), but the database will be very small (kilobytes), and quite likely a flat file will suffice.
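To illustrate how simple such a flat file could be, the sketch below parses a hypothetical whitespace-delimited ephemeris record (pulsar name, validity range and epoch in MJD, spin frequency and its first derivative); the actual record format will be set in collaboration with the timing groups.

```python
# Sketch of reading a hypothetical pulsar ephemerides flat file.
# Each non-comment line is assumed to hold: name, validity start and end
# (MJD), epoch (MJD), spin frequency f0 (Hz), and derivative f1 (Hz/s).
# This format is an assumption for illustration only.

def load_ephemerides(lines):
    """Parse whitespace-delimited ephemeris records, skipping blank
    lines and '#' comments."""
    records = []
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        name, t_start, t_end, epoch, f0, f1 = line.split()
        records.append({
            "name": name,
            "valid": (float(t_start), float(t_end)),
            "epoch": float(epoch),
            "f0": float(f0),
            "f1": float(f1),
        })
    return records
```

Even a catalog of hundreds of pulsars in this form stays well under a megabyte, consistent with the expectation above that a flat file will suffice.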

The lowest-level mission data will likely not be part of standard (post-Level 1) analysis and will rarely be subject to further scrutiny outside of the DPF.  The Level 0 data (and housekeeping data) will be delivered to the SSC as flat files.

The specific interfaces between the SAS and SSC for the databases are still being planned.  The mechanisms for synchronizing remote databases are being defined; we may borrow from SLAC experience with BABAR and other experiments.  The interface for the transfer of the lowest-level data, such as Level 0, is better understood.  The Level 0 data will be transferred as they become available, on a daily (or twice-daily) basis.  The SAS will alert the SSC that new files have been staged for transfer.  The SSC will transfer and validate the files using pre-computed checksums, then send confirmation to the SAS that the transfer was successful.


R.Dubois, S.Digel Last Modified: 07/16/2001 10:24