I get an unusual error when processing a large set of HAWK-I frames.
HAWK-I is a survey instrument that can produce large data volumes, so its data are not always amenable to processing on a PC or laptop with limited memory resources.  Depending on the number of frames processed, which sky correction algorithm is used, and the number of images combined into a tile in the science processing routine, the memory requirements can become quite large.

This can result in strange errors in which the pipeline cannot complete its image stacking, sky correction, or source detection.  The errors reported by the Reflex workflow can be rather cryptic and include:

Error in invoking the fire method

Couldn’t find dataset with name UNDEFINED

Error writing file

or a number of other ill-defined errors.

Table 4.1 (reproduced below from the HAWK-I Pipeline Manual) lists the absolute minimum memory requirements for the execution of selected HAWK-I recipes.  The reason that hawki_science_process with 20 files uses more memory than with 50 files is that (with default recipe parameters) the stacking method changes from 'fast' to 'slow' (see the description of the stk_fast and stk_nfst parameters in §9.1 of the Pipeline Manual).

Recipe                   # Science Frames   Min. RAM (GB)
hawki_standard_process          4                3.2
hawki_science_process           5                6.6
hawki_science_process          10               10.5
hawki_science_process          20               17.3
hawki_science_process          50                7.9

If the minimum RAM cannot be accommodated, there is an expert mode that chooses between combining images in memory and combining them via disk I/O.  The stacking module used by the HAWK-I pipeline recipes for the science observations has two modes of operation: a slow and a fast algorithm.  The algorithm needs to stack not just the science data but also the science variance data, and it creates an output stack, an output stack variance, and an output stack confidence map.  The volume of data read and written can make the memory requirement quite large, and it grows further if the jitter offsets are large.
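
As a rough illustration of how these numbers add up, the sketch below estimates the RAM needed to hold every input science frame and its variance frame in memory at once, together with the three output products.  The four-detector, 2048x2048 geometry is HAWK-I's real layout; the float32 assumption and the omission of working buffers, intermediate copies, and jitter-offset padding are simplifications, so treat this as an order-of-magnitude guide rather than the pipeline's actual accounting.

    # Back-of-envelope estimate of fully in-memory stacking for HAWK-I.
    # Real geometry: a mosaic of four 2048x2048 HAWAII-2RG detectors.
    # Assumptions (not the pipeline's actual accounting): float32 pixels,
    # no working buffers, and output images the same size as the inputs
    # (large jitter offsets make the outputs bigger).

    N_CHIPS = 4
    CHIP_PIXELS = 2048 * 2048
    BYTES_PER_PIXEL = 4  # float32

    def fast_stack_ram_gb(n_frames: int) -> float:
        """Rough RAM (GB) to hold all inputs and outputs simultaneously."""
        per_image = N_CHIPS * CHIP_PIXELS * BYTES_PER_PIXEL
        inputs = 2 * n_frames * per_image   # science + variance per frame
        outputs = 3 * per_image             # stack, stack variance, confidence map
        return (inputs + outputs) / 1024**3

    for n in (5, 10, 20, 50):
        print(f"{n:2d} frames: ~{fast_stack_ram_gb(n):.1f} GB")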

The 'fast' algorithm can be used when stacking a small number of images. As the name implies, it is usually quick, but it is very greedy with memory.  For stacking problems with more images it is better to use the slow algorithm.  The recipes that use the stacking module also offer an auto mode, which lets the recipe decide which algorithm to use: if the number of input frames is less than or equal to stk_nfst, the 'fast' mode is used.

If you click on the “science processing” actor in the Reflex workflow, you can modify the parameter “init_stk_nfst” and set it to a small value (say, 10 or 20).  The default is 300, which means that the routine will try to stack up to 300 HAWK-I images in memory.
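
The auto-mode selection described above reduces to a simple threshold test.  The minimal sketch below mirrors that rule; the function name is hypothetical, and only the n_frames <= stk_nfst comparison comes from the Pipeline Manual.

    # Hypothetical sketch of the auto-mode rule: stack in memory ('fast')
    # when the batch is small enough, otherwise fall back to the
    # disk-based ('slow') algorithm.  Only the threshold comparison is
    # documented behaviour; everything else here is illustrative.

    def choose_stacking_mode(n_frames: int, stk_nfst: int = 300) -> str:
        """Return the stacking algorithm auto mode would select."""
        return "fast" if n_frames <= stk_nfst else "slow"

    # Lowering the threshold (e.g. setting init_stk_nfst to 10 in the
    # "science processing" actor) pushes larger batches onto the slow path:
    print(choose_stacking_mode(5, stk_nfst=10))    # fast
    print(choose_stacking_mode(50, stk_nfst=10))   # slow

The slow path trades runtime and disk I/O for a smaller memory footprint, which is usually the right trade on a memory-limited machine.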