DUECA/DUSIME
For logging, a generic module to log channel data to HDF5 files is available.
A quote on HDF5 from the HDF website:

"HDF5 is a data model, library, and file format for storing and managing data. It supports an unlimited variety of datatypes, and is designed for flexible and efficient I/O and for high volume and complex data. HDF5 is portable and is extensible, allowing applications to evolve in their use of HDF5."
HDF5 is a very convenient format for logging data: it is widely used, can be read into Matlab or Python, and allows for flexible and efficient logging. Log files are hierarchical (that is what the "H" stands for), so you can organize all kinds of data in a single file. In HDF5, you can add metadata "properties" to the branches in your hierarchical data tree, so an HDF5 file can be made self-documenting. The capability to easily convert most channel data to an HDF5 log was added for DUECA 2.2, spring 2017. This uses the service functor concept introduced for channel access, see DCO service functor.
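As a minimal sketch of what reading such a log back in Python could look like, assuming the h5py package is available; the file name and dataset path are illustrative and depend on how the logger is configured (see below):

    # Minimal sketch: inspecting a DUECA HDF5 log in Python with h5py.
    # The file name and the dataset path are assumptions; inspect the
    # structure of your own log file first.
    import h5py

    with h5py.File("datalog-20170401_120000.hdf5", "r") as f:
        f.visit(print)                    # print the hierarchical structure
        data = f["/data/mydata/x"][()]    # read one dataset as a numpy array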
HDF5 as a data format is oriented towards logging multi-dimensional, single-type data matrices. A hierarchical path can be used to specify the location of the data in the file. Most classes of DCO object can be directly logged in the generated HDF5 file; however, there are some limitations on the complexity of the DCO objects.
To add hdf5 logging capability to your project, add hdf5 to the DUECA_COMPONENTS list in your main CMakeLists.txt file:
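For example (a sketch; your project's existing component list will differ):

    # main CMakeLists.txt: add hdf5 to the DUECA component list
    # (the other components shown here are just examples)
    set(DUECA_COMPONENTS dusime extra hdf5)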
If you are still using the old Makefile-based project setup, adapt your project Makefile:
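A sketch of the corresponding change, assuming the components are passed as flags to dueca-config; the exact variable name depends on your Makefile template:

    # project Makefile (old setup): add the hdf5 component to the flags
    # handed to dueca-config; variable name may differ in your template
    DCOMPONENTS = --dusime --extra --hdf5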
The HDF5 logging capabilities are based on additional code generated for the DCO objects. To generate this code, add an option line to the DCO file, as sketched below.
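A minimal sketch of such an option line; it is normally listed with the object's other options, and the exact layout of your .dco file may differ:

    ;; request generation of HDF5 logging code for this DCO object
    (Option hdf5)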
This generates code that cannot be nested. To get nestable code, specify the option hdf5nest instead, but bear in mind the limitations discussed above. Nested or nestable code is for DCO objects that are to be used within other DCO objects. The limitation here is that variable-length members cannot be used in nested DCO objects.
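A sketch of a nestable object, with illustrative names and members; the fixed-size inner object requests hdf5nest so that it can appear as a member of another, hdf5-logged object:

    ;; sketch of a fixed-size object intended for use inside other DCO
    ;; objects; object and member names are illustrative
    (Type double)
    (Object Position
            (Option hdf5nest)
            (double x)
            (double y))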
The generated code uses C preprocessor defines around the additional components. When HDF5 is not available on a computer, or the hdf5 option is not used, those parts are not compiled; so even when you no longer need HDF5 logging, the option can safely remain in the DCO file.
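The effect is roughly as sketched below; the macro name shown is an assumption, the generator's actual define may differ:

    // sketch of the guard around generated HDF5 code; the macro name is
    // an assumption for illustration only
    #ifdef DUECA_CONFIG_HDF5
    // ... HDF5 functor code generated for this DCO object ...
    #endif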
There is a standard HDF5 logging module, called, not surprisingly, dueca::hdf5log::HDF5Logger. To have this module log data, indicate which channel and entry must be logged, and repeat this for all requested channels. There is also an option to monitor a specific channel and create logging for all entries appearing in that channel. You can specify whether logging takes place always, or only when the simulation is in advance. It is also possible to "throttle" the logging rate; however, make sure you only do this on clean signals, since the data is not filtered before a subset of the data points is picked.
It is also possible to create customized logging through the DCO service functor functionality. The functors for HDF5 logging are dueca::hdf5log::HDF5DCOWriteFunctor, dueca::hdf5log::HDF5DCOReadFunctor and dueca::hdf5log::HDF5DCOMetaFunctor.
The logging module is configured with the following script commands:
    dueca.Module('hdf5-logger', <part name; string>, <PrioritySpec>).param(
        set_timing = <TimeSpec>,
        # Supply a time specification to define the update rate of the
        # main activity
        check_timing = <array of integers>,
        # Supply three integer parameters to specify a check on the timing of
        # the main activity: warning limit (in us), critical limit (in us), and
        # the number of loops to test before sending a report (optional, dflt=2000)
        log_entry = <array of strings>,
        # log a specific channel entry; enter channel name, dataclass type, if
        # applicable entry label and as last the path where the data should be
        # stored in the file. Without label, only the first entry is logged,
        # with, only the first entry matching the label
        watch_channel = <array of strings>,
        # log all entries in a specific channel; enter channel name and path
        # where entries should be stored
        filename_template = <string>,
        # Template for file name; check boost time_facet for format strings
        # Default name: datalog-%Y%m%d_%H%M%S.hdf5
        log_always = <boolean>,
        # For watched channels or channel entries created with log_always,
        # logging also is done in HoldCurrent mode. Default off, toggles
        # this capability for logging defined hereafter.
        immediate_start = <boolean>,
        # Immediately start the logging module, do not wait for DUECA control
        chunksize = <uint32_t>,
        # Size of logging chunks (no of data points) for the log file,
        # in effect for all following entries.
        compress = <boolean>,
        # Log compressed data sets; reduces file size and may increase
        # computation time. In effect for all following entries
        reduction = <TimeSpec>,
        # Reduce the logging data rate according to the given time
        # specification. Applies to all following logged values
        config_channel = <string>,
        # Specify a channel with configuration events, to control logging
        # check DUECALogConfig doc for options
        )

    '''Description:
    Generic logging facilities for channel data to HDF5 data files. The logger
    may be controlled with DUECALogConfig events, but may also be run without
    control. Note that hdf5 may sometimes take unpredictable time (when it
    needs to flush data to disk). DUECA has no problem with that, but you are
    advised to configure a separate priority for the hdf5 modules.'''
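As a usage sketch, a possible entry in dueca_mod.py; the channel names, data classes, log paths, priority and timing values are all hypothetical and must match your own project:

    # Illustrative dueca_mod.py fragment; names and rates are examples only.
    hdf5_priority = dueca.PrioritySpec(2, 0)   # separate priority advised for hdf5
    log_timing = dueca.TimeSpec(0, 20)         # example update rate

    mymods.append(
        dueca.Module('hdf5-logger', "", hdf5_priority).param(
            set_timing = log_timing,
            # log the first entry of a specific channel
            log_entry = ("MyData://PHLAB", "MyData", "data/mydata"),
            # log every entry appearing in this channel
            watch_channel = ("AnyData://PHLAB", "data/anydata"),
            filename_template = "datalog-%Y%m%d_%H%M%S.hdf5"))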
A matching HDF5 replayer, dueca::hdf5log::HDF5Replayer, can replay properly formatted HDF5 files. Note that both the logger and the replayer can listen to channels that enable you to control the replay or logging: opening files, creating log sections in files, etc.