hdf5_cache module¶
Caching module to store all the entries in an HDF5 file.
- class gemseo.caches.hdf5_cache.HDF5Cache(hdf_file_path='cache.hdf5', hdf_node_path='node', tolerance=0.0, name=None)[source]¶
Bases:
gemseo.core.cache.AbstractFullCache
Cache using an HDF5 file on disk to store the data.
- Parameters
hdf_file_path (str | Path) –
The path of the HDF5 file. A singleton is initialized to access this file and is used, with a lock, for multithreaded and multiprocessing access.
By default it is set to cache.hdf5.
hdf_node_path (str) –
The node of the HDF file.
By default it is set to node.
name (str | None) –
A name for the cache. If None, use hdf_node_path.
By default it is set to None.
tolerance (float) –
The tolerance below which two input arrays are considered equal.
By default it is set to 0.0.
- Return type
None
Warning
This class relies on some multiprocessing features; it is therefore necessary to protect its execution with an
if __name__ == '__main__':
statement when working on Windows.
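A minimal usage sketch, assuming GEMSEO is installed; the file name, node name and variable names below are illustrative only:

```python
from numpy import array

from gemseo.caches.hdf5_cache import HDF5Cache

if __name__ == "__main__":  # guard required on Windows, see the warning above
    # Create a disk-backed cache in "my_cache.hdf5" under the node "my_node".
    cache = HDF5Cache(
        hdf_file_path="my_cache.hdf5",
        hdf_node_path="my_node",
        tolerance=0.0,
    )
    # Store one evaluation: inputs {"x": ...} mapped to outputs {"y": ...}.
    cache.cache_outputs({"x": array([1.0, 2.0])}, {"y": array([3.0])})
    print(len(cache))  # number of cached entries
```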
- cache_jacobian(input_data, jacobian_data)¶
Cache the input and Jacobian data.
- Parameters
input_data (Mapping[str, Any]) – The input data to cache.
jacobian_data (Mapping[str, Mapping[str, numpy.ndarray]]) – The Jacobian data to cache.
- Return type
None
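A hedged sketch of caching a Jacobian: the nested mapping follows the {output_name: {input_name: matrix}} layout given by the signature above, and all names and values are illustrative:

```python
from numpy import array

from gemseo.caches.hdf5_cache import HDF5Cache

if __name__ == "__main__":
    cache = HDF5Cache(hdf_file_path="jacobian_cache.hdf5", hdf_node_path="node")
    input_data = {"x": array([1.0, 2.0])}
    # Cache the outputs first, then the Jacobian of y with respect to x
    # at the same input point (a 1x2 matrix here).
    cache.cache_outputs(input_data, {"y": array([2.5])})
    cache.cache_jacobian(input_data, {"y": {"x": array([[2.0, 0.5]])}})
```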
- cache_outputs(input_data, output_data)¶
Cache input and output data.
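A short sketch of storing outputs and reading them back through the last_entry property documented below; it assumes the returned CacheEntry exposes inputs, outputs and jacobian fields:

```python
from numpy import array

from gemseo.caches.hdf5_cache import HDF5Cache

if __name__ == "__main__":
    cache = HDF5Cache(hdf_file_path="output_cache.hdf5", hdf_node_path="node")
    cache.cache_outputs({"x": array([0.0])}, {"y": array([1.0])})
    entry = cache.last_entry  # the most recently cached entry
    print(entry.outputs)  # expected: {'y': array([1.])}
```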
- export_to_dataset(name=None, by_group=True, categorize=True, input_names=None, output_names=None)¶
Build a Dataset from the cache.
- Parameters
name (str | None) –
A name for the dataset. If None, use the name of the cache.
By default it is set to None.
by_group (bool) –
Whether to store the data by group in Dataset.data, in the sense of one unique NumPy array per group. If categorize is False, there is a unique group: Dataset.PARAMETER_GROUP. If categorize is True, the groups are stored in Dataset.INPUT_GROUP and Dataset.OUTPUT_GROUP. If by_group is False, store the data by variable names.
By default it is set to True.
categorize (bool) –
Whether to distinguish between the different groups of variables. Otherwise, group all the variables in Dataset.PARAMETER_GROUP.
By default it is set to True.
input_names (Iterable[str] | None) –
The names of the inputs to be exported. If None, use all the inputs.
By default it is set to None.
output_names (Iterable[str] | None) –
The names of the outputs to be exported. If None, use all the outputs.
By default it is set to None.
- Returns
A dataset version of the cache.
- Return type
Dataset
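A sketch of exporting the cached evaluations to a Dataset with the default grouping (categorize=True, by_group=True); the names are illustrative:

```python
from numpy import array

from gemseo.caches.hdf5_cache import HDF5Cache

if __name__ == "__main__":
    cache = HDF5Cache(hdf_file_path="doe_cache.hdf5", hdf_node_path="node")
    cache.cache_outputs({"x": array([1.0])}, {"y": array([2.0])})
    # Inputs and outputs end up in separate groups of the dataset.
    dataset = cache.export_to_dataset(name="cached_evaluations")
    print(dataset)
```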
- export_to_ggobi(file_path, input_names=None, output_names=None)¶
Export the cache to an XML file for the GGobi tool.
- Parameters
file_path (str) – The path of the file to which the cache is exported.
input_names (Iterable[str] | None) –
The names of the inputs to export. If None, export all of them.
By default it is set to None.
output_names (Iterable[str] | None) –
The names of the outputs to export. If None, export all of them.
By default it is set to None.
- Return type
None
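A sketch of dumping the cached data to a GGobi XML file; the output file name is a placeholder:

```python
from numpy import array

from gemseo.caches.hdf5_cache import HDF5Cache

if __name__ == "__main__":
    cache = HDF5Cache(hdf_file_path="ggobi_cache.hdf5", hdf_node_path="node")
    cache.cache_outputs({"x": array([1.0])}, {"y": array([2.0])})
    # Write all cached inputs and outputs to an XML file readable by GGobi.
    cache.export_to_ggobi("cache_data.xml")
```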
- get(k[, d]) → D[k] if k in D, else d. d defaults to None.¶
- items() → a set-like object providing a view on D's items¶
- keys() → a set-like object providing a view on D's keys¶
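A hedged sketch of the mapping-like interface: it assumes that the cached input data act as the keys of the cache, so get() looks an entry up by input values; names and values are illustrative:

```python
from numpy import array

from gemseo.caches.hdf5_cache import HDF5Cache

if __name__ == "__main__":
    cache = HDF5Cache(hdf_file_path="mapping_cache.hdf5", hdf_node_path="node")
    input_data = {"x": array([1.0])}
    cache.cache_outputs(input_data, {"y": array([2.0])})
    # Look up the entry associated with this input data (assumption:
    # input data mappings act as the keys of the cache).
    entry = cache.get(input_data)
    print(entry)
```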
- update(other_cache)¶
Update from another cache.
- Parameters
other_cache (gemseo.core.cache.AbstractFullCache) – The cache with which to update the current one.
- Return type
None
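A sketch of merging one cache into another; the file paths and stored data are illustrative:

```python
from numpy import array

from gemseo.caches.hdf5_cache import HDF5Cache

if __name__ == "__main__":
    cache_a = HDF5Cache(hdf_file_path="cache_a.hdf5", hdf_node_path="node")
    cache_b = HDF5Cache(hdf_file_path="cache_b.hdf5", hdf_node_path="node")
    cache_b.cache_outputs({"x": array([1.0])}, {"y": array([2.0])})
    # Copy the entries of cache_b into cache_a.
    cache_a.update(cache_b)
    print(len(cache_a))
```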
- static update_file_format(hdf_file_path)[source]¶
Update the format of an HDF5 file.
- Parameters
hdf_file_path (str | Path) – An HDF5 file path.
- Return type
None
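A sketch of converting an existing cache file to the current format, presumably one written with an earlier GEMSEO version; "old_cache.hdf5" is a placeholder path:

```python
from gemseo.caches.hdf5_cache import HDF5Cache

if __name__ == "__main__":
    # Update the format of an existing cache file.
    HDF5Cache.update_file_format("old_cache.hdf5")
```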
- values() → an object providing a view on D's values¶
- property hdf_file: gemseo.caches.hdf5_file_singleton.HDF5FileSingleton¶
The HDF file handler.
- property last_entry: gemseo.core.cache.CacheEntry¶
The last cache entry.
- lock: RLock¶
The lock used for both multithreading and multiprocessing.
It ensures safe concurrent access to the cache in both multiprocessing and multithreading contexts.
- lock_hashes: RLock¶
The lock used for both multithreading and multiprocessing.
It ensures safe concurrent access to the cache in both multiprocessing and multithreading contexts.