Releases: predsci/psi-io
v2.0.7
PATCH NOTES
- Added the ability to attach attributes to Datasets in the write routines.
  - These key-value pairs can be passed through as `**kwargs` in the `write_hdf_data` routine (and the relevant functions that dispatch to this base writer).
- Added the `convert` and `convert_psih4_to_psih5` functions for converting between HDF versions.
  - The former copies the exact data/attributes from one version to another.
  - The latter enforces PSI conventions when converting between HDF4 and HDF5.
- This release significantly updates the API documentation and adds examples for writing/converting HDF files.
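As a rough illustration of the attribute pass-through described above, here is a minimal pure-Python sketch of the `**kwargs`-to-attributes pattern. This is not the actual `psi_io` implementation (which writes real HDF4/HDF5 files); a plain dict stands in for the Dataset object, and the function name is marked as a sketch.

```python
# Minimal sketch of forwarding **kwargs as dataset attributes.
# NOT the psi_io implementation: a dict stands in for a real HDF Dataset.

def write_hdf_data_sketch(data, **attrs):
    """Mimic a writer that attaches arbitrary key-value pairs as attributes."""
    dataset = {"data": data, "attrs": {}}
    for key, value in attrs.items():
        dataset["attrs"][key] = value  # each kwarg becomes a dataset attribute
    return dataset

# Hypothetical usage: metadata rides along with the array being written.
ds = write_hdf_data_sketch([1.0, 2.0, 3.0], units="Gauss", instrument="HMI")
```

In the real routine, the same call shape applies, with the attributes landing on the written HDF Dataset rather than a dict.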
v2.0.6
PATCH NOTES
- Added `sync_dtype` functionality to the HDF writers.
  - When `True`, each scale is cast to the datatype of the corresponding Dataset.
  - This behavior is necessary for certain Fortran tools within the PSI software ecosystem, whose internal logic relies on uniform precision between a dataset and its coordinates.
- By default, `wrhdf_1d`, `wrhdf_2d`, and `wrhdf_3d` perform datatype synchronization, while the more general `write_hdf_data` routine does not.
  - This design choice mimics the legacy PSI HDF writers, while allowing the newer write routines to remain flexible in their datatype protocols.
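The casting rule behind `sync_dtype` can be sketched with NumPy. This is an illustration of the behavior described above, not the `psi_io` source; the helper name is hypothetical.

```python
# Sketch of the sync_dtype behavior: when enabled, each scale (coordinate
# array) is cast to the datatype of the main dataset, so a Fortran consumer
# sees uniform precision. Illustrative only, not psi_io code.
import numpy as np

def sync_scale_dtypes(data, scales, sync_dtype=True):
    if not sync_dtype:
        return scales
    return [np.asarray(s).astype(data.dtype) for s in scales]

data = np.zeros((3, 4), dtype=np.float32)
scales = [np.arange(3, dtype=np.float64), np.arange(4, dtype=np.float64)]
synced = sync_scale_dtypes(data, scales)  # both scales become float32
```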
v2.0.5
PATCH NOTES
- Minor patch to pin the `h5py` version.
  - `h5py` is now pinned to `>=3.8`.
  - Version 3.8 introduced the `Dataset.is_scale` property, used throughout the HDF5 dispatch methods in `psi_io`.
  - NOTE: If using the standard PSI Conda Recipe, be sure to update your `h5py` version to accommodate this change.
  - If you identify any other dependency conflicts, please submit them to the Issue Tracker.
v2.0.4
PATCH NOTES
- Added new `write_hdf_data` function:
  - Designed to handle both HDF4 and HDF5 writing for canonical "PSI-style" datasets.
  - Extends previous writers' functionality, allowing one to write datasets with no scales (or scales attached to arbitrary dataset dimensions) and with non-canonical dataset identifiers, e.g. allowing one to convert CHMap Database files from HDF5 to HDF4 or vice versa.
  - Resolves issues with HDF4 scale writing; input datatypes are now preserved (whereas previous writers upcast to `float64` to account for issues with SWIG bindings).
  - Enforces datatype inputs across writers.
    - NOTE: `pyhdf` does not support the `float16` and `int64` datatypes.
- Refactored legacy writers to call `write_hdf_data`:
  - Existing writers' API signatures – `wrhdf_1d`, `wrhdf_2d`, and `wrhdf_3d` – remain unchanged for backwards compatibility.
  - `wrhdf_1d`, `wrhdf_2d`, and `wrhdf_3d` are (functionally) calls to `write_hdf_data` using the default "PSI-style" dataset identifiers, along with a dimensionality enforcement check.
  - NOTE: Due to this refactor, the legacy writers no longer perform upcasting on coordinate variables – see the notes for the new `write_hdf_data` function above.
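The datatype-enforcement idea can be sketched as a pre-write check: since `pyhdf` cannot handle `float16` or `int64` (per the note above), a writer can reject those dtypes up front while otherwise preserving the input dtype instead of upcasting to `float64`. The function below is a hypothetical illustration, not the `psi_io` implementation.

```python
# Sketch of an HDF4 datatype-enforcement check. Unsupported dtypes are
# rejected; supported ones pass through unchanged (no float64 upcast).
# Illustrative only, not psi_io code.
import numpy as np

HDF4_UNSUPPORTED = (np.float16, np.int64)  # per the pyhdf limitation above

def check_hdf4_dtype(arr):
    if arr.dtype.type in HDF4_UNSUPPORTED:
        raise TypeError(f"pyhdf cannot write dtype {arr.dtype}")
    return arr.dtype  # the input datatype is preserved as-is

ok = check_hdf4_dtype(np.zeros(3, dtype=np.float32))
```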
v2.0.2
MAJOR RELEASE
- Unified the API for reading PSI's HDF4 and HDF5 files. The unified reading/writing routines also handle files with non-PSI-standard data structures (and have resolved some of the idiosyncrasies with HDF4 dimension datatypes).
- Completed documentation of the `psi-io` package API and added some simple examples.
- Added a comprehensive test suite that tests both the HDF4 and HDF5 versions of the routines.
Breaking Changes
- All of the "newer" reading/writing/interpolation routines take – as their first positional argument – the `ifile` parameter (`Path` or `str`). Therefore, when integrating these changes over version 1.0, your "find and replace" approaches should look for:
  - `read_hdf_meta`
  - `read_rtp_meta`
  - `read_hdf_by_index`
  - `read_hdf_by_value`
  - `np_interpolate_slice_from_hdf`
  - `sp_interpolate_slice_from_hdf`
  - `interpolate_positions_from_hdf`
- The `np_interpolate_slice_from_hdf` function now returns the data array in the proper Fortran order (to be consistent with all of the other routines in the package).
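The Fortran-order convention mentioned above can be illustrated with NumPy: an F-contiguous array holds the same values as its C-ordered counterpart, only the memory layout differs. This is a general NumPy illustration, not `psi_io` code.

```python
# Sketch of the Fortran-order convention: np.asfortranarray produces an
# F-contiguous copy/view with identical values, matching what Fortran
# consumers of PSI data expect. Illustrative only.
import numpy as np

c_order = np.arange(12).reshape(3, 4)   # C-ordered by default
f_order = np.asfortranarray(c_order)    # same values, Fortran memory layout
```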
Other notes
- Usage of the "classic" reader helper functions `rdhdf_1d`, `rdhdf_2d`, and `rdhdf_3d` remains the same, but they are now wrappers for the newer `read_hdf_data`.
- The new unified writer (`write_hdf_data`) can handle a more diverse range of datatypes for HDF4 (along with saving non-standard datasets, e.g. from the Coronal Hole Map database). The "classic" writers `wrhdf_1d`, `wrhdf_2d`, and `wrhdf_3d` remain the same for now.
v1.0.0 (initial version)
Initial consolidation of our HDF reading routines into a single pip-installable Python package, which is now available as `psi-io` on PyPI.
Users familiar with psihdf.py or psi_io.py should be able to use this as a seamless drop-in replacement, e.g.:
```python
# if you used to use the standalone psi_io.py:
import psi_io

# if you used to use the standalone psihdf.py and don't want to edit existing code:
import psi_io as psihdf
```

The package was designed to not require HDF4, but if `pyhdf` is present, `.hdf` files can be read no problem.
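The optional-HDF4 design noted above follows a common optional-dependency pattern: import `pyhdf` lazily and enable HDF4 support only when it is installed. The sketch below shows that pattern in general terms; the flag and helper names are hypothetical, not the exact `psi_io` internals.

```python
# Sketch of optional HDF4 support: pyhdf is imported if available, and
# .hdf reading is gated on its presence. HDF5 support is unconditional.
# Names here are illustrative, not the actual psi_io internals.
try:
    from pyhdf.SD import SD  # HDF4 backend (optional dependency)
    HAS_HDF4 = True
except ImportError:
    HAS_HDF4 = False

def can_read(path):
    """HDF5 files always; HDF4 (.hdf) only when pyhdf is available."""
    if str(path).endswith(".h5"):
        return True
    return HAS_HDF4
```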