Add ZFP Compression Filter #124
Conversation
I reran this current build on my own fork with the same CI.yml build as this one, and every test passed. I don't know why it didn't pass here.
That is just the buggy 1.8 release, which is rather flaky.
mulimoen left a comment
This is a big undertaking and you seem to have gotten it mostly right! There is a missing entry in src/hl/dataset.rs:1003 and a missing entry in the changelog.
…d the filter calls with feature-dependent macro implementation like the other features
…ot natively take the number of bytes needed for processing; instead it uses the dimensionality of the compressed array to do the data compression. This information is captured by the ZfpConfig struct, which takes care of allocating the appropriate buffer size for the compression downstream from here.
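As a minimal sketch of the idea described above (field and method names here are assumptions for illustration, not the PR's actual definitions), a config struct that records the array's dimensionality lets the filter size its working buffer up front:

```rust
// Hypothetical sketch of a ZfpConfig-style struct: it captures the
// chunk dimensionality so the buffer needed downstream can be
// computed before compression. Names are assumptions, not the PR's API.
#[derive(Debug, Clone)]
struct ZfpConfig {
    dims: Vec<usize>, // chunk dimensions of the dataset
    elem_size: usize, // size of one element in bytes
}

impl ZfpConfig {
    /// Upper bound on the bytes needed for one uncompressed chunk.
    fn uncompressed_bytes(&self) -> usize {
        self.dims.iter().product::<usize>() * self.elem_size
    }
}

fn main() {
    let cfg = ZfpConfig { dims: vec![64, 64], elem_size: 4 };
    println!("{}", cfg.uncompressed_bytes()); // 64 * 64 * 4 = 16384
}
```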
Something strange is happening, as this is producing files 2x bigger than the reference MATLAB HDF5 implementation.
…ut size for set_local_zfp to the right values
Are you using the same chunk size?
… need to finish the tests
Did you figure out the difference from the MATLAB-produced file? I can see the filter is significantly compressing some dummy data (hdf5/examples/chunking.rs).
I did find the answer. The filters were working fine as implemented in the crate. However, if you wrote a file out and then tried to parse it via h5py/MATLAB, the filter header information was not packed the same way, causing a malformed-read error. I think it's figured out, but I want to mull over the inputs. I'm on my phone, but the call on the builder has changed. I couldn't find a robust way to get the as-built chunk dims AND the element type from just the plist parameters. Updating the filter call would be a much more significant refactor, I think, as every other filter arg would need to change. Currently I'm inclined to require the user to redundantly specify the data type and chunk dims, but it would be cleaner without it.
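A hedged sketch of what the redundant specification described above might look like on the builder (`zfp` and `ZfpMode` are assumed names, and the signature is illustrative only; this depends on the hdf5 crate's builder pattern and is not confirmed by this thread):

```rust
// Hypothetical sketch — not the PR's actual API.
let ds = file
    .new_dataset::<f32>()      // element type stated by the user
    .chunk((64, 64))
    // Chunk dims restated redundantly, since they could not be
    // recovered robustly from the plist parameters alone.
    .zfp(ZfpMode::FixedRate(8.0), &[64, 64])
    .create("compressed")?;
```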
This should be ready to go, assuming it passes the right tests.
Closes #123
This PR should add ZFP as a compression filter. This is my first contribution to a Rust project, so I would appreciate any review.
This should add ZFP as an optional feature that can be enabled in Cargo and applied on the dataset builder.
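As an illustration of the two usage styles the description alludes to (the feature name and the `zfp_rate`/`zfp_accuracy` methods are assumptions for this sketch, not names confirmed by the PR):

```rust
// Enable the optional feature in Cargo.toml (name is an assumption):
//   hdf5 = { version = "0.8", features = ["zfp"] }

// Fixed-rate mode, e.g. 8 bits per value (hypothetical call):
let ds = file
    .new_dataset::<f64>()
    .chunk((128, 128))
    .zfp_rate(8.0)
    .create("data")?;

// Or fixed-accuracy mode with an absolute error tolerance
// (hypothetical call):
let ds = file
    .new_dataset::<f64>()
    .chunk((128, 128))
    .zfp_accuracy(1e-6)
    .create("data")?;
```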