HDF corrupts data when using complib='blosc:zlib' #8874
Comments
hmm, I don't think you can pass in a compressor like that, it's either 'blosc' OR 'zlib'
related to this issue: #4582
You can (sometimes) in pytables - that doesn't mean it is OK for pandas though. One example is here.
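For reference, a minimal sketch of what PyTables itself accepts, assuming a recent PyTables release where the combined 'blosc:*' compressor ids are supported (the file and node names are illustrative):

```python
import numpy as np
import tables

# PyTables' Filters accepts combined compressor ids such as 'blosc:zlib'.
filters = tables.Filters(complevel=9, complib='blosc:zlib')

with tables.open_file('demo.h5', mode='w', filters=filters) as h5:
    # Store a small compressed array under the root group.
    h5.create_carray('/', 'x', obj=np.random.randn(1000))
```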
Interesting, I didn't know that. Sorry, as you said, this is pretty much passed straight through to pytables.
Description:
Code:
So this is something I don't really understand about HDF5. You can open the file with compression and/or compress an individual node by using a compressed storage format, but this is not allowed for certain types (e.g. Panel) because of how they are stored. So you can only store via 'table' format, not 'fixed' (I don't remember specifically why). So you can use what I put above to actually store it (via table format). Separately, I think this is not reporting the errors correctly when you try to compress the entire store.
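For example, a minimal sketch of the per-node, table-format approach described above (the file and key names are illustrative):

```python
import numpy as np
import pandas as pd

df = pd.DataFrame(np.random.randn(1000, 4), columns=list('abcd'))

# Per-node compression works when the node is written in 'table' format.
with pd.HDFStore('store.h5') as store:
    store.put('df', df, format='table', complevel=9, complib='blosc')
```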
Thanks - hard to change old habits, but will use table in the future. |
Add check for complib when opening a HDFStore closes pandas-dev#4582 closes pandas-dev#8874
closed by #10341
I'm not sure if this is supported or not -- it isn't in the doc string for HDFStore, but it seems to be allowed by the HDFStore (nothing is raised).
Unfortunately, so far I can only get it to show the bad behavior on a proprietary dataset, which stores a pd.Panel containing items of mixed types. Some of the float values are changed from small (|x| < 1.0) to very large (3.xe+308).
Is the compressor just passed through to pytables? If so, this might be a pytables issue.
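To illustrate the report, here is a minimal sketch of the call pattern involved, assuming a pandas version in which pd.Panel still exists; it is a hypothetical stand-in, not a reproduction of the corruption:

```python
import numpy as np
import pandas as pd

# Simplified, hypothetical stand-in for the proprietary dataset; it only
# shows the call that triggered the issue, not the data corruption itself.
panel = pd.Panel(np.random.randn(2, 5, 4))

# Whole-store compression with the combined compressor id that HDFStore
# accepted silently before the check added by #10341.
panel.to_hdf('store.h5', 'panel', complevel=9, complib='blosc:zlib')
```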