I noticed that LZ4 decompression via hdf5plugin 4.1.0 and later is 5-6x slower than with hdf5plugin 4.0.1, while compression speed is very similar.
Running the same write/read benchmark gives the following results for different hdf5plugin versions with h5py 3.12.1 on Python 3.11.9 on Windows 10 (AMD Ryzen 7 5900X):
hdf5plugin 4.0.1: lz4 compression time: 0.219 s, lz4 decompression time: 0.283 s
hdf5plugin 4.1.0: lz4 compression time: 0.226 s, lz4 decompression time: 1.630 s
hdf5plugin 5.0.0: lz4 compression time: 0.221 s, lz4 decompression time: 1.610 s
I have seen similar results on Python 3.8 and 3.11 on Debian 12 with different h5py versions.
I would have expected a substantial speedup after updating to version 5.0, which bundles lz4 1.10 with its new multithreaded decompression, compared to 4.1.x. However, decompression takes the same time in 4.1.x and 5.0, so the multithreaded lz4 decompression does not seem to be used.
Dalbasar changed the title to "Performance regression for lz4 decompression after version 4.0.1" on Oct 29, 2024.