H5Centry.c:1232: H5C__load_entry: Assertion `entry->size < H5C_MAX_ENTRY_SIZE' failed. #3762
Hi Gerd,
Can you try increasing the value of H5C_MAX_ENTRY_SIZE in H5Cprivate.h,
and also try the latest file format with the initial value?
Elena
On Tue, Oct 24, 2023 at 8:42 AM, Gerd Heber wrote:
Creating a large number of small datasets fails with the assertion:
H5Centry.c:1232: H5C__load_entry: Assertion `entry->size < H5C_MAX_ENTRY_SIZE' failed.
The error occurs with the current development branch under Debian 12
x86_64 with GCC 13.2.0.
A reproducer is shown below:
#include "hdf5.h"

int main()
{
    hid_t file = H5Fcreate("why.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);

    hsize_t dims[3] = {18, 18, 15};
    hid_t fspace = H5Screate_simple(3, dims, NULL);

    float data[18][18][15];
    for (size_t i = 0; i < 18; i++)
    {
        for (size_t j = 0; j < 18; j++)
        {
            for (size_t k = 0; k < 15; k++)
            {
                data[i][j][k] = (float) (i * j * k);
            }
        }
    }

#ifdef USE_SUB_GROUP
    hid_t group = H5Gcreate(file, "group", H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
#endif

    char name[20];

    /*
     * 1441790 -> file created successfully, h5stat is happy
     *
     * 1441791 -> file created successfully, h5stat barfs:
     *            h5stat: H5Centry.c:1232: H5C__load_entry: Assertion `entry->size < H5C_MAX_ENTRY_SIZE' failed.
     *
     * 1441792 -> dataset creation fails with:
     *            a.out: H5Centry.c:1232: H5C__load_entry: Assertion `entry->size < H5C_MAX_ENTRY_SIZE' failed.
     */
    for (size_t i = 0; i < 1441791; ++i)
    {
        sprintf(name, "data%06zu", i);
        // Using a subgroup doesn't save our bacon. Same error behavior.
#ifdef USE_SUB_GROUP
        hid_t dset = H5Dcreate(group, name, H5T_NATIVE_FLOAT, fspace, H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
#else
        hid_t dset = H5Dcreate(file, name, H5T_NATIVE_FLOAT, fspace, H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
#endif
        H5Dwrite(dset, H5T_NATIVE_FLOAT, H5S_ALL, H5S_ALL, H5P_DEFAULT, data);
        H5Dclose(dset);
    }

    H5Sclose(fspace);
#ifdef USE_SUB_GROUP
    H5Gclose(group);
#endif
    H5Fclose(file);
    return 0;
}
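For context, the cap at issue is a compile-time constant in the library's private headers. Here is a sketch of what Elena's first suggestion amounts to, assuming the define in H5Cprivate.h has roughly this shape (the exact form may differ between versions; the 32 MB value matches the figure she gives below):

/* In H5Cprivate.h (approximate shape; check your source tree).
 * The metadata cache rejects entries at or above this size.
 */
#define H5C_MAX_ENTRY_SIZE ((size_t)(32 * 1024 * 1024))

/* The suggested experiment: raise the cap, e.g. to 64 MB, then
 * rebuild the library:
 */
/* #define H5C_MAX_ENTRY_SIZE ((size_t)(64 * 1024 * 1024)) */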
Good points!

> Can you try to increase the size of H5C_MAX_ENTRY_SIZE in H5Cprivate.h

That works, but it is cheating.

> try the latest format with the initial value?

H5Pset_libver_bounds(fapl, H5F_LIBVER_V110, H5F_LIBVER_V110);

This works (with the initial value), but it's still covering up a deeper problem. And for this "use case," there is also a substantial performance regression.
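For anyone reproducing this, the one-liner above needs a file access property list; a minimal sketch of how it plugs into the reproducer (the fapl setup is not spelled out in the thread):

/* Create the file with the 1.10 file format so that groups use
 * fractal heaps instead of old-style local heaps (error checks
 * omitted for brevity).
 */
hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);
H5Pset_libver_bounds(fapl, H5F_LIBVER_V110, H5F_LIBVER_V110);
hid_t file = H5Fcreate("why.h5", H5F_ACC_TRUNC, H5P_DEFAULT, fapl);
/* ... create the datasets as in the reproducer ... */
H5Pclose(fapl);
H5Fclose(file);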
Well... I am not sure it is cheating... The metadata cache has a limit on
entry size (32 MB) that can be raised by changing the value and recompiling
the library.
The earliest file format allows local heaps to keep growing, and apparently,
at some point a metadata item (a local heap) becomes bigger than 32 MB. The
new file format uses a fractal heap to prevent such a situation. I think the
developers will have better insight.
Elena
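A possible middle ground, sketched here as an assumption rather than something suggested in the thread: link creation order tracking is only supported by the new-style group format, so requesting it on a group creation property list should give that one group a fractal heap without changing the file-wide format bounds.

/* Sketch: force a single new-format (fractal heap) group via its
 * group creation property list, leaving the rest of the file alone.
 */
hid_t gcpl = H5Pcreate(H5P_GROUP_CREATE);
H5Pset_link_creation_order(gcpl, H5P_CRT_ORDER_TRACKED);
hid_t group = H5Gcreate(file, "group", H5P_DEFAULT, gcpl, H5P_DEFAULT);
H5Pclose(gcpl);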
If the library tries to load a metadata object that is above the library's hard-coded limits, the size will trip an assert in debug builds. In HDF5 1.14.4, this can happen if you create a very large number of links in an old-style group that uses local heaps. The library will now emit a normal error when it tries to load a metadata object that is too large. Partially addresses GitHub #3762

We've added a fix for the cache assert in 1.14.4 and will fix the underlying heap issue in 1.14.5.
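Since the assert becomes a normal error, applications can detect the condition in the usual way. A minimal sketch, reusing file, name, fspace, and data from the reproducer above (the recovery strategy is an assumption, not part of the fix):

hid_t dset = H5Dcreate(file, name, H5T_NATIVE_FLOAT, fspace,
                       H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
if (dset < 0) {
    /* Print the HDF5 error stack and stop creating datasets. */
    H5Eprint2(H5E_DEFAULT, stderr);
}
else {
    H5Dwrite(dset, H5T_NATIVE_FLOAT, H5S_ALL, H5S_ALL, H5P_DEFAULT, data);
    H5Dclose(dset);
}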
From the HDF5 1.14.4 release notes, the relevant entry:
* Fixed a cache assert with too-large metadata objects (#4231)