Data too large error from very large data products #133
Comments
@jordanpadams what's the best way to get a copy of the label for this product?
@alexdunnjpl a ping is out to the user.
@jordanpadams looking deeper into this error, it appears to be due to imminent exhaustion of the JVM heap on OpenSearch, rather than any one request/product being too large. (Presumably RAM allocation is currently 16GB on that node.) The fix here is to bump up the instance size to cope with peak throughput, and/or incorporate pause/retry behaviour in harvest. Closing as a duplicate of #125 on that basis, since the fix for that is a fix for this.
@alexdunnjpl nice sleuthing.
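As a rough illustration of the pause/retry behaviour suggested above, here is a minimal Java sketch that backs off and retries when OpenSearch rejects a bulk request under memory pressure. The endpoint URL, payload, and retry parameters are placeholders, not the actual harvest implementation:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class BulkWithRetry {
    private static final HttpClient CLIENT = HttpClient.newHttpClient();

    // Submit one NDJSON bulk payload, pausing and retrying when the node
    // signals memory pressure instead of failing the whole harvest run.
    static HttpResponse<String> bulkWithRetry(String baseUrl, String ndjson)
            throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(baseUrl + "/_bulk"))
                .header("Content-Type", "application/x-ndjson")
                .POST(HttpRequest.BodyPublishers.ofString(ndjson))
                .build();

        long backoffMillis = 1_000;
        for (int attempt = 0; attempt < 5; attempt++) {
            HttpResponse<String> response =
                    CLIENT.send(request, HttpResponse.BodyHandlers.ofString());
            // HTTP 429 or a circuit_breaking_exception body means the heap
            // is under pressure; wait and retry rather than failing the run.
            boolean overloaded = response.statusCode() == 429
                    || response.body().contains("circuit_breaking_exception");
            if (!overloaded) {
                return response;
            }
            Thread.sleep(backoffMillis);
            backoffMillis *= 2;  // exponential backoff between attempts
        }
        throw new IllegalStateException("bulk request still rejected after retries");
    }
}
```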
@sjoshi-jpl I see that psa is currently
@alexdunnjpl yes, this was recently bumped up based on our last conversation with @jordanpadams and @tloubrieu-jpl, as we discussed how PSA could be as large / resource-intensive as GEO.
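One way to sanity-check whether the resized node is coping at peak ingest is to watch heap pressure via the nodes-stats API. A quick sketch in the same vein (the cluster URL is a placeholder):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class HeapCheck {
    public static void main(String[] args) throws Exception {
        // _nodes/stats/jvm reports jvm.mem.heap_used_percent per node;
        // sustained values near 100% match the "Data too large" symptom.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://opensearch.example:9200/_nodes/stats/jvm"))
                .GET()
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}
```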
Checked for duplicates
Yes - I've already checked
Describe the bug
When I harvested a data set containing some very large data products, I got a "data too large" error and the data was not loaded into the Registry.
Expected behavior
I expected the data to be loaded nominally into the Registry.
To Reproduce
Environment Info
Linux
Version of Software Used
3.7.6
Test Data / Additional context
TBD
Related requirements
No response
Engineering Details
No response