Error while converting Hmsc model object to JSON: Error in rcpp_to_json(x, unbox, digits, numeric_dates, factors_as_string, : negative length vectors are not allowed
#190
Update: I have multiple model variants for the same locations and species. The conversion to JSON worked for some of them while others failed. The differences between these variants are the knots used (their locations and the distances between them) and the #samples/thin/transient values. I think there should be no problem with file size or with the #samples/thin/transient combinations. I can export similar models that used knot distances of 20 and 40 km, but models with distances of 30, 50, and 60 km failed. It is unclear why the conversion failed only for particular GPP knot configurations. Please note that I ensured that the GPP knot locations do not exactly overlap with the sampling-unit locations by adding a small spatial noise (up to 100 m) whenever a knot happened to coincide with a sampling unit. See this issue. I can share an example model object if this would help.
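For reference, a minimal sketch of that jittering step, under the assumption of plain coordinate matrices; the object names xy and knots and the toy data are placeholders, not taken from the actual scripts:

```r
# Hypothetical sketch: xy = sampling-unit coordinates (metres), knots = GPP knot coordinates
set.seed(1)
xy    <- matrix(runif(20, 0, 1e5), ncol = 2)                     # toy sampling units
knots <- rbind(xy[1:3, ], matrix(runif(10, 0, 1e5), ncol = 2))   # three knots coincide with units

# flag knots that exactly coincide with any sampling unit
overlaps <- apply(knots, 1, function(k) any(k[1] == xy[, 1] & k[2] == xy[, 2]))

# nudge only the overlapping knots by up to 100 m in each coordinate
knots[overlaps, ] <- knots[overlaps, ] +
  matrix(runif(2 * sum(overlaps), -100, 100), ncol = 2)
```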
Your hypothesis that the size of the Hmsc model object being converted to JSON is the core source of the problem seems the most plausible one. We have observed somewhat similar issues with overflowing JSON ourselves. There is definitely no issue with #samples/thin/transient at the stage of the R->HPC export, since these values have no effect on the size of the exported object.
Thanks @gtikhonov for your reply. Earlier, I tried the following distances: 20 and 40 km worked (15K and 4K knots), while 30, 50, and 60 km failed (7K, 2.8K, and 2K knots).
It seems this issue is not directly related to the number of knots used or to object size. The model using 20 km knots is 8.82 GB and works, while smaller models failed (30 km: 4.08 GB; 60 km: 1.58 GB). I uploaded the unfitted models to this link.
The following worked:
The following failed:
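As an aside, a hedged sketch of how knot sets at different spacings could be constructed, assuming Hmsc::constructKnots with knotDist/minKnotDist arguments and coordinates in km; the toy grid and the spacing choices below are placeholders, not the settings used above:

```r
# Hypothetical sketch: GPP knot grids at several spacings via Hmsc::constructKnots
library(Hmsc)

xy <- as.matrix(expand.grid(x = seq(0, 200, by = 10),
                            y = seq(0, 200, by = 10)))   # toy coordinates (km)

knot_sets <- lapply(c(20, 30, 40, 50, 60), function(d) {
  constructKnots(sData = xy, knotDist = d, minKnotDist = 2 * d)
})
sapply(knot_sets, nrow)   # number of knots per spacing
```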
I am preparing data for HMSC-HPC. The model implements GPP at the European scale (52K sampling units) for 142 species and 9 covariates, and I can start sampling with no problem.
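For context, a toy-sized sketch of this kind of setup; all data are simulated placeholders, and engine = "HPC" follows the hmsc-hpc examples and is assumed to be available in the Hmsc version used:

```r
# Hypothetical, toy-sized sketch of a GPP model prepared for Hmsc-HPC export
library(Hmsc)
set.seed(1)

n  <- 200                                                   # stands in for the real 52K units
xy <- matrix(runif(2 * n, 0, 500), ncol = 2,
             dimnames = list(sprintf("su%03d", 1:n), c("x", "y")))
Y  <- matrix(rbinom(n * 5, 1, 0.3), ncol = 5)               # stands in for 142 species
XData <- data.frame(cov1 = rnorm(n), cov2 = rnorm(n))       # stands in for 9 covariates
studyDesign <- data.frame(site = factor(rownames(xy)))

knots <- constructKnots(xy, knotDist = 20, minKnotDist = 40)
rL <- HmscRandomLevel(sData = xy, sMethod = "GPP", sKnot = knots)
m  <- Hmsc(Y = Y, XData = XData, XFormula = ~ cov1 + cov2, distr = "probit",
           studyDesign = studyDesign, ranLevels = list(site = rL))

# engine = "HPC" (as in the hmsc-hpc examples) returns an initialisation object
# for the GPU sampler instead of running the MCMC in R
init_obj <- sampleMcmc(m, samples = 10, thin = 1, transient = 10,
                       nChains = 2, engine = "HPC")
```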
However, I receive the following error when I convert the model object into JSON format.
I have used the same approach for a subset of the data (a smaller study area and fewer species) without a problem. This error could be due to the large object I have [but please see my next comment below].
Is there a solution for this?
Would Hmsc-HPC still work if I used a function other than jsonify to convert the model object to JSON? Thanks.
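The error is raised in jsonify's C++ layer (rcpp_to_json). Below is a hedged sketch of the usual export call and of one alternative that could be tried; init_obj is a placeholder name for the exported initialisation object, the workaround is an assumption rather than a confirmed fix, and whether the Python side of Hmsc-HPC accepts jsonlite's output would still need checking:

```r
# Standard export path, as in the hmsc-hpc examples: jsonify::to_json on the init object
library(jsonify)
saveRDS(to_json(init_obj), file = "init_file.rds")

# Possible alternative to try (assumption, not a confirmed fix): jsonlite serialises through
# a different code path, so it may avoid the overflow raised in rcpp_to_json; digits = 15
# keeps near-full numeric precision
library(jsonlite)
json_str <- toJSON(init_obj, auto_unbox = FALSE, digits = 15)
saveRDS(as.character(json_str), file = "init_file.rds")
```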