Error with malloc and memory limit #522
-
Hello, I have received a new version of the model and I found a "random" error related to the outputs. Edited by Chris: Adding a text snippet of the error message here for searchability:
After removing some variables from the output.csv file, I realized that the problem was not caused by any of them but by having too many. I managed to add `-sASSERTIONS=1` to the wasm plugin:

```js
wasmPlugin({
  emccArgs: [
    "-Wall",
    "-Os",
    "-sSTRICT=1",
    "-sMALLOC=emmalloc",
    "-sFILESYSTEM=0",
    "-sMODULARIZE=1",
    "-sSINGLE_FILE=1",
    "-sEXPORT_ES6=1",
    "-sUSE_ES6_IMPORT_META=0",
    "-sENVIRONMENT='web,webview,worker'",
    "-sEXPORTED_FUNCTIONS=['_malloc','_free','_getInitialTime','_getFinalTime','_getSaveper','_setLookup','_runModelWithBuffers']",
    "-sEXPORTED_RUNTIME_METHODS=['cwrap']",
    "-sASSERTIONS=1"
  ]
}),
```

(Note: I recommend including an example of this in the documentation, since I didn't know that `-s` params are written without a space, i.e. `-sASSERTIONS=1` rather than `-s ASSERTIONS=1`. I can add it if you want.)

Edited by Chris: Adding a text snippet of the error message here for searchability:

Which, as suspected, is due to excess memory consumption. Reducing the number of output variables seems to fix the problem.
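Since emcc flags are sensitive to stray whitespace (the `-s PARAM` vs. `-sPARAM` issue mentioned in the note above), a small defensive helper can normalize the array before it is handed to the plugin. This `normalizeEmccArgs` helper is hypothetical, not part of the wasm plugin's API:

```js
// Hypothetical helper: strip stray whitespace from emcc flag strings so
// entries like "-sSTRICT=1 " don't reach emcc with a trailing space,
// and drop any empty entries left over after trimming.
function normalizeEmccArgs(args) {
  return args.map((arg) => arg.trim()).filter((arg) => arg.length > 0);
}

console.log(normalizeEmccArgs(["-Os ", "-sSTRICT=1 ", ""]));
// → [ '-Os', '-sSTRICT=1' ]
```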
Replies: 2 comments
-
Hi @serman, thanks for submitting this. I agree that this info can be helpful to others who might run into similar issues, so I converted this to a discussion thread and edited your post to include text versions of the error messages you saw for better searchability. For the `-s` args, I filed a separate issue #523 to improve the documentation. I think that either solution, `-sINITIAL_MEMORY=32MB` or `-sALLOW_MEMORY_GROWTH=1`, would be fine. I don't think 32MB should be too much for modern browsers/devices. We haven't hit the limit with the default settings in En-ROADS or C-ROADS, but I'm not surprised for it to happen with large models like yours. We could consider changing the default args to include one of these flags. I will think about this some more and will consider filing an issue about changing the default parameters, but for now, glad that you were able to figure out emcc args that work well for your use case.
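For reference, either option would slot into the same `emccArgs` array shown in the original post; a minimal sketch, assuming the same plugin config shape:

```js
wasmPlugin({
  emccArgs: [
    // ...the other flags from the config in the original post...
    // Option 1: reserve a larger fixed heap up front (emcc's default is 16MB)
    "-sINITIAL_MEMORY=32MB",
    // Option 2: allow the heap to grow on demand at runtime
    "-sALLOW_MEMORY_GROWTH=1"
  ]
}),
```

The two flags are not mutually exclusive: `-sINITIAL_MEMORY` sets the starting heap size, while `-sALLOW_MEMORY_GROWTH=1` lets the heap expand beyond it at runtime (with a small cost each time it grows).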
-
Thanks for the editing. I think the main problem is not the model size (which is, of course, important) but the fact that I need to read many variables for 8 regions (subscripts). Currently I have 491 variables in the outputs.csv. I also noticed that in the latest model version that I received, the SAVEPER variable was incredibly small, so it was trying to allocate memory space for too many points for each variable. After I reduced SAVEPER to 1 (I don't need more than one data point per year), it seems it doesn't reach the memory limit.
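The arithmetic behind this is easy to sketch, assuming one 64-bit value per output variable per save point. The function name and the 1990–2100 time range below are illustrative only; the actual model's time range isn't given in this thread:

```js
// Rough estimate of the output buffer size a generated model needs.
function outputBufferBytes(initialTime, finalTime, saveper, numOutputs) {
  // Number of save points, inclusive of both the initial and final time
  const savePoints = Math.floor((finalTime - initialTime) / saveper) + 1;
  // One 64-bit float (8 bytes) per output variable per save point
  return numOutputs * savePoints * 8;
}

// 491 output variables: a small SAVEPER vs. SAVEPER=1
console.log(outputBufferBytes(1990, 2100, 0.125, 491)); // 3460568 (~3.5 MB)
console.log(outputBufferBytes(1990, 2100, 1, 491)); // 436008 (~0.4 MB)
```

The buffer size scales inversely with SAVEPER, which matches the behavior described above: halving SAVEPER roughly doubles the memory needed for outputs.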