Hello,
when processing multiple files I would like to reuse the dbc and arxml objects used in the extract_can_logging method.
This is for memory and performance reasons:
My arxml can be up to 250,000 lines and 20 MB of data.
Processing more than a few files at once regularly causes memory errors and crashes the program.
Reading the arxml and creating the object is the most time-consuming step when processing a new file.
When processing 100 files, I don't want to read the same arxml file 100 times.
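To make the intended usage concrete, here is a minimal sketch of the pattern I have in mind. It assumes the arxml could be parsed once (here with canmatrix, as a placeholder) and the resulting object passed to extract_can_logging instead of a file path; whether the current API supports that is exactly my question, and the file names are just examples:

```python
import canmatrix.formats
from asammdf import MDF

# Parse the large arxml a single time; this is the expensive step.
# canmatrix.formats.loadp returns a dict of bus name -> CanMatrix object.
databases = canmatrix.formats.loadp("network.arxml")

# Hypothetical list of logging files to decode.
log_files = ["log_001.mf4", "log_002.mf4", "log_003.mf4"]

for path in log_files:
    with MDF(path) as mdf:
        # Assumption: extract_can_logging accepts the already parsed
        # database objects instead of a file path, so the arxml is
        # never re-read from disk inside the loop.
        decoded = mdf.extract_can_logging(databases)
        decoded.save(path.replace(".mf4", "_decoded.mf4"), overwrite=True)
```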
Is there a way to achieve this goal?
Best regards