Object storage sizes #173
What is your use case for storing these objects? One reason for the size is that the Predictor is part of Interaction / FeatureEffects. But that does not fully explain the size; maybe it is stored more than once.
The use case is that I don't want to invest the run time again; I want the objects available later, e.g. for plotting, or for printing in comparison to numbers calculated elsewhere.
I have not tried it yet, but you could try setting the predictor to NULL:
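A minimal sketch of what that could look like. It assumes the iml result objects expose the stored Predictor in a public `predictor` field (inspect the object, e.g. with `names(inter)`, to confirm), and uses a small randomForest model purely for illustration:

```r
library(iml)
library(randomForest)

# Fit a small model and wrap it in an iml Predictor
rf <- randomForest(Species ~ ., data = iris, ntree = 50)
pred <- Predictor$new(rf, data = iris, y = "Species")

inter <- Interaction$new(pred)

# Drop the reference to the Predictor (and thus the wrapped model and data)
# before saving, so they are not serialized along with the result.
# Field name 'predictor' is an assumption -- check your object's structure.
inter$predictor <- NULL

print(object.size(inter), units = "KB")
saveRDS(inter, "interaction.rds")
```

Note that after nulling the field, methods that need the model (e.g. recomputing) will no longer work; the stored results and plots based on them should remain usable.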
Thank you for the proposal. After setting the predictor to NULL [...], I think it would be highly desirable that output objects for interactions and feature effects are more parsimonious by default (green ML!). By the way, from within R I found it quite difficult to assess object sizes.
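For the last point, a small sketch of ways to assess object sizes from within R. The `lobstr` package is an assumption here (it is not part of base R):

```r
# Example object
x <- matrix(rnorm(1e6), ncol = 100)

# Base R: quick estimate, but can miss or double-count shared
# components such as environments referenced by R6 objects
print(object.size(x), units = "MB")

# lobstr accounts more accurately for shared references
# (assumes the lobstr package is installed)
lobstr::obj_size(x)

# Size as actually serialized, which is what matters for saved workspaces
length(serialize(x, NULL)) / 1024^2  # size in MB
```

For R6 objects like those returned by iml, the serialized size is often the most honest measure, since environments captured by the object are included.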
I have been working with an interaction forest (`intaus`) on 2000 observations that consists of the default 20000 trees (from the package diversityForest). This forest uses 306,682 KB of disk space. I have applied `Interaction$new` and `FeatureEffects$new` to that forest and stored the resulting objects on disk (R workspaces with a single object each). I end up with the following stored object sizes: [...]

To me, these sizes appear excessive. I wonder what functionalities of these objects I might miss that would justify these huge object sizes. Or would it perhaps be possible for `Interaction$new` and `FeatureEffects$new` to return smaller objects without sacrificing functionality?

Best, Ulrike