Proposed improvements to the API #64
base: main
Conversation
Mostly looking good, though I'm not completely sure yet about one of the changes.
@@ -109,6 +110,7 @@ class Model(abc.ABC):

     def __init__(self):
         self._imagenet_validation_result = None
+        self.verbose = False
Instead of storing this as a class variable, we could also create a global flag that controls this behavior for all models at the same time. I'm not sure yet which option I prefer.
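For illustration, here is a minimal sketch of the two options being discussed; the class and flag names are hypothetical, not the actual package layout:

# Option A: a per-instance flag, as in the diff above.
class Model:
    def __init__(self):
        self.verbose = False  # each model instance controls its own output


# Option B (hypothetical): a single module-level flag shared by all models;
# a script would set it once for the whole run.
VERBOSE = False


class GloballyConfiguredModel:
    def evaluate(self):
        if VERBOSE:
            print("evaluating ...")

With option B, a single assignment at the start of a run would switch verbosity for every model evaluated in that run.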
Yes, I see your point - I think a global flag would be even better. So far, it seems that we don't have use cases where several models are evaluated, but if we do at some point, it would be a bit cleaner to do it the way you suggest.
Hi both, I think the ability to configure these options will also come in handy for other tasks. For example, if this task should be evaluated verbosely, then probably all other tasks in the same run should be too.
I made a proposal for adding global config options to the package. In case we go ahead and merge it, @valentyn1boreiko, you would simply use the new shifthappens.config.verbose
option instead of defining this option here.
Let me know what you think (ideally directly on the referenced PR).
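As a rough sketch (the actual layout in the referenced PR may differ), such a config module could be as simple as a few module-level attributes:

# shifthappens/config.py -- hypothetical sketch of a package-wide config module
"""Global configuration options for the shifthappens package."""

# Whether models and tasks print progress information during evaluation.
verbose = False

A model or task would then check shifthappens.config.verbose at runtime, and a script could set the flag once at startup to switch verbosity for the whole run.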
Hi @valentyn1boreiko, #75 has been merged now.
What do you think about the proposed solution? Do you think you could leverage it to adapt this PR?
In case you don't find the time, @kotekjedi might be able to help with this!
Hi @stes, I added two lines,
import shifthappens.config
and
if shifthappens.config.verbose:
in models/base.py. This should be enough, right?
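For context, the resulting check in models/base.py might look roughly like this (the method shown is a simplified placeholder, not the real implementation):

import shifthappens.config


class Model:
    def _run_imagenet_validation(self):
        # Hypothetical placeholder for the actual evaluation logic in base.py.
        if shifthappens.config.verbose:
            print("Evaluating model on the ImageNet validation set ...")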
Hi @stes,
I added
available_models_dict = {'resnet18': ResNet18,
                         'resnet50': ResNet50,
                         'vgg16': VGG16}
in worst_case.py.
I think it would make sense to have a model card somewhere in the global config, so that one could easily choose a model via a script parameter, right?
Should I open a separate PR for this, or could you add it?
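One way such a mapping could be tied to a script parameter (the model classes are taken from the snippet above and are assumed to be imported elsewhere in worst_case.py; the argument parsing is only an illustration):

import argparse

# ResNet18, ResNet50 and VGG16 are assumed to be defined/imported in worst_case.py.
available_models_dict = {'resnet18': ResNet18,
                         'resnet50': ResNet50,
                         'vgg16': VGG16}

parser = argparse.ArgumentParser()
parser.add_argument('--model', choices=sorted(available_models_dict),
                    default='resnet18', help='Name of the model to evaluate.')
args = parser.parse_args()

model = available_models_dict[args.model]()  # instantiate the selected model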
Hi all (@valentyn1boreiko @zimmerrol @kotekjedi), what is the status here? Are there open issues we should discuss before moving forward? I also see the potential of combining this with the changes proposed in #75.
I am still looking at #75. I think the issue with global variables is addressed in the proposed changes.
Docstring Coverage Report
RESULT: PASSED (minimum: 0.0%, actual: 97.3%)
Latest changes from the main repo
# Conflicts:
#   setup.cfg
#   shifthappens/data/imagenet.py
@valentyn1boreiko What is the status of this PR? It looks like you (accidentally?) added files that belong to the same task to this PR. Do you mind cleaning up the PR so that we can merge it soon?
I propose these improvements: