Trainers: skip weights and augmentations when saving hparams #1670
Conversation
Fix for #1622
So this fixes MoCo, but not SimCLR. Would love a more flexible way of doing this that only affects one trainer instead of all of them. We could either move [...] Also want to add tests. Can modify one of the [...]
@dylanrstewart I extended your idea to make it more generalizable. Hope you don't mind. Now subclasses can decide exactly which arguments they want to ignore.
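For context, a minimal sketch of that mechanism, assuming a Lightning-style base trainer. The class and attribute names (`BaseTask`, `ignore`) mirror the shape of the change described here, but treat the exact API as an assumption rather than the merged code:

```python
from abc import ABC
from collections.abc import Sequence
from typing import Optional, Union

from lightning.pytorch import LightningModule


class BaseTask(LightningModule, ABC):
    """Sketch of a base trainer that skips selected init args."""

    # Arguments that should *not* be stored by save_hyperparameters.
    # Subclasses override this to skip whatever they need to.
    ignore: Optional[Union[Sequence[str], str]] = "weights"

    def __init__(self) -> None:
        super().__init__()
        # Lightning's save_hyperparameters accepts an ``ignore``
        # argument, so unpicklable objects (pretrained weights,
        # augmentation callables) never end up in self.hparams.
        self.save_hyperparameters(ignore=self.ignore)


class MoCoTask(BaseTask):
    """Sketch: a trainer that also skips its augmentation args."""

    ignore = ("weights", "augmentation1", "augmentation2")
```

Because the skip list is a class attribute, each trainer opts in to exactly the arguments it wants excluded, instead of one hard-coded list affecting every trainer.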
Also added a test that would fail on main but ensures that augmentation overrides work on this branch.
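The test is roughly along these lines (a hedged sketch; the real test name and trainer signature may differ, and the `augmentation1`/`augmentation2` parameters are assumptions based on the discussion above):

```python
import kornia.augmentation as K

from torchgeo.trainers import MoCoTask  # assumed import path


def test_custom_augmentation_override() -> None:
    """A user-supplied augmentation should survive hparam saving."""
    aug = K.RandomResizedCrop((224, 224))
    task = MoCoTask(model="resnet18", augmentation1=aug, augmentation2=aug)

    # On main, save_hyperparameters tried to store the augmentation and
    # the override was lost; on this branch it is skipped instead.
    assert "augmentation1" not in task.hparams
    assert task.augmentation1 is aug
```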
@dylanrstewart if these changes look good to you, can you agree to the CLA? That's the only remaining thing preventing this PR from being merged. See https://github.com/microsoft/contributorlicenseagreement#accepting for instructions.
@microsoft-github-policy-service agree |
Not sure why CLA bot isn't working. Usually closing and reopening works. |
Can you try accepting again? |
Ping @dylanrstewart, can you try accepting the CLA again?
@microsoft-github-policy-service agree |
* Update base.py to fix for custom augmentations
* Allow subclasses to ignore specific arguments
* Fix typing
* Save to self.weights
* pyupgrade
* Add test
* Save weights

---------

Co-authored-by: Adam J. Stewart <[email protected]>
Fixes #1622
Fixes #1639