
About the rotation pretraining #23

Open
corwinliu9669 opened this issue Sep 24, 2021 · 3 comments
Comments

@corwinliu9669

I am trying to reproduce your results on miniImageNet and tieredImageNet. I can reproduce the S2M2 result on miniImageNet with the provided rotation weights. But when I train the rotation task myself, the results cannot match the performance of the provided rotation weights. I wonder whether you trained the rotation task with multiple GPUs, or whether there are other tricks. I also noticed that the fc dimension is 200 for miniImageNet, which seems odd; I would expect 64, since miniImageNet has 64 base classes. Furthermore, I could not find the rotation weights for tieredImageNet. Could you kindly release them?

@doris797

Can you tell me how you fine-tune on the novel classes?

@nupurkmr9
Owner

Hi, for the miniImageNet and tieredImageNet datasets with rotation self-supervision, we train for 400 and 100 epochs respectively. The batch size is kept at 64, and the train_aug flag is enabled during backbone training.
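(For anyone reproducing this: rotation self-supervision is typically set up by rotating each training image by 0/90/180/270 degrees and predicting the rotation index as an auxiliary classification target. A minimal numpy sketch of the batch construction — the function name `make_rotation_batch` is illustrative, not from this repo:)

```python
import numpy as np

def make_rotation_batch(images):
    """Given a batch of images (N, H, W, C), return all four rotations
    (4N, H, W, C) plus rotation labels in {0, 1, 2, 3}, where label k
    means a rotation of k * 90 degrees."""
    rotated, labels = [], []
    for img in images:
        for k in range(4):
            rotated.append(np.rot90(img, k=k, axes=(0, 1)))
            labels.append(k)
    return np.stack(rotated), np.array(labels)

# toy batch of 2 RGB images
batch = np.random.rand(2, 32, 32, 3)
x, y = make_rotation_batch(batch)
print(x.shape, y.tolist())  # (8, 32, 32, 3) [0, 1, 2, 3, 0, 1, 2, 3]
```

During backbone training, the 4N rotated images are fed through the network and a rotation-classification loss on these labels is added to the usual class-label loss.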
When evaluating on novel classes, only a linear classifier is trained on top of the backbone features. The "Few-shot evaluation" section of the README lists the commands for this: save_features.py saves the features, and test.py trains a linear classifier over these features.
Hope this resolves the doubts regarding training and novel-class evaluation.
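(To make the novel-class step concrete: the backbone stays frozen, and only a linear softmax head is fit on the saved features. A minimal numpy sketch of that final step, using random stand-in features instead of real save_features.py output — the shapes and hyperparameters are illustrative:)

```python
import numpy as np

rng = np.random.default_rng(0)

# stand-in for backbone features saved by save_features.py:
# 5 novel classes ("5-way"), 5 labelled examples each ("5-shot")
n_way, n_shot, feat_dim = 5, 5, 64
means = rng.normal(scale=3.0, size=(n_way, feat_dim))
y = np.repeat(np.arange(n_way), n_shot)
X = means[y] + rng.normal(size=(n_way * n_shot, feat_dim))

# train a linear softmax classifier on the frozen features
W = np.zeros((feat_dim, n_way))
b = np.zeros(n_way)
lr = 0.1
for _ in range(200):
    logits = X @ W + b
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    grad = p - np.eye(n_way)[y]          # dL/dlogits for cross-entropy
    W -= lr * (X.T @ grad) / len(y)
    b -= lr * grad.mean(axis=0)

acc = ((X @ W + b).argmax(axis=1) == y).mean()
print(f"support accuracy: {acc:.2f}")
```

Only `W` and `b` are ever updated here, which matches the answer above: the feature extractor is trained once on the base classes and never fine-tuned on the novel episodes.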

@doris797

Thank you very much! So is it correct to say that the novel-class classifier is trained on only a few samples, while the feature extractor is not?
