Transfer Learning - Freezing Parameters #679
Hello @shubhamag01, thank you for your interest in our work! Please visit our Custom Training Tutorial to get started, and see our Jupyter Notebook, Docker Image, and Google Cloud Quickstart Guide for example environments. If this is a bug report, please provide screenshots and minimum viable code to reproduce your issue, otherwise we cannot help you. If this is a custom model or data training question, please note Ultralytics does not provide free personal support. As a leader in vision ML and AI, we do offer professional consulting, from simple expert advice up to delivery of fully customized, end-to-end production solutions for our clients.
For more information please visit https://www.ultralytics.com.
@shubhamag01 this is the default behavior when a pretrained model is specified:
@glenn-jocher is it possible to take a pretrained model, remove the last few layers, add new layers, and train with the rest of the untouched layers frozen?
@shubhamag01 you can do whatever you want.
@glenn-jocher is there a difference between `python3 train.py --data coco1cls.data --cfg yolov3-spp.cfg --weights weights/yolov3-spp.pt` and `python3 train.py --data coco1cls.data --cfg yolov3-spp.cfg --weights weights/yolov3-spp.pt --transfer`? I assumed that pretrained weights were the idea behind transfer learning, but then I found the Transfer Learning tutorial, which specifies the `--transfer` flag.
@karen-gishyan that argument does not exist in train.py. See the argparser arguments at the end of train.py for a list of available arguments.
Thanks @glenn-jocher.
@glenn-jocher I had made a custom YOLOv5 model and I ran
@ska6845 I've been asked this multiple times, so I've added a section to train.py that handles freezing parameters (Lines 76 to 83 in e71fd0e). You can add any parameters you want to this list, with full or partial names, to freeze them before training starts. This code freezes all weights, leaving only biases with active gradients:

```python
# Freeze
model.info()
freeze = ['.weight', ]  # parameter names to freeze (full or partial)
if any(freeze):
    for k, v in model.named_parameters():
        if any(x in k for x in freeze):
            print('freezing %s' % k)
            v.requires_grad = False
model.info()
```

Output:

```
Model Summary: 191 layers, 7.46816e+06 parameters, 7.46816e+06 gradients
freezing model.0.conv.conv.weight
freezing model.0.conv.bn.weight
freezing model.1.conv.weight
freezing model.1.bn.weight
...
Model Summary: 191 layers, 7.46816e+06 parameters, 11453 gradients
```
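The same mechanism can be tried in isolation. Here is a minimal, self-contained sketch using a small stand-in `nn.Sequential` instead of a YOLOv5 model (the stand-in architecture is purely illustrative):

```python
import torch.nn as nn

# Small stand-in model (hypothetical; any nn.Module behaves the same way)
model = nn.Sequential(nn.Conv2d(3, 8, 3), nn.BatchNorm2d(8), nn.Conv2d(8, 4, 3))

freeze = ['.weight']  # parameter names to freeze (full or partial matches)
for k, v in model.named_parameters():
    if any(x in k for x in freeze):
        print('freezing %s' % k)
        v.requires_grad = False  # excluded from gradient updates

trainable = [k for k, v in model.named_parameters() if v.requires_grad]
print(trainable)  # only the bias parameters keep active gradients
```

Because the substring match is on parameter names, partial names like `'.weight'` catch every weight tensor (conv and batch-norm alike) while leaving all biases trainable.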
@glenn-jocher TODO: update this to act correctly with optimizer parameter grouping (pg0-pg2): Lines 89 to 91 in e71fd0e
Removing TODO as this fix is incorporated in PR #1239. Layer freezing functionality now operates correctly in all cases. To freeze layers, simply add their names to the freeze list (Lines 83 to 90 in 187f7c2).
@glenn-jocher I wanted to freeze the backbone part of the yolov5l configuration. Could you please tell me how to do it? Also, will freezing the layers help in any way other than decreased training time? I am using COCO pretrained weights for a logo-detection problem.
@pngmafia freezing layers will reduce your mAP. You can add the names of the parameters you'd like to freeze to the freeze list; to view all parameter names, run `print(model.info(verbose=True))`.
@glenn-jocher I don't mind a decrease in mAP. But I need recall to be high. Also, is there a way to give recall more weight than mAP?
@pngmafia recall is not a universal metric; it depends on your confidence threshold. If you want maximum recall, all you need to do is set conf_thres to zero. Then you will have 100% recall.
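The point above can be seen with a toy calculation (the detections, confidences, and ground-truth count below are made up for illustration): lowering the confidence threshold keeps more detections, so more ground-truth objects are matched and recall rises, reaching its maximum at a threshold of zero.

```python
# (confidence, matches_a_ground_truth_object) pairs -- hypothetical detections
detections = [(0.9, True), (0.6, True), (0.3, False), (0.1, True)]
num_gt = 3  # assumed number of ground-truth objects

def recall_at(conf_thres):
    # true positives among detections kept at this threshold
    tp = sum(1 for conf, ok in detections if conf >= conf_thres and ok)
    return tp / num_gt

print(recall_at(0.5))  # 2/3: the 0.1-confidence true positive is discarded
print(recall_at(0.0))  # 1.0: every detection is kept, so recall is maximal
```

The trade-off is precision: at threshold zero the false positive at 0.3 is kept too, which is why maximizing recall alone is rarely the whole objective.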
@glenn-jocher can't I use the fitness function in utils with different weights for those four metrics (P, R, mAP@0.5, and mAP@0.5:0.95)? I see it's 0, 0, 0.1, and 0.9 now.
@pngmafia sure, you can customize the hyperparameter evolution fitness function as you see fit. See the hyperparameter evolution tutorial at https://docs.ultralytics.com/yolov5
@glenn-jocher Does changing those weights to 0, 0.8, 0.1, 0.1 give me better recall compared to 0, 0, 0.1, and 0.9, assuming I use the same confidence threshold for both experiments?
@pngmafia hyperparameter evolution maximizes the fitness function here: Lines 926 to 930 in 187f7c2
Normal training minimizes loss on your training dataset and is unrelated to hyperparameter evolution.
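The weighted fitness idea discussed above can be sketched as a simple dot product over the four metrics; the metric values below are made up for illustration, and only the weight vectors come from the discussion:

```python
import numpy as np

def fitness(x, w=(0.0, 0.0, 0.1, 0.9)):
    # x rows are [P, R, mAP@0.5, mAP@0.5:0.95]; fitness is their weighted sum
    return (x[:, :4] * np.array(w)).sum(1)

metrics = np.array([[0.7, 0.6, 0.55, 0.35]])  # hypothetical validation metrics
print(fitness(metrics))                          # default weights favor mAP@0.5:0.95
print(fitness(metrics, w=(0.0, 0.8, 0.1, 0.1)))  # recall-weighted variant
```

With the default weights the fitness is 0.1*0.55 + 0.9*0.35 = 0.37; with the recall-weighted variant it is 0.8*0.6 + 0.1*0.55 + 0.1*0.35 = 0.57, so evolution would steer hyperparameters toward higher recall.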
@glenn-jocher so this fitness function is only used when we train with the --evolve option? If I'm training normally on my dataset, it really doesn't matter what weights I give in that function, right?
@pngmafia that's correct.
@glenn-jocher thanks for including these changes to freeze params.
@vedal see the Transfer Learning with Frozen Layers tutorial in the YOLOv5 Tutorials.
@glenn-jocher can't believe I missed this! Thanks a lot! :)
Does YOLOv5 support transfer learning?
While training models, is there a possibility to use pretrained weights and modify the last few layers?
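The "pretrained weights + new last layers" pattern asked about here can be sketched in plain PyTorch. Everything below is a hypothetical stand-in: a tiny backbone plays the role of the pretrained network (in practice you would load real weights, e.g. with `torch.load`), and the final layer is replaced with a freshly initialized head for a new number of classes:

```python
import torch.nn as nn

# Stand-in "pretrained" backbone + head (hypothetical architecture)
backbone = nn.Sequential(
    nn.Conv2d(3, 16, 3), nn.ReLU(), nn.AdaptiveAvgPool2d(1), nn.Flatten())
head = nn.Linear(16, 80)  # original head, e.g. 80 COCO classes
model = nn.Sequential(backbone, head)

# Freeze the pretrained backbone, then swap in a new 2-class head
for p in backbone.parameters():
    p.requires_grad = False
model[1] = nn.Linear(16, 2)  # new layer: trainable by default

trainable = [k for k, v in model.named_parameters() if v.requires_grad]
print(trainable)  # only the new head's weight and bias will be updated
```

Passing only the trainable parameters (or relying on `requires_grad`) to the optimizer then trains just the new layers while the pretrained features stay fixed.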