Fix RT-DETR cache for generate_anchors #31671
Conversation
Thanks for handling this!
(Did you also finetune RT-DETR with bfloat16? Is it okay?)
No, I didn't. I tried bfloat16 only for inference 🙂
Thanks for fixing!
```diff
@@ -1656,7 +1656,10 @@ def unfreeze_backbone(self):
         param.requires_grad_(True)

+    @lru_cache(maxsize=32)
-    def generate_anchors(self, spatial_shapes=None, grid_size=0.05, dtype=torch.float32, device="cpu"):
+    def generate_anchors(self, spatial_shapes=None, grid_size=0.05):
+        # We always generate anchors in float32 to preserve original model code
```
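The change above can be sketched as follows. This is a simplified illustration of the caching pattern, not the actual transformers RT-DETR code: the class name and the anchor math are made up for the example. The key points it shows are that `lru_cache` requires hashable arguments (so `spatial_shapes` must be a tuple of tuples, not a list), and that anchors are always built in float32 so the cached result is independent of the model's runtime dtype.

```python
from functools import lru_cache

import torch


class AnchorGenerator:
    # Illustrative sketch only; not the transformers RT-DETR implementation.

    @lru_cache(maxsize=32)
    def generate_anchors(self, spatial_shapes=None, grid_size=0.05):
        # spatial_shapes must be hashable (tuple of tuples) for lru_cache.
        # Anchors are always generated in float32; callers cast to the
        # model dtype/device afterwards, so the cache stays dtype-agnostic.
        if spatial_shapes is None:
            spatial_shapes = ((32, 32), (16, 16), (8, 8))
        anchors = []
        for level, (height, width) in enumerate(spatial_shapes):
            grid_y, grid_x = torch.meshgrid(
                torch.arange(height, dtype=torch.float32),
                torch.arange(width, dtype=torch.float32),
                indexing="ij",
            )
            # Normalized (cx, cy) centers for each cell of this feature level.
            grid_xy = torch.stack([grid_x, grid_y], dim=-1)
            grid_xy = (grid_xy + 0.5) / torch.tensor(
                [width, height], dtype=torch.float32
            )
            # Per-level anchor size grows with the feature-pyramid level.
            wh = torch.ones_like(grid_xy) * grid_size * (2.0 ** level)
            anchors.append(torch.cat([grid_xy, wh], dim=-1).reshape(1, -1, 4))
        return torch.cat(anchors, dim=1)


gen = AnchorGenerator()
a1 = gen.generate_anchors(((16, 16), (8, 8)))
a2 = gen.generate_anchors(((16, 16), (8, 8)))
assert a1 is a2  # cache hit: the same tensor object is returned
assert a1.dtype == torch.float32
```

Because the cache key is the argument tuple, repeated calls with the same image size hit the cache instead of regenerating anchors, which is exactly the "only generate once per image size" behavior the PR description mentions.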
Let's update this to reflect the true reason: preserving equivalence between dynamic and static anchor inference. We don't really care about whether our code matches the original model's, just that equivalent logic is used
What does this PR do?
For the RT-DETR model:
- Adds `lru_cache` to the `generate_anchors` method, so even with dynamic anchor generation, anchors are only generated once per image size.

Who can review?
cc @SangbumChoi
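Since the cached anchors are always float32, a caller running lower-precision inference (e.g. the bfloat16 case discussed above) would cast the cached result at use time rather than regenerate it. The helper below is hypothetical, written to show the idea; the real model code performs this cast internally.

```python
import torch


def prepare_anchors(anchors_fp32, model_dtype, device):
    # Hypothetical helper: cast cached float32 anchors to the model's
    # dtype and device at use time, leaving the cached tensor untouched.
    return anchors_fp32.to(device=device, dtype=model_dtype)


# Example: a cached float32 anchor tensor cast for bfloat16 inference.
anchors = torch.rand(1, 300, 4, dtype=torch.float32)
bf16_anchors = prepare_anchors(anchors, torch.bfloat16, "cpu")
assert bf16_anchors.dtype == torch.bfloat16
assert anchors.dtype == torch.float32  # cache contents are unchanged
```

Keeping the cache dtype-agnostic is what preserves equivalence between dynamic and static anchor inference: both paths start from identical float32 anchors regardless of the model's precision.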