CodeCamp #33 [Fix]: add type hints for res_layer, se_layer, normed_predictor, positional_encoding #9346
Conversation
@NoFish-528 Thanks for your PR! You can follow the comments to update your code.
Co-authored-by: BigDong <[email protected]>
I have updated my code. :) @BIGWangYuDong
Line 109 should be updated.
@@ -114,19 +119,19 @@ class SimplifiedBasicBlock(BaseModule):
expansion = 1
Line 109: `super().__init__(*layers)`
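For readers following along, here is a minimal sketch of the call style this comment refers to, using a plain `nn.Sequential` subclass; the class and argument names below are illustrative, not the actual `ResLayer` code:

```python
from typing import List

import torch.nn as nn


class StackedConvs(nn.Sequential):
    """Hypothetical ResLayer-style stack built as an nn.Sequential subclass."""

    def __init__(self, num_convs: int, channels: int) -> None:
        layers: List[nn.Module] = [
            nn.Conv2d(channels, channels, kernel_size=3, padding=1)
            for _ in range(num_convs)
        ]
        # The pattern the comment points at: unpack the assembled layers
        # into the parent Sequential constructor.
        super().__init__(*layers)
```

Because `nn.Sequential` accepts modules positionally, `super().__init__(*layers)` is the natural way to forward them.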
add type hints for res_layer, se_layer, normed_predictor, positional_encoding
- All `Default:` should be changed to `Defaults to`.
- Some type hints are not added yet.
- Add `xx=xx` in some necessary places, such as `super().xxx(xxx=xxx)`; see the sketch below.
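A rough illustration of these points on a made-up module (not code from this PR): note the `Defaults to 1.0.` phrasing in the Args section, the fully annotated `__init__`, and the explicit keyword arguments.

```python
import torch.nn as nn
from torch import Tensor


class ScaledConv(nn.Module):
    """Hypothetical 1x1 conv whose output is multiplied by a fixed scale.

    Args:
        in_channels (int): Number of input channels.
        out_channels (int): Number of output channels.
        scale (float): Multiplier applied to the output. Defaults to 1.0.
    """

    def __init__(self,
                 in_channels: int,
                 out_channels: int,
                 scale: float = 1.0) -> None:
        super().__init__()
        # Keyword arguments make the call self-documenting, in the spirit
        # of the super().xxx(xxx=xxx) suggestion above.
        self.conv = nn.Conv2d(
            in_channels=in_channels,
            out_channels=out_channels,
            kernel_size=1,
            bias=False)
        self.scale = scale

    def forward(self, x: Tensor) -> Tensor:
        return self.conv(x) * self.scale
```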
@@ -50,23 +56,23 @@ class NormedConv2d(nn.Conv2d):
eps (float, optional): The minimal value of divisor to
keep numerical stability. Default to 1e-6.
norm_over_kernel (bool, optional): Normalize over kernel.
Default to False.
Default: False.
Default: -> Defaults to
eps: float = 1e-6,
offset: float = 0.,
init_cfg: OptMultiConfig = None) -> None:
super().__init__(init_cfg)
init_cfg=init_cfg
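A self-contained sketch of the suggested keyword-style call; `FakeBaseModule` below is a local stand-in for the real base class, assuming its constructor takes `init_cfg`:

```python
from typing import List, Optional, Union

ConfigType = Union[dict, List[dict]]  # rough stand-in for mmdet's OptMultiConfig


class FakeBaseModule:
    """Local stand-in for the real base class, used only in this sketch."""

    def __init__(self, init_cfg: Optional[ConfigType] = None) -> None:
        self.init_cfg = init_cfg


class EncodingSketch(FakeBaseModule):
    """Illustrative module mirroring the constructor in the diff above."""

    def __init__(self,
                 eps: float = 1e-6,
                 offset: float = 0.,
                 init_cfg: Optional[ConfigType] = None) -> None:
        # Forward the config by keyword rather than by position, as the
        # review comment suggests.
        super().__init__(init_cfg=init_cfg)
        self.eps = eps
        self.offset = offset
```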
@@ -54,7 +56,7 @@ def __init__(self,
self.eps = eps
self.offset = offset

def forward(self, mask):
def forward(self, mask) -> Tensor:
Please also add the type of `mask`.
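For example, if `mask` is a padding mask tensor, the annotated signature could look like this (toy module, not the real positional encoding):

```python
import torch
import torch.nn as nn
from torch import Tensor


class MaskEncoderSketch(nn.Module):
    """Toy stand-in used only to show the annotated signature."""

    def forward(self, mask: Tensor) -> Tensor:
        # Annotate the padding mask explicitly; here 1 marks padded pixels.
        not_mask = 1 - mask.int()
        return not_mask.float()


# example call with a bool padding mask of shape (batch, h, w)
out = MaskEncoderSketch()(torch.zeros(1, 4, 4, dtype=torch.bool))
```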
col_num_embed: int = 50,
init_cfg: MultiConfig = dict(type='Uniform', layer='Embedding')
) -> None:
super().__init__(init_cfg)
init_cfg=init_cfg
mmdet/models/layers/se_layer.py
@@ -140,7 +140,7 @@ class ChannelAttention(BaseModule):
Args:
channels (int): The input (and output) channels of the attention layer.
init_cfg (dict or list[dict], optional): Initialization config dict.
Defaults to None.
Defaults: None
Defaults to None
@BIGWangYuDong Thank you for your suggestions. I have updated the code accordingly.
There are some remaining issues in this PR:

- `Default to` -> `Defaults to`. It should be `Defaults to` in docstrings. (This needs to be checked throughout the PR.)
- It is suggested to add docstrings in these Python files. I will give you an example below, which may help you.
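As a rough illustration of the kind of docstring being asked for (a hypothetical module, not the reviewer's actual example):

```python
import torch.nn as nn
from torch import Tensor


class ExampleHead(nn.Module):
    """Hypothetical head that maps features to per-class scores."""

    def __init__(self, in_channels: int, num_classes: int) -> None:
        super().__init__()
        self.fc = nn.Linear(in_channels, num_classes)

    def forward(self, feats: Tensor) -> Tensor:
        """Compute classification scores.

        Args:
            feats (Tensor): Input features with shape (N, in_channels).

        Returns:
            Tensor: Class scores with shape (N, num_classes).
        """
        return self.fc(feats)
```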
LGTM
…nal_encoding (open-mmlab#9346)

* [Fix]: add type hints for res_layer
* Update mmdet/models/layers/res_layer.py
  Co-authored-by: BigDong <[email protected]>
* [Fix]: add type hint of norm1 and norm2
* [WIP]: add res_layer type hints
* [WIP]: add layer type hints about issue 9234
* [FIX]: add all type hints and change some function note
* [FIX]: add docstrings and change default -> defaults

Co-authored-by: BigDong <[email protected]>
Thanks for your contribution; we appreciate it a lot. The following instructions will help keep your pull request healthy and make it easier to get feedback. If you do not understand some items, don't worry, just open the pull request and ask the maintainers for help.
Motivation
This is for the 超级视客营 mmdet basic task: choose any part of the code from the linked list and add type hints. This PR adds type hints for the res_layer in part 6.
Modification
Added type hints for ResLayer and SimplifiedBasicBlock.
BC-breaking (Optional)
Does the modification introduce changes that break the backward-compatibility of the downstream repos?
If so, please describe how it breaks the compatibility and how the downstream projects should modify their code to keep compatibility with this PR.
Use cases (Optional)
If this PR introduces a new feature, it is better to list some use cases here, and update the documentation.
Checklist