[Feature] Add attention module of CBAM #246
Conversation
Added tutorials on the use of the attention module
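For context, plugins of this kind are enabled per backbone stage through a plugins list in the model config. A minimal, hypothetical sketch of such usage (the stage flags mirror the test snippet later in this thread; the surrounding config keys are illustrative, not taken from this PR):

# Hypothetical config: attach CBAM to the last two backbone stages.
model = dict(
    backbone=dict(
        plugins=[
            dict(
                cfg=dict(type='CBAM'),
                # One flag per stage; the plugin is inserted where True.
                stages=(False, False, True, True))
        ]))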
Please fix the lint.
mmyolo/models/plugins/cbam.py
Outdated
super(ChannelAttention, self).__init__()
self.avg_pool = nn.AdaptiveAvgPool2d(1)
self.max_pool = nn.AdaptiveMaxPool2d(1)
self.conv1 = ConvModule(
It would be better to write conv1 and conv2 as an nn.Sequential.
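A rough sketch of the suggested refactor, assuming mmcv's ConvModule and the usual squeeze-style bottleneck (channel counts and arguments are placeholders, not the PR's exact values):

# Hypothetical: fold conv1 and conv2 into a single nn.Sequential.
self.fc = nn.Sequential(
    ConvModule(
        in_channels=channels,
        out_channels=channels // ratio,
        kernel_size=1,
        act_cfg=dict(type='ReLU')),
    ConvModule(
        in_channels=channels // ratio,
        out_channels=channels,
        kernel_size=1,
        act_cfg=None))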
mmyolo/models/plugins/cbam.py
Outdated
def __init__(self,
             channels: int,
             ratio: int = 16,
             act_cfg: Union[int, Sequence[int]] = (dict(type='ReLU'),
Use dict(type='ReLU', inplace=True).
Union[int, Sequence[int]]?
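The reviewer is flagging that the annotation does not match the default value, which is a sequence of config dicts rather than ints. Presumably something like this was intended (my reading, not text from the PR):

# Corrected hint: the default value is a tuple of config dicts.
act_cfg: Union[dict, Sequence[dict]] = (dict(type='ReLU'), )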
mmyolo/models/plugins/cbam.py
Outdated
    conv_cfg=None,
    act_cfg=None)

self.activate1 = MODELS.build(act_cfg[0])
The activation can be merged into the ConvModule.
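That is, instead of building the activation separately with MODELS.build(act_cfg[0]), pass it to ConvModule through act_cfg, roughly like this (argument values are illustrative, not the PR's exact code):

# Hypothetical: let ConvModule own its activation.
self.conv1 = ConvModule(
    in_channels=channels,
    out_channels=channels // ratio,
    kernel_size=1,
    conv_cfg=None,
    act_cfg=dict(type='ReLU', inplace=True))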
CBAM(
    in_channels=16,
    act_cfg=dict(
        ChannelAttention=(dict(type='ReLU'), ),
Suggested change:
-        ChannelAttention=(dict(type='ReLU'), ),
+        ChannelAttention=(dict(type='ReLU')),
    stages=(False, False, True, True)),
], ))
Suggested change:
-    stages=(False, False, True, True)),
-], ))
+    stages=(False, False, True, True))
+]))
mmyolo/models/plugins/cbam.py
Outdated
def __init__(self,
             channels: int,
             ratio: int = 16,
             act_cfg: Union[dict,
Nesting two activation configs inside act_cfg is not an established pattern; not recommended.
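In other words, rather than one act_cfg carrying a pair of activation configs, the conventional layout gives each ConvModule a single act_cfg. A sketch of the recommended shape (the Sigmoid on the second conv is an assumption based on the usual CBAM design, not a quote from this PR):

# Discouraged (as read from the review):
#   act_cfg=(dict(type='ReLU'), dict(type='Sigmoid'))
# Conventional: one activation config per ConvModule.
self.conv1 = ConvModule(
    channels, channels // ratio, 1, act_cfg=dict(type='ReLU'))
self.conv2 = ConvModule(
    channels // ratio, channels, 1, act_cfg=dict(type='Sigmoid'))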
class ChannelAttention(BaseModule):
    """ChannelAttention
"""ChannelAttention | |
"""ChannelAttention | |
|
class SpatialAttention(BaseModule):
    """SpatialAttention
"""SpatialAttention | |
"""SpatialAttention | |
* Add Attention Modules
* Added tutorials on the use of the attention module in How_to
* Update how_to.md: added tutorials on the use of the attention module
* Update attention_layers.py
* Rename attention_layers.py to cbam_layer.py
* Update __init__.py
* Update how_to.md
* Update how_to.md
* Update how_to.md
* Update cbam_layer.py
* Update cbam_layer.py
* Update cbam_layer.py
* Update how_to.md
* update
* add docstring typehint
* add unit test
* refine unit test
* update how_to
* add plugins directory
* refine plugin.md
* refine cbam.py and plugins.md
* refine cbam.py and plugins.md
* fix error in test_cbam.py
* refine cbam.py and fix error in test_cbam.py
* refine cbam.py and plugins.md
* refine cbam.py and docs
Thanks for your contribution; we appreciate it a lot. The following instructions will make your pull request healthier and help it get feedback more easily. If you do not understand some items, don't worry: just create the pull request and ask the maintainers for help.
Motivation
Add the CBAM attention module.
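For reference, CBAM (Convolutional Block Attention Module) reweights a feature map with channel attention followed by spatial attention. A minimal sketch of that wiring, independent of this PR's exact implementation:

import torch
import torch.nn as nn


class CBAMSketch(nn.Module):
    """Illustrative only: channel attention, then spatial attention."""

    def __init__(self, channel_attention: nn.Module,
                 spatial_attention: nn.Module):
        super().__init__()
        self.channel_attention = channel_attention
        self.spatial_attention = spatial_attention

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.channel_attention(x) * x  # scale each channel
        x = self.spatial_attention(x) * x  # scale each spatial position
        return x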
Modification