
[Feature] Add attention module of CBAM #246

Merged
merged 26 commits into open-mmlab:dev on Nov 10, 2022

Conversation

@kitecats (Contributor) commented Nov 5, 2022

Thanks for your contribution; we appreciate it a lot. The following instructions will make your pull request healthier and help it get feedback more easily. If you do not understand some items, don't worry: just open the pull request and seek help from the maintainers.

Motivation

Add attention module of CBAM

Modification

  1. Add attention module of CBAM in mmyolo/models/layers/cbam_layer.py.
  2. Add tutorials on the use of the attention module in how_to.

@PeterH0323 (Collaborator)

Please fix the lint.

Resolved review threads on: mmyolo/models/layers/cbam_layer.py, docs/zh_cn/advanced_guides/how_to.md, mmyolo/models/plugins/cbam.py
super(ChannelAttention, self).__init__()
self.avg_pool = nn.AdaptiveAvgPool2d(1)
self.max_pool = nn.AdaptiveMaxPool2d(1)
self.conv1 = ConvModule(
Collaborator comment on the snippet above:

It would be better to write conv1 and conv2 as a single nn.Sequential.
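A minimal sketch of what the reviewer suggests, assuming the usual CBAM channel-attention layout (plain PyTorch modules here for brevity; the merged code uses mmcv's ConvModule, and the class/attribute names below are illustrative, not the PR's final ones):

```python
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """CBAM channel attention with the two 1x1 convs folded into one
    nn.Sequential, as the reviewer suggests (sketch, not the merged code)."""

    def __init__(self, channels: int, reduce_ratio: int = 16):
        super().__init__()
        self.avg_pool = nn.AdaptiveAvgPool2d(1)
        self.max_pool = nn.AdaptiveMaxPool2d(1)
        # conv1 -> activation -> conv2 as a single shared MLP
        self.fc = nn.Sequential(
            nn.Conv2d(channels, channels // reduce_ratio, 1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduce_ratio, channels, 1, bias=False))
        self.sigmoid = nn.Sigmoid()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # the same MLP is applied to both pooled descriptors, then summed
        out = self.fc(self.avg_pool(x)) + self.fc(self.max_pool(x))
        return x * self.sigmoid(out)
```

Grouping the convs this way also removes the need for a separately built activation module between them.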

Resolved review threads on: mmyolo/models/plugins/cbam.py, docs/zh_cn/advanced_guides/plugins.md
def __init__(self,
channels: int,
ratio: int = 16,
act_cfg: Union[int, Sequence[int]] = (dict(type='ReLU'),
Collaborator comment on the snippet above:

Use dict(type='ReLU', inplace=True).

Collaborator comment:

Union[int, Sequence[int]]? (This type hint does not look right for an activation config.)

conv_cfg=None,
act_cfg=None)

self.activate1 = MODELS.build(act_cfg[0])
Collaborator comment on the snippet above:

The activation can be merged into the ConvModule.

CBAM(
in_channels=16,
act_cfg=dict(
ChannelAttention=(dict(type='ReLU'), ),
Collaborator suggested change:
ChannelAttention=(dict(type='ReLU'), ),
ChannelAttention=(dict(type='ReLU')),

Comment on lines 19 to 20
stages=(False, False, True, True)),
], ))
Collaborator suggested change:
stages=(False, False, True, True)),
], ))
stages=(False, False, True, True))
]))
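The stages tuple in the snippet above is a per-stage switch for where the plugin is inserted. A hedged sketch of enabling this PR's CBAM plugin in a backbone config, following the mmdet-style plugins convention (the exact keys in the merged docs may differ):

```python
# Hypothetical backbone config fragment: one boolean per backbone stage,
# inserting CBAM only after the last two stages.
model = dict(
    backbone=dict(
        plugins=[
            dict(
                cfg=dict(type='CBAM'),
                stages=(False, False, True, True))
        ]))
```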

def __init__(self,
channels: int,
ratio: int = 16,
act_cfg: Union[dict,
Collaborator comment on the snippet above:

Nesting two activation configs inside act_cfg is not an established pattern; it is not recommended.
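For contrast, the conventional mmcv shape of act_cfg versus the nested form the reviewer is objecting to (the Sigmoid entry below is illustrative of a nested pair, not necessarily the PR's exact default):

```python
# Not recommended: a tuple of two activation configs in one act_cfg.
act_cfg = (dict(type='ReLU'), dict(type='Sigmoid'))

# Conventional mmcv style: a single activation config per module.
act_cfg = dict(type='ReLU', inplace=True)
```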



class ChannelAttention(BaseModule):
"""ChannelAttention
Collaborator suggested a change to this docstring line (the two versions differ only in whitespace):

"""ChannelAttention



class SpatialAttention(BaseModule):
"""SpatialAttention
Collaborator suggested a change to this docstring line (the two versions differ only in whitespace):

"""SpatialAttention
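For completeness, a sketch of the spatial half of CBAM as described in the original paper: channel-wise mean and max maps are concatenated and passed through a 7x7 conv (plain PyTorch here; not the merged mmyolo implementation):

```python
import torch
import torch.nn as nn


class SpatialAttention(nn.Module):
    """CBAM spatial attention sketch: pool over the channel axis, then a
    large-kernel conv produces a per-pixel gate."""

    def __init__(self, kernel_size: int = 7):
        super().__init__()
        # 2 input channels: the stacked mean map and max map
        self.conv = nn.Conv2d(
            2, 1, kernel_size, padding=kernel_size // 2, bias=False)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        avg_map = torch.mean(x, dim=1, keepdim=True)
        max_map, _ = torch.max(x, dim=1, keepdim=True)
        attn = self.sigmoid(self.conv(torch.cat([avg_map, max_map], dim=1)))
        return x * attn
```

In CBAM the full block applies channel attention first, then this spatial attention, on the same feature map.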

@hhaAndroid hhaAndroid merged commit 825d05d into open-mmlab:dev Nov 10, 2022
hhaAndroid pushed a commit that referenced this pull request Nov 10, 2022
* Add Attention Modules

* Added tutorials on the use of the attention module in how_to

* Update how_to.md

Added tutorials on the use of the attention module

* Update attention_layers.py

* Rename attention_layers.py to cbam_layer.py

* Update __init__.py

* Update how_to.md

* Update how_to.md

* Update how_to.md

* Update cbam_layer.py

* Update cbam_layer.py

* Update cbam_layer.py

* Update how_to.md

* update

* add docstring typehint

* add unit test

* refine unit test

* update how_to

* add plugins directory

* refine plugin.md

* refine cbam.py and plugins.md

* refine cbam.py and plugins.md

* fix error in test_cbam.py

* refine cbam.py and fix error in test_cbam.py

* refine cbam.py and plugins.md

* refine cbam.py and docs
@RangeKing mentioned this pull request on Nov 20, 2022 (32 tasks).

5 participants