
Commit d48ca25 (1 parent 0d81706)

whenxuan committed: update the readme

File tree

1 file changed: +5 −6 lines changed


README.md

Lines changed: 5 additions & 6 deletions
```diff
@@ -47,30 +47,29 @@ When the number of input channels is small, the channel attention mechanism is v
 
 ## Modules <a id="Modules"></a>
 
-#### 1. [`SEAttention`](https://github.com/wwhenxuan/Channel-Attention/blob/main/channel_attention/squeeze_excitation.py): [[paper]]() The Squeeze-and-Excitation Attention with Global Average Pooling and Feed Forward Network.
+#### 1. [`SEAttention`](https://github.com/wwhenxuan/Channel-Attention/blob/main/channel_attention/squeeze_excitation.py): [[paper]](https://arxiv.org/abs/1709.01507) The Squeeze-and-Excitation Attention with Global Average Pooling and Feed Forward Network.
 
 <div align="center">
 <img width="80%" src="images/SEAttention.png">
 </div>
 
-#### 2. [`ChannelAttention`](https://github.com/wwhenxuan/Channel-Attention/blob/main/channel_attention/channel_attention.py): [[paper]]() The Channel Attention with Global Average Pooling and Global Max Pooling.
+#### 2. [`ChannelAttention`](https://github.com/wwhenxuan/Channel-Attention/blob/main/channel_attention/channel_attention.py): [[paper]](https://arxiv.org/abs/1807.06521) The Channel Attention with Global Average Pooling and Global Max Pooling.
 
 <div align="center">
 <img width="80%" src="images/ChannelAttention.png">
 </div>
 
-#### 3. [`SpatialAttention`](https://github.com/wwhenxuan/Channel-Attention/blob/main/channel_attention/spatial_attention.py): [[paper]]() The Spatial Attention with Global Average Pooling and Global Max Pooling.
+#### 3. [`SpatialAttention`](https://github.com/wwhenxuan/Channel-Attention/blob/main/channel_attention/spatial_attention.py): [[paper]](https://arxiv.org/abs/1807.06521) The Spatial Attention with Global Average Pooling and Global Max Pooling.
 
 <div align="center">
 <img width="80%" src="images/SpatialAttention.png">
 </div>
 
-#### 4. [`ConvBlockAttention`](https://github.com/wwhenxuan/Channel-Attention/blob/main/channel_attention/spatial_attention.py): [[paper]]() The Convolutional Block Attention Module (CBAM) combining Channel Attention and Spatial Attention.
+#### 4. [`ConvBlockAttention`](https://github.com/wwhenxuan/Channel-Attention/blob/main/channel_attention/spatial_attention.py): [[paper]](https://arxiv.org/abs/1807.06521) The Convolutional Block Attention Module (CBAM) combining Channel Attention and Spatial Attention.
 
 <div align="center">
 <img width="80%" src="images/ConvBlockAttention.png">
 </div>
 
+## Experiments <a id="Experiments"></a>
 
-
-## Experiments <a id="Experiments"></a>
```
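The `SEAttention` entry describes the squeeze step (global average pooling), the excitation step (a bottleneck feed-forward network), and the channel re-scaling. Below is a minimal numpy sketch of that pipeline, not the repository's actual implementation; the weight shapes, random initialization, and `reduction=4` default are illustrative assumptions:

```python
import numpy as np

def se_attention(x, reduction=4, w1=None, w2=None):
    """Squeeze-and-Excitation channel attention on an NCHW tensor (numpy sketch)."""
    n, c, h, w = x.shape
    # Squeeze: global average pooling over the spatial dims -> (N, C)
    z = x.mean(axis=(2, 3))
    # Excitation: bottleneck FFN, ReLU then sigmoid gate in (0, 1)
    hidden = c // reduction
    if w1 is None or w2 is None:
        rng = np.random.default_rng(0)
        w1 = rng.standard_normal((c, hidden)) * 0.1
        w2 = rng.standard_normal((hidden, c)) * 0.1
    s = np.maximum(z @ w1, 0.0)           # ReLU
    s = 1.0 / (1.0 + np.exp(-(s @ w2)))   # sigmoid
    # Scale: reweight every channel by its learned gate
    return x * s[:, :, None, None]
```

Because the gate is a sigmoid, each channel is scaled by a factor strictly between 0 and 1.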

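The `ConvBlockAttention` (CBAM) entry combines channel attention (average- and max-pooled descriptors through a shared MLP) with spatial attention. The sketch below is a hedged numpy illustration, not the repository's code: the paper's 7×7 convolution for the spatial map is simplified here to a per-position linear mix of the two pooled maps, and all weights are caller-supplied placeholders:

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def cbam(x, w1, w2, w_sp):
    """CBAM sketch on an NCHW tensor: channel attention, then spatial attention."""
    # Channel attention: shared MLP applied to avg- and max-pooled descriptors
    avg = x.mean(axis=(2, 3))                      # (N, C)
    mx = x.max(axis=(2, 3))                        # (N, C)
    mlp = lambda z: np.maximum(z @ w1, 0.0) @ w2   # shared bottleneck MLP
    ca = sigmoid(mlp(avg) + mlp(mx))[:, :, None, None]
    x = x * ca
    # Spatial attention: pool along the channel axis, mix, gate every position
    s_avg = x.mean(axis=1, keepdims=True)          # (N, 1, H, W)
    s_max = x.max(axis=1, keepdims=True)           # (N, 1, H, W)
    sa = sigmoid(w_sp[0] * s_avg + w_sp[1] * s_max)
    return x * sa
```

Applying the channel gate before the spatial gate follows the sequential ordering CBAM reports as most effective.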