
Commit c29a8d9 (parent 970cdbd)

whenxuan: update the readme

3 files changed: 8 additions & 2 deletions

README.md
```diff
@@ -4,7 +4,7 @@ A plug-and-play channel attention mechanism module implemented in PyTorch.
 
 <div align="center">
 
-[Installation](#Installation) | [Usage](#Usage) | [Modules](#Modules) | [Blog](https://mp.weixin.qq.com/s/D6O5SBl2RYHdkiinV6UM8w)
+[Installation](#Installation) | [Usage](#Usage) | [Modules](#Modules) | [Blog](https://mp.weixin.qq.com/s/D6O5SBl2RYHdkiinV6UM8w) | [Experiments](#Experiments)
 
 </div>
 
 <div align="center">
@@ -42,4 +42,10 @@ We only develop and test with PyTorch. Please make sure to install it from [PyTo
 <img width="80%" src="images/ChannelAttention.png">
 </div>
 
-#### 3. [`ECAAttention`]()
+#### 3. [`SpatialAttention`](https://github.com/wwhenxuan/Channel-Attention/blob/main/channel_attention/spatial_attention.py): [[paper]]() The Spatial Attention with Global Average Pooling and Global Max Pooling.
+<div style="text-align: center;">
+<img width="80%" src="images/SpatialAttention.png">
+</div>
+
+
+## Experiments <a id="Experiments"></a>
```
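The new README entry describes a `SpatialAttention` module built from global average pooling and global max pooling. A minimal sketch of how such a module is commonly implemented (pooling across the channel dimension, CBAM-style) is shown below; this is an illustration under those assumptions, and the repository's actual `spatial_attention.py` may differ.

```python
import torch
import torch.nn as nn


class SpatialAttention(nn.Module):
    """Hypothetical spatial attention sketch: average- and max-pool across
    channels, fuse the two maps with a conv, and gate with a sigmoid."""

    def __init__(self, kernel_size: int = 7):
        super().__init__()
        # 2 input channels: the stacked channel-wise average and max maps
        self.conv = nn.Conv2d(2, 1, kernel_size,
                              padding=kernel_size // 2, bias=False)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        avg_map = torch.mean(x, dim=1, keepdim=True)    # (N, 1, H, W)
        max_map, _ = torch.max(x, dim=1, keepdim=True)  # (N, 1, H, W)
        attn = self.sigmoid(self.conv(torch.cat([avg_map, max_map], dim=1)))
        return x * attn  # reweight every spatial location


x = torch.randn(2, 16, 32, 32)
out = SpatialAttention()(x)
print(out.shape)  # torch.Size([2, 16, 32, 32])
```

Because the attention map has a single channel, the module is plug-and-play: its output keeps the input shape, so it can be dropped between any two convolutional layers.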

images/ChannelAttention.png (-112 KB)

images/SpatialAttention.png (72.1 KB)
