Hi, nice work, and thanks for open-sourcing the code.
In Section 3.3 of the paper, two attention mechanisms are applied in the MCDB module: a self-attention and a channel-wise attention. However, I can't find either of them in the code. Could you please point me to where these attentions are implemented? (I've sketched below what I understood them to be, in case I'm looking for the wrong thing.) Looking forward to your reply, thanks.
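For reference, here is a rough PyTorch sketch of what I expected to find. The class names are hypothetical, and the SE-style channel attention and non-local-style self-attention formulations are my own assumptions about what Section 3.3 describes, not taken from your paper or repository:

```python
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel-wise attention (my assumption)."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)          # squeeze: global spatial average
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),                            # per-channel gate in [0, 1]
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w                                 # reweight each channel


class SelfAttention(nn.Module):
    """Non-local style spatial self-attention (my assumption)."""
    def __init__(self, channels):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // 8, 1)
        self.key = nn.Conv2d(channels, channels // 8, 1)
        self.value = nn.Conv2d(channels, channels, 1)
        self.gamma = nn.Parameter(torch.zeros(1))    # learned residual scale

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.query(x).view(b, -1, h * w).permute(0, 2, 1)  # (b, hw, c')
        k = self.key(x).view(b, -1, h * w)                     # (b, c', hw)
        attn = torch.softmax(q @ k, dim=-1)                    # (b, hw, hw)
        v = self.value(x).view(b, c, h * w)                    # (b, c, hw)
        out = (v @ attn.transpose(1, 2)).view(b, c, h, w)      # attend over positions
        return self.gamma * out + x                            # residual connection
```

If the code uses different formulations (or the attentions are fused into another layer), a pointer to the relevant file and lines would be very helpful.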