Attention Mechanism in Convolutional Neural Networks

The attention mechanism in a convolutional neural network computes statistics along some dimension of a feature map and, based on those statistics, assigns a different weight to each element along that dimension, enhancing the expressive power of the network's features.

import torch.nn as nn

class Attention(nn.Module):
    """Skeleton of an attention module: compute statistics along one
    feature dimension and use them to reweight the input."""
    def __init__(self):
        super(Attention, self).__init__()
        self.layer = nn.Sequential()  # subclasses fill in the layers that compute the weights

    def forward(self, x):
        b, c, h, w = x.size()
        weight = self.layer(x)          # compute weights along some feature dimension
        return x * weight.expand_as(x)  # reweight the features

The feature dimensions of a convolutional layer include the channel dimension $C$ and the spatial dimensions $H, W$, so attention can be applied along different dimensions (a concrete channel-attention sketch follows the list below):

  • Channel Attention: SENet, CMPT-SE, GENet, GSoP, SRM, SKNet, DIA, ECA-Net, S
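
As a concrete instance of channel attention, here is a minimal sketch in the spirit of SENet's Squeeze-and-Excitation block: spatial dimensions are squeezed into per-channel statistics, and a small bottleneck MLP produces one weight per channel. The class name SEAttention and the reduction ratio reduction=16 are illustrative assumptions, not code from the papers above.

import torch
import torch.nn as nn

class SEAttention(nn.Module):
    """SE-style channel attention: squeeze (b, c, h, w) to per-channel
    statistics, then excite with a bottleneck MLP to get channel weights."""
    def __init__(self, channels, reduction=16):
        super(SEAttention, self).__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)  # squeeze: (b, c, h, w) -> (b, c, 1, 1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),                    # per-channel weights in (0, 1)
        )

    def forward(self, x):
        b, c, _, _ = x.size()
        weight = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * weight.expand_as(x)       # reweight each channel

# usage: attend over the channels of a feature map
x = torch.randn(2, 64, 32, 32)
print(SEAttention(64)(x).shape)  # torch.Size([2, 64, 32, 32])

The output keeps the input shape; only the relative scale of each channel changes, which is what lets such a block be dropped into an existing backbone between convolutional stages.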
