Paper: Attention U-Net: Learning Where to Look for the Pancreas

 

This paper improves U-Net with an attention module for medical image segmentation.

 

This attention module feels a lot like the channel-weighting module in SENet, except that here the weighting is applied to every pixel of the feature map rather than to every channel, in order to highlight salient features.
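The difference can be seen purely in the shape of the weight tensor. A minimal NumPy sketch (shapes are hypothetical, chosen only for illustration):

```python
import numpy as np

# Hypothetical feature map: C channels on an H x W spatial grid
C, H, W = 4, 8, 8
x = np.random.rand(C, H, W)

# SENet-style: one weight per channel, broadcast over all pixels
channel_w = np.random.rand(C, 1, 1)   # shape (C, 1, 1)
se_out = x * channel_w                # each channel is scaled uniformly

# Attention-U-Net-style: one weight per pixel, shared across channels
pixel_w = np.random.rand(1, H, W)     # shape (1, H, W)
ag_out = x * pixel_w                  # each spatial location is scaled

assert se_out.shape == ag_out.shape == (C, H, W)
```

Both cases are plain broadcast multiplications; what changes is which axis the weights vary along.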

 

The modified U-Net is shown below:

(Figure: the Attention U-Net architecture)

The attention module is placed on the skip connections. In the original U-Net, the features from the same-level encoder (downsampling) layer are simply concatenated onto the decoder (upsampling) path. In the modified network, the attention module first processes the same-level encoder features together with the decoder features from one level below (the gating signal), and only the resulting gated features are concatenated with the upsampled features.
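The modified skip connection can be sketched as follows; here the gate itself is treated as a black box producing per-pixel coefficients (its internals are detailed below), and all shapes are hypothetical with spatial sizes assumed to already match:

```python
import numpy as np

# Hypothetical shapes: C channels on an H x W grid
C, H, W = 4, 8, 8
rng = np.random.default_rng(1)

enc = rng.standard_normal((C, H, W))      # same-level encoder features
dec_up = rng.standard_normal((C, H, W))   # upsampled decoder features

# Placeholder attention gate: per-pixel coefficients in (0, 1)
alpha = 1.0 / (1.0 + np.exp(-rng.standard_normal((1, H, W))))

gated = alpha * enc                           # attention-weighted skip features
merged = np.concatenate([gated, dec_up], 0)   # channel-wise concat, as in U-Net
assert merged.shape == (2 * C, H, W)
```

The only change relative to plain U-Net is the `alpha * enc` step; the concatenation itself is unchanged.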

 

The attention module

(Figure: schematic of the attention gate)

F denotes the number of channels at each layer. Intuitively, the same-level encoder feature map and the decoder feature map from one level below are combined to build a per-pixel weight map, which is then applied to the same-level encoder feature map, yielding attention-weighted features.

Expressed as formulas:

$q_{att}^l = \psi^T \left( \sigma_1\!\left( W_x^T x_i^l + W_g^T g_i + b_g \right) \right) + b_\psi$

$\alpha_i^l = \sigma_2\!\left( q_{att}^l(x_i^l, g_i; \Theta_{att}) \right)$

 

The same-level encoder feature map $x_i^l$ is passed through a 1*1*1 convolution, giving $W_x^T x_i^l$.

The decoder feature map from one level below (the gating signal) $g_i$ is passed through a 1*1*1 convolution, giving $W_g^T g_i$.

The two results are summed and passed through a ReLU, giving $\sigma_1\!\left( W_x^T x_i^l + W_g^T g_i + b_g \right)$, where $\sigma_1$ is the ReLU activation function.

Another 1*1*1 convolution ($\psi$) then produces $q_{att}^l$.

Finally, a sigmoid activation $\sigma_2$ is applied to $q_{att}^l$ to obtain the final attention coefficients $\alpha_i^l$.
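The full gate can be sketched in NumPy. This is a minimal sketch, not the paper's implementation: the grid is 2D rather than 3D for brevity, the 1*1*1 convolutions become per-pixel linear maps over channels, the encoder and gating features are assumed to already share the same spatial size, and all shapes and weights are hypothetical:

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical shapes: F_l encoder channels, F_g gating channels,
# F_int intermediate channels, on an H x W grid
F_l, F_g, F_int, H, W = 4, 6, 3, 8, 8
rng = np.random.default_rng(0)

x = rng.standard_normal((F_l, H, W))  # x^l: same-level encoder features
g = rng.standard_normal((F_g, H, W))  # g: gating signal from the decoder

# 1x1 convolutions act as per-pixel linear maps over the channel axis
W_x = rng.standard_normal((F_int, F_l))
W_g = rng.standard_normal((F_int, F_g))
b_g = rng.standard_normal((F_int, 1, 1))
psi = rng.standard_normal((1, F_int))
b_psi = rng.standard_normal()

# sigma_1(W_x^T x + W_g^T g + b_g): add the two projections, then ReLU
hidden = relu(np.einsum('if,fhw->ihw', W_x, x)
              + np.einsum('ig,ghw->ihw', W_g, g) + b_g)

# q_att = psi^T hidden + b_psi, then alpha = sigma_2(q_att)
q_att = np.einsum('oi,ihw->ohw', psi, hidden) + b_psi
alpha = sigmoid(q_att)                # shape (1, H, W), values in (0, 1)

x_hat = alpha * x                     # attention-weighted encoder features
assert x_hat.shape == x.shape
```

Note that `alpha` has a single channel broadcast across all `F_l` channels of `x`, matching the per-pixel (rather than per-channel) weighting described above.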

 

The gradient propagation through the attention gate is as follows:

$\dfrac{\partial \hat{x}_i^l}{\partial \Phi^{l-1}} = \alpha_i^l \dfrac{\partial \left( f(x_i^{l-1}; \Phi^{l-1}) \right)}{\partial \Phi^{l-1}} + \dfrac{\partial \alpha_i^l}{\partial \Phi^{l-1}} x_i^l$

 
