Visualizing Convolution Operations

Image from here

Basic Convolution Modules

  • Standard Convolution
Image from here

The computational cost of a standard convolution is HWNK²M, which can be split into three parts:

(1) the spatial size of the output feature map, H×W;

(2) the size of the convolution kernel, K²;

(3) the numbers of input and output channels, N×M.
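To make the formula concrete, here is a minimal PyTorch sketch (the tensor sizes are arbitrary assumptions, not taken from the referenced articles): it runs a stride-1, same-padding standard convolution and evaluates HWNK²M as the multiply-accumulate count.

```python
import torch
import torch.nn as nn

# Arbitrary example sizes: N input channels, M output channels, KxK kernel.
H, W, N, M, K = 56, 56, 64, 128, 3

conv = nn.Conv2d(N, M, kernel_size=K, stride=1, padding=K // 2, bias=False)
x = torch.randn(1, N, H, W)
y = conv(x)                       # output spatial size stays H x W

# Multiply-accumulates per the formula: H * W * N * K^2 * M
cost = H * W * N * K * K * M
print(y.shape)                    # torch.Size([1, 128, 56, 56])
print(f"{cost:,} MACs")           # 231,211,008 MACs
```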

  • Group Convolution
Image from here
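For reference, a minimal sketch of a group convolution in PyTorch (the channel counts and group count are illustrative assumptions): with g groups, each filter sees only N/g input channels, so the cost drops to HWNK²M/g.

```python
import torch
import torch.nn as nn

N, M, g = 64, 128, 4                       # example channel counts and group count
gconv = nn.Conv2d(N, M, kernel_size=3, padding=1, groups=g, bias=False)

# Each output filter only convolves over N/g = 16 input channels.
print(gconv.weight.shape)                      # torch.Size([128, 16, 3, 3])
print(gconv(torch.randn(1, N, 56, 56)).shape)  # torch.Size([1, 128, 56, 56])
```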
  • Depthwise Convolution

Image from here
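Depthwise convolution is the extreme case groups = in_channels: each channel is filtered by its own K×K kernel and there is no cross-channel mixing. A minimal sketch with assumed sizes:

```python
import torch
import torch.nn as nn

N = 64                                          # example channel count
dwconv = nn.Conv2d(N, N, kernel_size=3, padding=1, groups=N, bias=False)

print(dwconv.weight.shape)                      # torch.Size([64, 1, 3, 3]): one KxK filter per channel
print(dwconv(torch.randn(1, N, 56, 56)).shape)  # torch.Size([1, 64, 56, 56])
```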
  • Pointwise Convolution
Image from here
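Pointwise convolution is just a 1×1 convolution: it mixes channels at every spatial position but looks at no spatial neighbourhood. A minimal sketch with assumed sizes:

```python
import torch
import torch.nn as nn

N, M = 64, 128                                  # example input/output channels
pwconv = nn.Conv2d(N, M, kernel_size=1, bias=False)

print(pwconv.weight.shape)                      # torch.Size([128, 64, 1, 1])
print(pwconv(torch.randn(1, N, 56, 56)).shape)  # torch.Size([1, 128, 56, 56])
```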
  • Depthwise Separable Convolution
Image from here
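Depthwise separable convolution chains the two pieces above, a depthwise K×K convolution followed by a pointwise 1×1 convolution, reducing the cost from HWNK²M to HWN(K² + M). A rough MobileNet-style sketch (the sizes are assumptions):

```python
import torch
import torch.nn as nn

class DepthwiseSeparableConv(nn.Module):
    """Depthwise KxK conv followed by pointwise 1x1 conv (MobileNet-style sketch)."""
    def __init__(self, in_ch, out_ch, k=3):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, k, padding=k // 2, groups=in_ch, bias=False)
        self.pointwise = nn.Conv2d(in_ch, out_ch, 1, bias=False)

    def forward(self, x):
        return self.pointwise(self.depthwise(x))

N, M, K, H, W = 64, 128, 3, 56, 56
block = DepthwiseSeparableConv(N, M, K)
print(block(torch.randn(1, N, H, W)).shape)   # torch.Size([1, 128, 56, 56])

# Cost ratio vs. standard convolution: (K^2 + M) / (K^2 * M) = 1/M + 1/K^2
standard = H * W * N * K * K * M
separable = H * W * N * (K * K + M)
print(separable / standard)                   # ~0.119 here
```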
  • Channel Shuffle
Image from here: (a) two stacked group convolutions (GConv1 & GConv2), (b) the channels are shuffled before the second group convolution, (c) an equivalent implementation of (b)
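Channel shuffle is usually implemented as a reshape, transpose, reshape sequence, which interleaves channels so that the next group convolution receives channels from every group. A minimal sketch (the standard trick, not ShuffleNet's exact code):

```python
import torch

def channel_shuffle(x: torch.Tensor, groups: int) -> torch.Tensor:
    """Permute channels so each group in the next GConv receives channels from all groups."""
    n, c, h, w = x.shape
    x = x.view(n, groups, c // groups, h, w)   # (n, g, c/g, h, w)
    x = x.transpose(1, 2).contiguous()         # (n, c/g, g, h, w)
    return x.view(n, c, h, w)

x = torch.arange(8).float().view(1, 8, 1, 1)   # channels 0..7, i.e. two groups of four
print(channel_shuffle(x, groups=2).flatten())  # tensor([0., 4., 1., 5., 2., 6., 3., 7.])
```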

Convolution Operation Variants

  • ResNet
Image from here

The article Residual blocks — Building blocks of ResNet gives a fairly detailed introduction to skip connections and to how residual networks got their name.

Let us consider a neural network block whose input is x, and suppose we would like it to learn the true mapping H(x). Let us denote the difference (or residual) between the output and the input as

R(x) = Output - Input = H(x) - x

Rearranging it, we get,

H(x) = R(x) + x

Our residual block is, overall, trying to learn the true output H(x). If you look closely at the figure above, you will notice that, because the identity connection carries x straight through, the stacked layers only have to learn the residual R(x). To summarize: the layers in a traditional network learn the true output H(x), whereas the layers in a residual network learn the residual R(x). Hence the name: residual block.
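In code, the identity connection is just an addition in the forward pass, so the stacked layers only ever have to model R(x). A simplified two-convolution sketch (not the exact ResNet implementation; the channel count is an assumption):

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Simplified residual block: output = R(x) + x, so the conv layers learn the residual R(x)."""
    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(                      # this stack models R(x)
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
        )
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.relu(self.body(x) + x)              # H(x) = R(x) + x

x = torch.randn(1, 64, 56, 56)
print(ResidualBlock(64)(x).shape)                       # torch.Size([1, 64, 56, 56])
```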
  • ResNeXt

Image from here

The figure below shows this structure side by side with the Inception and group-convolution views.

Image from here: (a) ResNeXt, (b) Inception, (c) group convolution
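View (c) is the easiest form to implement: a ResNeXt block is a bottleneck whose 3×3 convolution is a group convolution with cardinality groups. A rough sketch (the channel widths are illustrative assumptions, not the paper's exact configuration):

```python
import torch
import torch.nn as nn

class ResNeXtBlock(nn.Module):
    """Bottleneck with a grouped 3x3 conv (the 'cardinality' dimension), plus a skip connection."""
    def __init__(self, channels, cardinality=32, bottleneck_width=4):
        super().__init__()
        mid = cardinality * bottleneck_width
        self.body = nn.Sequential(
            nn.Conv2d(channels, mid, 1, bias=False), nn.BatchNorm2d(mid), nn.ReLU(inplace=True),
            nn.Conv2d(mid, mid, 3, padding=1, groups=cardinality, bias=False),
            nn.BatchNorm2d(mid), nn.ReLU(inplace=True),
            nn.Conv2d(mid, channels, 1, bias=False), nn.BatchNorm2d(channels),
        )
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.relu(self.body(x) + x)

print(ResNeXtBlock(256)(torch.randn(1, 256, 56, 56)).shape)  # torch.Size([1, 256, 56, 56])
```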
  • Squeeze-and-Excitation (SE) Block

Image from here
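A minimal sketch of an SE block (the reduction ratio of 16 and the two-layer excitation follow the common recipe, but treat the details as assumptions): global average pooling squeezes each channel to a scalar, a small bottleneck of fully connected layers produces per-channel weights, and the original feature map is rescaled channel by channel.

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Squeeze-and-Excitation: reweight channels using globally pooled statistics."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        n, c, _, _ = x.shape
        s = x.mean(dim=(2, 3))              # squeeze: (n, c) channel descriptors
        w = self.fc(s).view(n, c, 1, 1)     # excitation: per-channel weights in (0, 1)
        return x * w                        # scale the original feature map

print(SEBlock(64)(torch.randn(1, 64, 56, 56)).shape)  # torch.Size([1, 64, 56, 56])
```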

References:

  1. Why MobileNet and Its Variants (e.g. ShuffleNet) Are Fast

  2. Review: IGCNet / IGCV1 — Interleaved Group Convolutions (Image Classification)

  3. A Basic Introduction to Separable Convolutions

  4. 深度學習-MobileNet (Depthwise separable convolution)

  5. Review: ShuffleNet V1 — Light Weight Model (Image Classification)

  6. Residual blocks — Building blocks of ResNet

  7. An Overview of ResNet and its Variants

  8. Review: SENet — Squeeze-and-Excitation Network, Winner of ILSVRC 2017 (Image Classification)
