
Table of Contents

1. LeNet-5 (1998)

2. AlexNet (2012)

Feature 1: ReLU (Rectified Linear Unit) Nonlinearity

Feature 2: Data Augmentation (Reduce Overfitting)

Feature 3: Dropout (Reduce Overfitting)

3. VGGNet (2014)

Feature 1: The Use of 1×1 and 3×3 Filters (Reduce the Number of Parameters)

Feature 2: Data Augmentation (Multi-Scale)

4. InceptionNet (2014)

Feature 1: Inception Module

Feature 2: Global Average Pooling

Feature 3: Auxiliary Classifiers for Training

5. ResNet (2015)


1. LeNet-5 (1998)

Proposed by Yann LeCun.

Its architecture:

[Figure: LeNet-5 architecture]

The LeNet-5 network consists of five layers: two convolution + pooling stages, two fully connected layers, and one softmax classification layer.

2. AlexNet (2012)

Proposed by Geoffrey Hinton and Alex Krizhevsky.

Its architecture:

[Figure: AlexNet architecture]

AlexNet consists of eight layers: five convolutional layers followed by three fully connected layers. Max-pooling layers follow the first, second, and fifth convolutional layers.

Feature 1: ReLU (Rectified Linear Unit) Nonlinearity

A key innovation of AlexNet is its use of the ReLU activation function.

The ReLU function is:

ReLU(x) = max(0, x)

Using ReLU nonlinearity, CNNs could be trained much faster.

Reasons:

  1. No complicated math.
  2. It converges faster. The slope doesn’t plateau when x gets large.
  3. It’s sparsely activated. Since ReLU is zero for all negative inputs, some units may not be activated.
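The points above are easy to see in a few lines of Python (a toy sketch of the activation itself, not AlexNet's actual implementation):

```python
def relu(x):
    # Zero for negative inputs, identity for positive ones: cheap to
    # compute, and the slope is 1 wherever x > 0, so the gradient
    # never plateaus the way sigmoid/tanh gradients do.
    return max(0.0, x)

print([relu(v) for v in [-2.0, -0.5, 0.0, 1.5]])  # → [0.0, 0.0, 0.0, 1.5]
```

Note how the two negative inputs map to exactly zero: that is the sparse activation mentioned in point 3.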

Feature 2: Data Augmentation (Reduce Overfitting)

AlexNet applies data augmentation to the training images.

Mirroring images and taking random crops enlarges and varies the training set, which makes the trained model more robust.

Data Augmentation by Mirroring

[Figure: data augmentation by mirroring]

Data Augmentation by Random Crops

[Figure: data augmentation by random crops]
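Both augmentations can be sketched on a toy image stored as a list of pixel rows (a minimal illustration; real pipelines operate on image tensors):

```python
import random

def mirror(img):
    # Horizontal flip: reverse each row of pixels.
    return [row[::-1] for row in img]

def random_crop(img, size):
    # Cut a random size×size patch out of the image.
    top = random.randint(0, len(img) - size)
    left = random.randint(0, len(img[0]) - size)
    return [row[left:left + size] for row in img[top:top + size]]

img = [[r * 4 + c for c in range(4)] for r in range(4)]
print(mirror(img)[0])              # first row reversed: [3, 2, 1, 0]
patch = random_crop(img, 2)        # a random 2×2 patch
print(len(patch), len(patch[0]))   # → 2 2
```

Each original image can thus yield many distinct training examples, which is what reduces overfitting.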

Feature 3: Dropout (Reduce Overfitting)

[Figure: dropout]

A neuron is dropped from the network with a probability of 0.5.

Each neuron's connections are severed with probability 0.5, so the layer is no longer fully connected during training.

Dropout also improves the model's accuracy by reducing overfitting.
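A minimal sketch of the mechanism, using the common "inverted dropout" variant (survivors are rescaled at training time so nothing changes at test time; this scaling detail is a standard implementation choice, not something the text above specifies):

```python
import random

def dropout(activations, p=0.5, training=True):
    # During training each unit is zeroed with probability p; the
    # survivors are scaled by 1/(1-p) so the expected activation
    # matches test time, when nothing is dropped.
    if not training:
        return list(activations)
    return [0.0 if random.random() < p else a / (1 - p)
            for a in activations]

out = dropout([1.0, 1.0, 1.0, 1.0], p=0.5)
print(out)  # each entry is either 0.0 or 2.0
```

Because a different random subset of units is active on every forward pass, no single neuron can rely on specific co-activated partners, which is why dropout acts as a regularizer.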

3. VGGNet (2014)

Proposed by the University of Oxford and Google DeepMind.

Its architecture:

[Figure: VGGNet architecture]

特点1:The Use of 1×1 and 3×3 Filters(Reduce the number of parameters)

Stacking two 3×3 filters covers the same receptive field as a single 5×5 filter, but with fewer parameters to train.

[Figure: two stacked 3×3 filters covering a 5×5 receptive field]

By using two layers of 3×3 filters, the network already covers a 5×5 area, as the figure above shows.

filters                     number of parameters
1 layer of 5×5 filters      5×5 = 25
2 layers of 3×3 filters     3×3 + 3×3 = 18

As the table shows, a single 5×5 filter requires training 25 parameters, while two layers of 3×3 filters require only 18. Training fewer parameters speeds up the model's training and helps avoid overfitting.
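The counts in the table (weights per single input/output channel, biases ignored) can be checked directly:

```python
def conv_weights(k, c_in=1, c_out=1):
    # A k×k convolution has k*k weights per (input, output) channel pair.
    return k * k * c_in * c_out

one_5x5 = conv_weights(5)       # one 5×5 layer
two_3x3 = 2 * conv_weights(3)   # two stacked 3×3 layers
print(one_5x5, two_3x3)         # → 25 18
```

The same ratio (18/25) holds per channel pair when the layers have many channels, so the saving scales to real networks.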

Feature 2: Data Augmentation (Multi-Scale)

Data is augmented by randomly cropping images to 224×224 after first rescaling them to 256×256 or 384×384. This makes the model more accurate.
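The sampling procedure can be sketched as follows; the nearest-neighbour `resize` is a toy stand-in for a real interpolation routine, and the scales follow the description above:

```python
import random

def resize(img, size):
    # Toy nearest-neighbour resize of a square image.
    h = len(img)
    return [[img[r * h // size][c * h // size] for c in range(size)]
            for r in range(size)]

def multi_scale_sample(img, scales=(256, 384), crop=224):
    s = random.choice(scales)                # training-scale jitter
    big = resize(img, s)
    top = random.randint(0, s - crop)        # random crop position
    left = random.randint(0, s - crop)
    return [row[left:left + crop] for row in big[top:top + crop]]

sample = multi_scale_sample([[0] * 8 for _ in range(8)])
print(len(sample), len(sample[0]))  # → 224 224
```

Whatever scale is chosen, the network always sees a fixed 224×224 input, but the object inside the crop appears at varying sizes.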

[Figure: multi-scale data augmentation]

4. InceptionNet (2014)

Proposed by Google.

[Figure: InceptionNet architecture]

The model is 22 layers deep.

Feature 1: Inception Module

[Figure: Inception module]

  1. 1×1 filters are used before the 3×3 (or 5×5) filters to reduce the number of parameters.
  2. Each module applies convolutions of different sizes in parallel, so it can extract different kinds of features.
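Point 1 is easy to verify with a weight count. The channel sizes below are illustrative, not GoogLeNet's exact configuration:

```python
def conv_weights(k, c_in, c_out):
    # Weights of a k×k convolution, biases ignored.
    return k * k * c_in * c_out

c_in, c_out, bottleneck = 192, 32, 16    # illustrative channel counts

direct = conv_weights(5, c_in, c_out)    # 5×5 straight on all channels
reduced = (conv_weights(1, c_in, bottleneck)       # 1×1 reduction first
           + conv_weights(5, bottleneck, c_out))   # then the 5×5
print(direct, reduced)  # → 153600 15872
```

The 1×1 bottleneck cuts the weight count by roughly 10× in this example, which is what makes the parallel multi-size convolutions affordable.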

Feature 2: Global Average Pooling

[Figure: global average pooling]
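Global average pooling collapses each H×W feature map to its mean, producing one value per channel instead of flattening everything into a large fully connected layer. A minimal sketch:

```python
def global_avg_pool(feature_maps):
    # One scalar per channel: the mean of that channel's H×W activations.
    return [sum(v for row in ch for v in row) / (len(ch) * len(ch[0]))
            for ch in feature_maps]

fmaps = [[[1.0, 3.0], [5.0, 7.0]],   # channel 0 → mean 4.0
         [[0.0, 0.0], [0.0, 8.0]]]   # channel 1 → mean 2.0
print(global_avg_pool(fmaps))        # → [4.0, 2.0]
```

Because the pooling has no weights at all, it removes the parameter-heavy flattening step and works for any input spatial size.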

Feature 3: Auxiliary Classifiers for Training

[Figure: auxiliary classifiers]

Each auxiliary classifier's loss is added to the total loss during training, with a weight of 0.3.
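The training objective can be sketched as below; the function name is illustrative, but the 0.3 weight follows the description above:

```python
def training_loss(main_loss, aux_losses, aux_weight=0.3):
    # Auxiliary classifier losses are down-weighted and added to the
    # main loss; at inference the auxiliary heads are discarded.
    return main_loss + aux_weight * sum(aux_losses)

print(training_loss(1.0, [0.5, 0.5]))  # → 1.3
```

The auxiliary heads inject gradient signal into the middle of the 22-layer network, which helps it train despite its depth.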

5. ResNet (2015)

Proposed by Microsoft Research; it introduces residual connections into the network.

[Figure: ResNet architecture]

Its main building block:

[Figure: residual block]

R(x) = Output - Input = H(x) - x

H(x) = R(x) + x

F(x) in the figure, i.e. R(x) in the equations above, is the residual.
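The shortcut H(x) = R(x) + x is just an element-wise addition around the learned layers. A minimal sketch, with a stand-in function in place of the real convolution stack:

```python
def residual_block(x, R):
    # H(x) = R(x) + x: the stacked layers only need to learn the
    # residual R(x); the identity shortcut adds the input back.
    return [r + xi for r, xi in zip(R(x), x)]

square = lambda v: [xi * xi for xi in v]   # stand-in for the learned layers
print(residual_block([1.0, 2.0], square))  # → [2.0, 6.0]
```

If the optimal mapping is close to the identity, the layers only have to drive R(x) toward zero, which is easier than learning the full mapping H(x) from scratch; this is what lets ResNets grow very deep.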

[Figure: ResNet 2-layer and 3-layer blocks]
