Deep learning classic network structures and design principles



1. Classic network structures

ImageNet is an image-classification competition, and the networks below were designed mainly for this image-classification task.





1. AlexNet

AlexNet has 60 million parameters and 650,000 neurons: five convolutional layers, three fully connected layers, and a final 1000-way softmax output. It was trained across two GPUs, which greatly improved computational efficiency. In the ILSVRC-2012 competition it achieved a top-5 error rate of 15.3%, while the second-place method's error rate was 26.2%; the gap was so large that it makes clear the impact this network had on academia and industry at the time.
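As a rough illustration of that layout, here is a minimal single-model PyTorch sketch with five convolutional layers, three fully connected layers, and a 1000-way output; the channel sizes follow the commonly cited AlexNet configuration, and the original two-GPU split is not reproduced.

```python
import torch
import torch.nn as nn

# Minimal single-GPU sketch of the AlexNet layout described above:
# five convolutional layers, three fully connected layers, 1000-way output.
class AlexNetSketch(nn.Module):
    def __init__(self, num_classes: int = 1000):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 96, kernel_size=11, stride=4), nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2),
            nn.Conv2d(96, 256, kernel_size=5, padding=2), nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2),
            nn.Conv2d(256, 384, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(384, 384, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(384, 256, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2),
        )
        self.classifier = nn.Sequential(
            nn.Dropout(), nn.Linear(256 * 6 * 6, 4096), nn.ReLU(inplace=True),
            nn.Dropout(), nn.Linear(4096, 4096), nn.ReLU(inplace=True),
            nn.Linear(4096, num_classes),  # logits for a 1000-way softmax
        )

    def forward(self, x):
        x = self.features(x)
        x = torch.flatten(x, 1)
        return self.classifier(x)

logits = AlexNetSketch()(torch.randn(1, 3, 227, 227))
print(logits.shape)  # torch.Size([1, 1000])
```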


2. VGG

[Figure: VGG network architecture]

3. GoogLeNet (Inception V1, V2, V3)




GoogLeNet uses global average pooling. In structures such as VGG, the fully connected layers account for a large share of the parameters, so GoogLeNet proposes replacing the fully connected layers with global average pooling.

Put simply, each feature map is averaged over its spatial dimensions into a single value, and the resulting vector is fed directly to the classifier. Experiments showed that this does not noticeably reduce accuracy while removing most of the fully connected parameters; a comparison of the two heads is sketched below.
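As a small illustration (assuming a PyTorch implementation and a VGG-like 512 x 7 x 7 final feature map), the two classifier heads below show how global average pooling shrinks the parameter count:

```python
import torch
import torch.nn as nn

# Head over a 7x7x512 feature map, 1000 classes.
# Fully connected head: flatten every spatial position into one large Linear layer.
fc_head = nn.Sequential(
    nn.Flatten(),
    nn.Linear(512 * 7 * 7, 1000),
)

# Global-average-pooling head: average each feature map down to a single value,
# then classify from the resulting 512-dimensional vector.
gap_head = nn.Sequential(
    nn.AdaptiveAvgPool2d(1),   # (N, 512, 7, 7) -> (N, 512, 1, 1)
    nn.Flatten(),              # -> (N, 512)
    nn.Linear(512, 1000),
)

count = lambda m: sum(p.numel() for p in m.parameters())
print(count(fc_head))   # ~25.1 million parameters
print(count(gap_head))  # ~0.5 million parameters

x = torch.randn(2, 512, 7, 7)
print(fc_head(x).shape, gap_head(x).shape)  # both torch.Size([2, 1000])
```

The fully connected head alone holds roughly 25 million parameters, while the global-average-pooling head needs about half a million for the same 1000-way output.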


Two auxiliary classifiers are also added. Because the network is deep, the gradient tends to vanish as it is backpropagated toward the early layers; the auxiliary classifiers attach extra softmax heads to intermediate layers so that additional gradient is injected partway through the network during training (in the original paper their losses are added to the main loss with a weight of 0.3, and the heads are discarded at inference time).
The reason GoogLeNet is so wide is the Inception module itself: each block runs several filter sizes (1x1, 3x3, 5x5) plus a pooling branch in parallel and concatenates their outputs, so features at several scales are captured at once; a sketch of such a block follows.
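Below is a hedged sketch (not the paper's exact channel configuration) of an Inception-V1-style block in PyTorch: four parallel branches whose outputs are concatenated along the channel dimension.

```python
import torch
import torch.nn as nn

# Sketch of an Inception-V1-style block: four parallel branches
# (1x1, 1x1->3x3, 1x1->5x5, pool->1x1) whose outputs are concatenated
# along the channel dimension. Channel counts here are illustrative.
class InceptionBlock(nn.Module):
    def __init__(self, in_ch, c1, c3, c5, cp):
        super().__init__()
        self.b1 = nn.Sequential(nn.Conv2d(in_ch, c1, 1), nn.ReLU(inplace=True))
        self.b3 = nn.Sequential(
            nn.Conv2d(in_ch, c3, 1), nn.ReLU(inplace=True),
            nn.Conv2d(c3, c3, 3, padding=1), nn.ReLU(inplace=True),
        )
        self.b5 = nn.Sequential(
            nn.Conv2d(in_ch, c5, 1), nn.ReLU(inplace=True),
            nn.Conv2d(c5, c5, 5, padding=2), nn.ReLU(inplace=True),
        )
        self.bp = nn.Sequential(
            nn.MaxPool2d(3, stride=1, padding=1),
            nn.Conv2d(in_ch, cp, 1), nn.ReLU(inplace=True),
        )

    def forward(self, x):
        # All branches preserve the spatial size, so they can be concatenated.
        return torch.cat([self.b1(x), self.b3(x), self.b5(x), self.bp(x)], dim=1)

block = InceptionBlock(192, c1=64, c3=128, c5=32, cp=32)
print(block(torch.randn(1, 192, 28, 28)).shape)  # torch.Size([1, 256, 28, 28])
```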





4. ResNet, ResNeXt

A residual network adds an identity shortcut around each block: the convolutional branch only has to learn the residual F(x), and the block outputs F(x) + x. Because the shortcut passes the input (and its gradient) through unchanged, very deep networks can still be trained effectively. A minimal residual block is sketched below.
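A minimal PyTorch sketch of such a block (an illustrative basic block, not the exact bottleneck design used in the deeper ResNets) could look like this:

```python
import torch
import torch.nn as nn

# Sketch of a basic residual block: the convolutional branch learns the
# residual F(x), and the identity shortcut adds the input back, so the
# block outputs F(x) + x. Gradients can flow through the shortcut unchanged.
class ResidualBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        identity = x
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + identity)  # F(x) + x

block = ResidualBlock(64)
print(block(torch.randn(1, 64, 56, 56)).shape)  # torch.Size([1, 64, 56, 56])
```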




2. Design principles

[Figures: network design principles]


Source: blog.csdn.net/zhaozhao236/article/details/110162789