Introduction to Deep Learning PDF download

Brief Introduction

 

 

This book is an introduction to deep learning in the true sense: it analyzes the principles and related technologies of deep learning in plain language. Using Python 3, and relying as little as possible on external libraries or tools, it starts from basic mathematics and leads the reader to build classic deep learning networks from scratch, gradually deepening their understanding of deep learning along the way. The book not only explains the basic concepts, characteristics, and issues of deep learning and neural networks, but also gives in-depth explanations of error backpropagation and convolutional neural networks. It further covers practical techniques for deep learning, applications such as autonomous driving, image generation, and reinforcement learning, and "why" questions such as why adding depth can improve recognition accuracy.
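
To give a taste of that from-scratch style, here is a minimal sketch (illustrative only, not an excerpt from the book) of the kind of perceptron logic gate built in Chapter 2, using nothing but plain Python:

    # Illustrative sketch, not from the book: a perceptron AND gate
    # with hand-picked weights and bias (a Chapter 2 topic).
    def AND(x1, x2):
        w1, w2, b = 0.5, 0.5, -0.7           # hand-picked parameters
        weighted_sum = x1 * w1 + x2 * w2 + b  # weighted inputs plus bias
        return 1 if weighted_sum > 0 else 0   # step activation

    print(AND(0, 0), AND(0, 1), AND(1, 0), AND(1, 1))  # prints: 0 0 0 1

The book's later chapters replace such hand-picked parameters with values learned from data.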

 

Table of Contents

Translator's Preface xiii
Preface xv
Chapter 1 Getting Started with Python 1
1.1 What is Python? 1
1.2 Installing Python 2
1.2.1 Python versions 2
1.2.2 External libraries used 2
1.2.3 The Anaconda distribution 3
1.3 The Python interpreter 4
1.3.1 Arithmetic operations 4
1.3.2 Data types 5
1.3.3 Variables 5
1.3.4 Lists 6
1.3.5 Dictionaries 7
1.3.6 Booleans 7
1.3.7 if statements 8
1.3.8 for statements 8
1.3.9 Functions 9
1.4 Python script files 9
1.4.1 Saving to a file 9
1.4.2 Classes 10
1.5 NumPy 11
1.5.1 Importing NumPy 11
1.5.2 Creating NumPy arrays 12
1.5.3 Arithmetic operations in NumPy 12
1.5.4 N-dimensional NumPy arrays 13
1.5.5 Broadcasting 14
1.5.6 Accessing elements 15
1.6 Matplotlib 16
1.6.1 Drawing simple graphs 16
1.6.2 Features of pyplot 17
1.6.3 Displaying images 18
1.7 Summary 19
Chapter 2 Perceptrons 21
2.1 What is a perceptron? 21
2.2 Simple logic circuits 23
2.2.1 AND gate 23
2.2.2 NAND and OR gates 23
2.3 Implementing a perceptron 25
2.3.1 Simple implementation 25
2.3.2 Introducing weights and biases 26
2.3.3 Implementation with weights and biases 26
2.4 Limitations of perceptrons 28
2.4.1 XOR gate 28
2.4.2 Linearity and nonlinearity 30
2.5 Multilayer perceptrons 31
2.5.1 Combining existing gates 31
2.5.2 Implementing the XOR gate 33
2.6 From NAND gates to computers 35
2.7 Summary 36
Chapter 3 Neural Networks 37
3.1 From perceptrons to neural networks 37
3.1.1 An example of a neural network 37
3.1.2 Reviewing the perceptron 38
3.1.3 Enter the activation function 40
3.2 Activation functions 42
3.2.1 Sigmoid function 42
3.2.2 Implementing the step function 43
3.2.3 Graph of the step function 44
3.2.4 Implementing the sigmoid function 45
3.2.5 Comparing the sigmoid function and the step function 46
3.2.6 Nonlinear functions 48
3.2.7 ReLU function 49
3.3 Computing with multidimensional arrays 50
3.3.1 Multidimensional arrays 50
3.3.2 Matrix multiplication 51
3.3.3 Matrix products in a neural network 55
3.4 Implementing a 3-layer neural network 56
3.4.1 Notation 57
3.4.2 Implementing signal transmission between layers 58
3.4.3 Code summary 62
3.5 Designing the output layer 63
3.5.1 Identity function and softmax function 64
3.5.2 Notes on implementing the softmax function 66
3.5.3 Characteristics of the softmax function 67
3.5.4 Number of neurons in the output layer 68
3.6 Recognizing handwritten digits 69
3.6.1 The MNIST dataset 70
3.6.2 Neural network inference 73
3.6.3 Batch processing 75
3.7 Summary 79
Chapter 4 Neural Network Learning 81
4.1 Learning from data 81
4.1.1 Data-driven 82
4.1.2 Training data and test data 84
4.2 Loss functions 85
4.2.1 Mean squared error 85
4.2.2 Cross-entropy error 87
4.2.3 Mini-batch learning 88
4.2.4 Implementing a mini-batch version of cross-entropy error 91
4.2.5 Why set a loss function? 92
4.3 Numerical differentiation 94
4.3.1 Derivatives 94
4.3.2 Examples of numerical differentiation 96
4.3.3 Partial derivatives 98
4.4 Gradients 100
4.4.1 Gradient method 102
4.4.2 Gradients of a neural network 106
4.5 Implementing the learning algorithm 109
4.5.1 A 2-layer neural network class 110
4.5.2 Implementing mini-batch training 114
4.5.3 Evaluation on test data 116
4.6 Summary 118
Chapter 5 Error Backpropagation 121
5.1 Computational graphs 121
5.1.1 Solving problems with computational graphs 122
5.1.2 Local computation 124
5.1.3 Why use computational graphs? 125
5.2 Chain rule 126
5.2.1 Backpropagation in a computational graph 127
5.2.2 What is the chain rule? 127
5.2.3 The chain rule and computational graphs 129
5.3 Backpropagation 130
5.3.1 Backpropagation at an addition node 130
5.3.2 Backpropagation at a multiplication node 132
5.3.3 An example with apples 133
5.4 Implementing simple layers 135
5.4.1 Implementing a multiplication layer 135
5.4.2 Implementing an addition layer 137
5.5 Implementing activation function layers 139
5.5.1 ReLU layer 139
5.5.2 Sigmoid layer 141
5.6 Implementing the Affine and Softmax layers 144
5.6.1 Affine layer 144
5.6.2 Batch version of the Affine layer 148
5.6.3 Softmax-with-Loss layer 150
5.7 Implementing error backpropagation 154
5.7.1 The big picture of neural network learning 154
5.7.2 Implementing a neural network that supports backpropagation 155
5.7.3 Gradient check for backpropagation 158
5.7.4 Learning with error backpropagation 159
5.8 Summary 161
Chapter 6 Techniques Related to Learning 163
6.1 Updating parameters 163
6.1.1 The story of an explorer 164
6.1.2 SGD 164
6.1.3 Shortcomings of SGD 166
6.1.4 Momentum 168
6.1.5 AdaGrad 170
6.1.6 Adam 172
6.1.7 Which update method should we use? 174
6.1.8 Comparing update methods on the MNIST dataset 175
6.2 Initial weight values 176
6.2.1 Can the initial weight values be 0? 176
6.2.2 Distribution of activations in the hidden layers 177
6.2.3 Initial weight values for ReLU 181
6.2.4 Comparing weight initializations on the MNIST dataset 183
6.3 Batch Normalization 184
6.3.1 The Batch Normalization algorithm 184
6.3.2 Evaluating Batch Normalization 186
6.4 Regularization 188
6.4.1 Overfitting 189
6.4.2 Weight decay 191
6.4.3 Dropout 192
6.5 Validating hyperparameters 195
6.5.1 Validation data 195
6.5.2 Optimizing hyperparameters 196
6.5.3 Implementing hyperparameter optimization 198
6.6 Summary 200
Chapter 7 Convolutional Neural Networks 201
7.1 Overall architecture 201
7.2 Convolutional layers 202
7.2.1 Problems with fully connected layers 203
7.2.2 Convolution operations 203
7.2.3 Padding 206
7.2.4 Stride 207
7.2.5 Convolution over 3-dimensional data 209
7.2.6 Thinking in blocks 211
7.2.7 Batch processing 213
7.3 Pooling layers 214
7.4 Implementing the convolutional and pooling layers 216
7.4.1 4-dimensional arrays 216
7.4.2 Expansion with im2col 217
7.4.3 Implementing a convolutional layer 219
7.4.4 Implementing a pooling layer 222
7.5 Implementing a CNN 224
7.6 Visualizing a CNN 228
7.6.1 Visualizing the first-layer weights 228
7.6.2 Hierarchical information extraction 230
7.7 Representative CNNs 231
7.7.1 LeNet 231
7.7.2 AlexNet 232
7.8 Summary 233
Chapter 8 Deep Learning 235
8.1 Making the network deeper 235
8.1.1 Toward deeper networks 235
8.1.2 Further improving recognition accuracy 238
8.1.3 Motivation for deeper layers 240
8.2 A brief history of deep learning 242
8.2.1 ImageNet 243
8.2.2 VGG 244
8.2.3 GoogLeNet 245
8.2.4 ResNet 246
8.3 Speeding up deep learning 248
8.3.1 Problems to tackle 248
8.3.2 GPU-based acceleration 249
8.3.3 Distributed learning 250
8.3.4 Reducing the bit precision of arithmetic 252
8.4 Applications of deep learning 253
8.4.1 Object detection 253
8.4.2 Image segmentation 255
8.4.3 Generating image captions 256
8.5 The future of deep learning 258
8.5.1 Image style transfer 258
8.5.2 Image generation 259
8.5.3 Autonomous driving 261
8.5.4 Deep Q-Network (reinforcement learning) 262
8.6 Summary 264
Appendix A The Computational Graph of the Softmax-with-Loss Layer 267
A.1 Forward propagation 268
A.2 Backpropagation 270
A.3 Summary 277
References 279

 Download Link

Origin www.cnblogs.com/pythongood/p/11112057.html