[Notes] Activation Functions: Sigmoid, Tanh, ReLU, Leaky-ReLU, ReLU6, Swish, Hard-Swish, Mish, Softmax, etc.

 

ReLU

paper: Deep Sparse Rectifier Neural Networks
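For reference, ReLU (the Rectified Linear Unit) zeroes out negative inputs and passes positive inputs through unchanged:

$$\mathrm{ReLU}(x) = \max(0, x)$$

It is cheap to compute and produces sparse activations, though units whose inputs stay negative stop learning (the "dying ReLU" problem).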

 

Leaky-ReLU

paper: Rectifier Nonlinearities Improve Neural Network Acoustic Models
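Leaky-ReLU keeps a small non-zero slope on the negative side so gradients never vanish completely; the paper uses $\alpha = 0.01$:

$$\mathrm{LeakyReLU}(x) = \begin{cases} x, & x \ge 0 \\ \alpha x, & x < 0 \end{cases}$$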

ReLU6

paper: MobileNetV2: Inverted Residuals and Linear Bottlenecks
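ReLU6 additionally caps the output at 6, which keeps activations in a fixed range that quantizes well for low-precision mobile inference:

$$\mathrm{ReLU6}(x) = \min(\max(0, x), 6)$$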

Swish

paper: Searching for Activation Functions
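Swish gates the input by its own sigmoid; $\beta$ may be a fixed constant or a trainable parameter, and with $\beta = 1$ the function is also known as SiLU:

$$\mathrm{Swish}(x) = x \cdot \sigma(\beta x) = \frac{x}{1 + e^{-\beta x}}$$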

Hard-Swish

paper: Searching for MobileNetV3
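Hard-Swish replaces the sigmoid in Swish with a piecewise-linear approximation built from ReLU6, which is much cheaper to compute on mobile hardware:

$$\text{Hard-Swish}(x) = x \cdot \frac{\mathrm{ReLU6}(x + 3)}{6}$$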

 

Mish

paper: Mish: A Self Regularized Non-Monotonic Neural Activation Function
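Mish multiplies the input by the tanh of its softplus; the result is smooth and non-monotonic, unbounded above and bounded below:

$$\mathrm{Mish}(x) = x \cdot \tanh\bigl(\mathrm{softplus}(x)\bigr) = x \cdot \tanh\bigl(\ln(1 + e^{x})\bigr)$$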


Addendum:

Sigmoid activation function
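Sigmoid squashes any real input into the range (0, 1), which makes it a natural choice for binary-classification outputs, though its gradient saturates for large |x|:

$$\sigma(x) = \frac{1}{1 + e^{-x}}$$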

tanh function
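tanh is a rescaled sigmoid with the zero-centered output range (-1, 1):

$$\tanh(x) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}} = 2\sigma(2x) - 1$$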


The exponential function with base e:
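$$\exp(x) = e^{x}, \qquad e = \lim_{n \to \infty}\left(1 + \frac{1}{n}\right)^{n} \approx 2.718281828$$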


Note: exp(x) denotes e raised to the power x, where e = 2.718281828...
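To tie these together, here is a minimal NumPy sketch of every activation covered above (the function names and the alpha/beta defaults are illustrative choices, not from the original post); softmax from the title is included for completeness:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    return np.tanh(x)

def relu(x):
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):  # alpha = 0.01 as in the Leaky-ReLU paper
    return np.where(x >= 0, x, alpha * x)

def relu6(x):
    return np.minimum(np.maximum(0.0, x), 6.0)

def swish(x, beta=1.0):  # beta = 1 gives SiLU
    return x * sigmoid(beta * x)

def hard_swish(x):
    return x * relu6(x + 3.0) / 6.0

def mish(x):
    # softplus(x) = ln(1 + e^x); logaddexp(0, x) computes it without overflow
    return x * np.tanh(np.logaddexp(0.0, x))

def softmax(x, axis=-1):
    # subtract the max along the axis for numerical stability
    z = x - np.max(x, axis=axis, keepdims=True)
    e = np.exp(z)
    return e / np.sum(e, axis=axis, keepdims=True)
```

Each function is vectorized, so it can be applied directly to NumPy arrays of any shape.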

Reproduced from blog.csdn.net/nyist_yangguang/article/details/121336746