ReLU
paper: Deep Sparse Rectifier Neural Networks
Leaky-ReLU
paper: Rectifier Nonlinearities Improve Neural Network Acoustic Models
ReLU6
paper: MobileNetV2: Inverted Residuals and Linear Bottlenecks
Swish
paper: Searching for Activation Functions
hard-Swish
paper: Searching for MobileNetV3
Mish
paper: Mish: A Self Regularized Non-Monotonic Neural Activation Function
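
For quick reference, below is a minimal NumPy sketch of the activations listed above. The function names and default parameters (e.g. alpha = 0.01 for Leaky-ReLU, beta = 1 for Swish) are illustrative choices, not values prescribed by the cited papers.

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x)
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky-ReLU: x for x > 0, else alpha * x (alpha = 0.01 is a common default)
    return np.where(x > 0, x, alpha * x)

def relu6(x):
    # ReLU6: min(max(0, x), 6), the clipped ReLU used in MobileNetV2
    return np.minimum(np.maximum(0.0, x), 6.0)

def sigmoid(x):
    # Sigmoid: 1 / (1 + exp(-x))
    return 1.0 / (1.0 + np.exp(-x))

def swish(x, beta=1.0):
    # Swish: x * sigmoid(beta * x); beta = 1 gives the common default form
    return x * sigmoid(beta * x)

def hard_swish(x):
    # hard-Swish: x * ReLU6(x + 3) / 6, a piecewise-linear approximation of Swish
    return x * relu6(x + 3.0) / 6.0

def mish(x):
    # Mish: x * tanh(softplus(x)), where softplus(x) = ln(1 + exp(x))
    return x * np.tanh(np.log1p(np.exp(x)))
```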
Additional:
Sigmoid activation function
tanh function
Exponential function with base e:
Note: exp(x) denotes e raised to the power x, where e = 2.718281828...
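
A short sketch of these basic functions, again in NumPy for illustration:

```python
import numpy as np

def sigmoid(x):
    # Sigmoid: 1 / (1 + exp(-x)), maps inputs to (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # tanh: (exp(x) - exp(-x)) / (exp(x) + exp(-x)), maps inputs to (-1, 1)
    return np.tanh(x)

# exp(x) is e raised to the power x, with e = 2.718281828...
print(np.exp(1.0))  # prints e: 2.718281828...
```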