This tutorial covers the Keras advanced activation layers (Advanced Activation) and is meant to be practical; a short usage sketch follows the list below.
Advanced Activation Layers (Advanced Activation)
- LeakyReLU: a variant of ReLU that still produces a small non-zero output when the unit is not activated, so a small gradient can flow: f(x) = alpha * x for x < 0, f(x) = x for x >= 0.
- PReLU: parametric ReLU, where the negative-side slope alpha is learned during training: f(x) = alpha * x for x < 0, f(x) = x for x >= 0.
- ELU: exponential linear unit: f(x) = alpha * (exp(x) - 1.) for x < 0, f(x) = x for x >= 0.
- ThresholdedReLU: ReLU with a threshold: f(x) = x for x > theta, f(x) = 0 for x <= theta.
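The sketch below shows the typical way these layers are used: they are added as standalone layers right after a layer that has no (i.e. linear) activation, rather than being passed via the `activation` argument. This is a minimal example assuming the standalone `keras` package (Keras 2); in `tf.keras` the same classes live under `tensorflow.keras.layers`, and the layer sizes and hyperparameters here are illustrative only.

```python
from keras.models import Sequential
from keras.layers import Dense
from keras.layers import LeakyReLU, PReLU, ELU, ThresholdedReLU

model = Sequential()

# LeakyReLU: fixed small slope alpha for negative inputs instead of zero
model.add(Dense(64, input_shape=(100,)))
model.add(LeakyReLU(alpha=0.1))

# PReLU: the negative-side slope is a trainable parameter
model.add(Dense(64))
model.add(PReLU())

# ELU: exponential curve for negative inputs, scaled by alpha
model.add(Dense(64))
model.add(ELU(alpha=1.0))

# ThresholdedReLU: outputs zero for inputs at or below the threshold theta
model.add(Dense(64))
model.add(ThresholdedReLU(theta=1.0))

model.add(Dense(10, activation='softmax'))
model.compile(optimizer='adam', loss='categorical_crossentropy')
model.summary()
```

Keeping the preceding `Dense` layer linear matters: if it already applied a ReLU, stacking an advanced activation on top would change the intended function.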