In neural networks, nonlinear activation functions such as sigmoid, tanh, and ReLU:()
Speed up the gradient calculation in backpropagation, as compared to linear units
Help to learn nonlinear decision boundaries
Are applied only to the output units
Always output values between 0 and 1
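For reference, a minimal sketch of the three activations named in the question, in plain Python with no framework assumed. Note that tanh and ReLU both produce values outside the range [0, 1], which bears on the last option:

```python
import math

def sigmoid(x):
    # Squashes any real input into (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    # Squashes any real input into (-1, 1)
    return math.tanh(x)

def relu(x):
    # Zero for negative inputs, identity otherwise; unbounded above
    return max(0.0, x)

for x in (-2.0, 0.0, 3.0):
    print(f"x={x}: sigmoid={sigmoid(x):.4f}, tanh={tanh(x):.4f}, relu={relu(x):.4f}")
```

Stacking layers whose units use only linear activations collapses to a single linear map, so the nonlinearity is what lets the network represent nonlinear decision boundaries.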