

[Single-choice question]

In neural networks, nonlinear activation functions such as sigmoid, tanh, and ReLU:()

  • Speed up the gradient calculation in backpropagation, as compared to linear units

  • Help to learn nonlinear decision boundaries

  • Are applied only to the output units

  • Always output values between 0 and 1

A: Vanishing gradients can actually occur; for example, with the sigmoid function, when the input lies far out on the negative or positive x-axis the gradient is effectively 0, which slows learning down rather than speeding it up.
C: Activation functions are also applied in the hidden (intermediate) layers, not only at the output units.
D: ReLU's output is not confined to the interval 0 to 1; it is unbounded above.
So the correct answer is B: nonlinear activations are what allow the network to learn nonlinear decision boundaries.
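
To make these points concrete, here is a minimal sketch, assuming NumPy is available; the helper names (`sigmoid_grad`, `relu`) and the toy weight shapes are my own illustration, not part of the original question. It checks that sigmoid gradients saturate far from zero (point A), that ReLU output leaves the [0, 1] range (point D), and that a stack of purely linear layers collapses into a single linear map, which is why nonlinearity is needed for nonlinear decision boundaries (option B).

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    # Derivative of sigmoid: s * (1 - s)
    s = sigmoid(x)
    return s * (1.0 - s)

def relu(x):
    return np.maximum(0.0, x)

# Point A: sigmoid gradients shrink toward 0 for large |x| (saturation),
# so nonlinearity does not speed up backpropagation relative to linear units.
for x in [0.0, 5.0, -5.0, 20.0, -20.0]:
    print(f"sigmoid'({x:+.0f}) = {sigmoid_grad(x):.2e}")
# sigmoid'(+20) and sigmoid'(-20) are on the order of 2e-9: effectively zero.

# Point D: ReLU output is not restricted to [0, 1]; it is unbounded above.
print(relu(np.array([-3.0, 0.5, 7.0])))   # -> [0.  0.5 7. ]

# Point B: two linear "layers" with no activation collapse into one linear map,
# so without a nonlinear activation the network stays a linear model.
rng = np.random.default_rng(0)
W1, W2 = rng.normal(size=(4, 2)), rng.normal(size=(1, 4))
x = rng.normal(size=(2,))
two_linear_layers = W2 @ (W1 @ x)      # stacked linear layers
single_linear_map = (W2 @ W1) @ x      # equivalent single layer
print(np.allclose(two_linear_layers, single_linear_map))   # True
```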
Posted on 2022-02-21 19:17:45