1. What is an Activation Function?

It is simply a function: it takes a neuron's weighted sum as input and produces that neuron's output.

2. Why do we use activation functions in neural networks?

An activation function maps the resulting values into a bounded range, such as 0 to 1 or -1 to 1.

Activation functions broadly fall into two classes: linear and non-linear.

The output of a linear function is generally unbounded, and a purely linear function cannot handle the complex, varied parameters of a neural network. So activation functions are generally non-linear: they let the model easily generalize to, or adapt to, a wide variety of data and differentiate between outputs.
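The reason non-linearity matters can be shown directly: stacking linear layers without an activation function in between collapses into a single linear layer. A minimal sketch with NumPy (matrix sizes are arbitrary, chosen just for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "layers" (weight matrices) applied with no activation in between...
W1 = rng.normal(size=(4, 3))
W2 = rng.normal(size=(2, 4))
x = rng.normal(size=3)

two_layer = W2 @ (W1 @ x)

# ...are equivalent to one linear layer: the composition is itself linear,
# so the extra depth adds no expressive power.
W_combined = W2 @ W1
one_layer = W_combined @ x

print(np.allclose(two_layer, one_layer))  # True
```

Inserting any non-linear function between the two matrix multiplications breaks this collapse, which is what lets deep networks represent non-linear mappings.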

3. Sigmoid function

The sigmoid curve is S-shaped.

 


The main reason we use the sigmoid function is that its output lies between 0 and 1, which makes it especially suited to predicting probabilities, since the probability of anything also lies only between 0 and 1.

However, this function can cause a neural network to get stuck during training, because its gradient becomes very small for large positive or negative inputs. The softmax function is a more generalized logistic function, used for multiclass classification.
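A minimal sketch of both functions using only the Python standard library (the max-subtraction in softmax is a standard numerical-stability trick, not part of the mathematical definition):

```python
import math

def sigmoid(x):
    # Squashes any real input into (0, 1), so it can be read as a probability.
    return 1.0 / (1.0 + math.exp(-x))

def softmax(xs):
    # Generalizes sigmoid to a vector: all outputs are positive and sum to 1.
    m = max(xs)  # subtract the max before exponentiating to avoid overflow
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

print(sigmoid(0))                # 0.5
print(softmax([1.0, 2.0, 3.0]))  # three probabilities summing to 1
```

Note how sigmoid saturates: `sigmoid(10)` is already extremely close to 1, which is exactly the regime where the gradient is nearly zero and training can stall.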

4. tanh


The advantage is that negative inputs are mapped strongly negative and zero inputs are mapped near zero. The output range is (-1, 1).
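This behavior can be checked directly with the standard-library `math.tanh`:

```python
import math

# tanh maps inputs into (-1, 1): strongly negative inputs land near -1,
# zero maps to exactly zero, and strongly positive inputs land near +1.
for x in (-5.0, 0.0, 5.0):
    print(x, math.tanh(x))
```

Because the output is zero-centered (unlike sigmoid, whose outputs are all positive), tanh is often preferred in hidden layers.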

5. ReLU (Rectified Linear Unit)


ReLU is currently the most widely used activation function.

Range: [0, infinity)
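ReLU is also the simplest of the three to implement; a one-line sketch:

```python
def relu(x):
    # ReLU: passes positive inputs through unchanged, clamps negatives to 0.
    return max(0.0, x)

print([relu(x) for x in (-2.0, -0.5, 0.0, 1.5)])  # [0.0, 0.0, 0.0, 1.5]
```

Unlike sigmoid and tanh, ReLU does not saturate for positive inputs, which keeps gradients flowing and makes it cheap to compute.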

