Thread starter: Nicolle

Activation Functions: Neural Networks


Nicolle (student verified) posted on 2018-2-13 10:52:54

Hidden content in this post:

Activation Functions-Neural Networks.pdf (633.92 KB)




Nicolle (student verified) posted on 2018-2-13 10:53:17
What is an Activation Function?
It is simply a node that you add to the output end of any neural network, also known as a Transfer Function. It can also be attached between two neural networks (a minimal sketch follows below).
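A minimal sketch of this idea (not from the original post; the layer, weights, and values below are hypothetical, chosen only to illustrate where the activation sits):

import numpy as np

def layer(x, W, b, activation):
    # Linear transform followed by an activation ("transfer") function
    z = W @ x + b          # weighted sum (pre-activation)
    return activation(z)   # the activation is attached to the output end

# hypothetical toy input and weights, just to make the sketch runnable
x = np.array([0.5, -1.2])
W = np.array([[0.1, 0.4], [-0.3, 0.8]])
b = np.array([0.05, -0.1])

print(layer(x, W, b, activation=np.tanh))  # tanh used here only as an example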

Nicolle (student verified) posted on 2018-2-13 10:54:26
Why do we use activation functions with neural networks?
An activation function is used to determine the output of a neural network node, e.g. yes or no. It maps the resulting values into a range such as 0 to 1 or -1 to 1, depending on the function.
Activation functions can basically be divided into two types (a comparison sketch follows this list):
Linear Activation Functions
Non-linear Activation Functions
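As a rough illustration of why the non-linear type matters (a toy NumPy example, assumed rather than taken from the post): two stacked linear layers collapse into a single linear map, while inserting a non-linear activation between them breaks that collapse.

import numpy as np

rng = np.random.default_rng(0)
W1, W2 = rng.normal(size=(3, 4)), rng.normal(size=(2, 3))
x = rng.normal(size=4)

linear_stack = W2 @ (W1 @ x)        # two linear layers, no activation
collapsed    = (W2 @ W1) @ x        # ...equivalent to a single linear layer
print(np.allclose(linear_stack, collapsed))    # True

nonlinear_stack = W2 @ np.maximum(0, W1 @ x)   # ReLU in between
print(np.allclose(nonlinear_stack, collapsed)) # False in general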

Nicolle (student verified) posted on 2018-2-13 10:56:24
1. Sigmoid or Logistic Activation Function
The sigmoid function curve looks like an S-shape.

Fig: Sigmoid Function
The main reason we use the sigmoid function is that its output lies between 0 and 1. It is therefore especially used for models where we have to predict a probability as the output: since a probability exists only in the range 0 to 1, sigmoid is the right choice.
The function is differentiable, which means we can find the slope of the sigmoid curve at any point.
The function is monotonic, but its derivative is not.
The logistic sigmoid can cause a neural network to get stuck during training, because its gradient becomes very small (saturates) for large positive or negative inputs.
The softmax function is a more generalized logistic activation function, used for multiclass classification. (A sketch of both follows below.)
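A short sketch of the sigmoid, its derivative, and the softmax generalization (an assumed NumPy implementation, not code from the post):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_derivative(z):
    s = sigmoid(z)
    return s * (1.0 - s)    # largest at z = 0, vanishes for large |z|

def softmax(z):
    e = np.exp(z - z.max()) # shift for numerical stability
    return e / e.sum()

z = np.array([-10.0, -1.0, 0.0, 1.0, 10.0])
print(sigmoid(z))             # outputs lie strictly between 0 and 1
print(sigmoid_derivative(z))  # near zero at the tails: saturated, "stuck" gradients
print(softmax(z))             # a probability distribution over several classes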

Nicolle (student verified) posted on 2018-2-13 10:59:39
2. Tanh or Hyperbolic Tangent Activation Function
tanh is similar to the logistic sigmoid but often works better. The range of the tanh function is -1 to 1, and tanh is also sigmoidal (S-shaped).

Fig: tanh vs. Logistic Sigmoid
The advantage is that negative inputs are mapped strongly negative and zero inputs are mapped near zero on the tanh graph.
The function is differentiable.
The function is monotonic, while its derivative is not.
The tanh function is mainly used for classification between two classes.
Both tanh and logistic sigmoid activation functions are used in feed-forward nets. (A comparison sketch follows below.)
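A small comparison sketch (assumed NumPy example): tanh maps inputs into -1 to 1 and is zero-centred, while the logistic sigmoid maps the same inputs into 0 to 1.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

z = np.array([-3.0, -1.0, 0.0, 1.0, 3.0])
print(np.tanh(z))   # negative inputs -> strongly negative, zero -> near zero
print(sigmoid(z))   # same inputs squashed into (0, 1) instead
# tanh is a rescaled, shifted sigmoid: tanh(z) = 2 * sigmoid(2z) - 1
print(np.allclose(np.tanh(z), 2 * sigmoid(2 * z) - 1))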

Nicolle (student verified) posted on 2018-2-13 11:01:03
3. ReLU (Rectified Linear Unit) Activation Function
The ReLU is currently the most widely used activation function, since it appears in almost all convolutional neural networks and deep learning models.

Fig: ReLU vs. Logistic Sigmoid
As you can see, the ReLU is half rectified (from the bottom): f(z) is zero when z is less than zero, and f(z) is equal to z when z is greater than or equal to zero.
Range: [0, infinity)
Both the function and its derivative are monotonic.
The issue is that all negative values become zero immediately, which decreases the model's ability to fit or train on the data properly: any negative input to ReLU is turned to zero immediately, so negative values are never mapped appropriately (often called the "dying ReLU" problem). A short sketch follows below.
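A short sketch of ReLU and its derivative (assumed NumPy example), showing the [0, infinity) range and how every negative input is zeroed out, so no gradient flows back through that side:

import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def relu_derivative(z):
    return (z > 0).astype(float)   # zero for negative inputs: no gradient flows

z = np.array([-5.0, -0.5, 0.0, 0.5, 5.0])
print(relu(z))             # [0.  0.  0.  0.5 5. ]
print(relu_derivative(z))  # [0. 0. 0. 1. 1.] -> the negative side never updates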

啸傲江弧 posted on 2018-2-13 11:03:44
Thanks for sharing.

sqy posted on 2018-2-13 11:05:21
Bump!!!!!

hjtoh posted on 2018-2-13 11:12:08 from mobile
Nicolle posted on 2018-2-13 10:52
**** This content has been hidden by the author ****
Thanks for sharing.

fengyg (enterprise verified) posted on 2018-2-13 11:54:22
Taking a look (kankan).
