# Naive Bayes: A Baseline Model for Machine Learning Classification Performance

oliyiyi posted on 2019-5-8 19:42:49


By Asel Mendis, KDnuggets.

#### Bayes Theorem

P(A|B) = P(B|A) × P(A) / P(B)

The above equation represents Bayes' theorem, which describes the probability of an event A occurring given our prior knowledge of an event B that may be related to it.

Let's explore the parts of Bayes' theorem:

• P(A|B) - Posterior Probability: the conditional probability that event A occurs given that event B has occurred.
• P(A) - Prior Probability: the probability of event A.
• P(B) - Evidence: the probability of event B.
• P(B|A) - Likelihood: the conditional probability of B occurring given that event A has occurred.
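The four parts above can be checked numerically. A minimal sketch, using made-up probabilities of my own (not values from the article):

```python
# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
# The numbers below are illustrative assumptions, not from the article.
p_a = 0.3          # prior P(A)
p_b_given_a = 0.8  # likelihood P(B|A)
p_b = 0.5          # evidence P(B)

p_a_given_b = p_b_given_a * p_a / p_b  # posterior P(A|B)
print(p_a_given_b)  # 0.48
```

Note that the evidence P(B) only rescales the numerator, which is why classifiers often compare unnormalized posteriors across classes.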

Now, let's explore the parts of Bayes' theorem through the eyes of someone conducting machine learning:

• P(A|B) - Posterior Probability: the conditional probability of the response variable (target variable) given the training data inputs.
• P(A) - Prior Probability: the probability of the response variable (target variable).
• P(B) - Evidence: the probability of the training data.
• P(B|A) - Likelihood: the conditional probability of the training data given the response variable.

Rewriting Bayes' theorem in classifier notation, with class c and predictors x:

P(c|x) = P(x|c) × P(c) / P(x)

• P(c|x) - Posterior probability of the target/class (c) given predictors (x).
• P(c) - Prior probability of the class (target).
• P(x|c) - Likelihood: the probability of the predictor (x) given the class/target (c).
• P(x) - Prior probability (evidence) of the predictor (x).
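Each of these terms can be estimated directly from counts in the training data. A hand-rolled sketch for a single categorical predictor, using toy rows of my own rather than any real dataset:

```python
# Toy training data: (outlook, play) pairs -- illustrative, not the article's dataset.
data = [("sunny", "no"), ("sunny", "no"), ("overcast", "yes"),
        ("rainy", "yes"), ("rainy", "yes"), ("sunny", "yes"),
        ("overcast", "yes"), ("rainy", "no")]

def posterior(x, c):
    """P(c|x) = P(x|c) * P(c) / P(x), all estimated from counts."""
    n = len(data)
    in_class = [d for d in data if d[1] == c]
    p_c = len(in_class) / n                                              # prior P(c)
    p_x_given_c = sum(1 for d in in_class if d[0] == x) / len(in_class)  # likelihood P(x|c)
    p_x = sum(1 for d in data if d[0] == x) / n                          # evidence P(x)
    return p_x_given_c * p_c / p_x

print(posterior("sunny", "yes"))  # 0.333...
print(posterior("sunny", "no"))   # 0.666...
```

The two posteriors for a given x sum to 1, as expected; with several predictors, Naive Bayes additionally assumes their likelihoods are conditionally independent and multiplies them.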

Example of using Bayes' theorem:
I'll be using the tennis weather dataset.
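One way a categorical weather dataset like this could be fed to a Naive Bayes classifier, as a sketch assuming scikit-learn is available; the rows below are toy stand-ins mimicking the classic "play tennis" data, not the actual dataset:

```python
from sklearn.naive_bayes import CategoricalNB
from sklearn.preprocessing import OrdinalEncoder

# Toy rows mimicking the "play tennis" data -- illustrative only.
X_raw = [["sunny", "hot"], ["sunny", "mild"], ["overcast", "hot"],
         ["rainy", "mild"], ["rainy", "cool"], ["overcast", "cool"]]
y = ["no", "no", "yes", "yes", "yes", "yes"]

# CategoricalNB expects integer-coded categories, so encode the strings first.
enc = OrdinalEncoder()
X = enc.fit_transform(X_raw)

clf = CategoricalNB().fit(X, y)
print(clf.predict(enc.transform([["sunny", "cool"]])))
```

CategoricalNB applies Laplace smoothing by default (alpha=1), so feature values never seen with a class still get a small nonzero likelihood.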