Naïve Bayes Classifiers

1.1 Exact Bayes Classifier

We would like to classify a categorical output taking values $(k_1, k_2, \ldots, k_m)$ given some attributes $(x_1, x_2, \ldots, x_n)$.

For example, suppose we would like to predict whether the output is $k_1$ or $k_2$ given three attributes $A, B, C$.

If $P(k_1|A, B, C) > P(k_2|A, B, C)$,

we say that an instance with attributes $A, B, C$ is more likely to belong to $k_1$, and vice versa.

Notation:

If A exists, we write A; if A does not exist, -A.

If B exists, we write B; if B does not exist, -B.

If C exists, we write C; if C does not exist, -C.

Then, if we apply Bayes' Theorem,

$$P(k_1|A, B, C)$$
= $$\frac{P(k_1)P(A,B,C|k_1)}{P(A,B,C)}$$

By applying total probability law,

$ \Longrightarrow$
$$\frac{P(k_1)P(A,B,C|k_1)}{P(k_1)P(A,B,C|k_1)+P(k_2)P(A,B,C|k_2)}$$

However, calculating $P(A,B,C|k_1)$ requires storing $2^i$ entries, where $i$ is the number of attributes ($i = 3$ in this case), and calculating $P(A,B,C|k_2)$ requires another $2^3$ entries.

The frequency table looks like this:

| Frequency | A, B, C | A, B, -C | A, -B, C | A, -B, -C | -A, B, C | -A, B, -C | -A, -B, C | -A, -B, -C |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| k1 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 |
| k2 | 9 | 8 | 7 | 6 | 5 | 4 | 3 | 2 |
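
To make the storage cost concrete, here is a minimal Python sketch of the exact Bayes classifier using the illustrative counts above; the names `joint_counts` and `exact_bayes_posterior` are ours, not from the original.

```python
# A minimal sketch of the exact Bayes classifier using the illustrative
# counts in the frequency table above. Note the joint table needs 2**i
# entries per class (i = 3 attributes here).

joint_counts = {
    # (A, B, C) as 1/0 flags -> count, per class
    "k1": {(1, 1, 1): 1, (1, 1, 0): 2, (1, 0, 1): 3, (1, 0, 0): 4,
           (0, 1, 1): 5, (0, 1, 0): 6, (0, 0, 1): 7, (0, 0, 0): 8},
    "k2": {(1, 1, 1): 9, (1, 1, 0): 8, (1, 0, 1): 7, (1, 0, 0): 6,
           (0, 1, 1): 5, (0, 1, 0): 4, (0, 0, 1): 3, (0, 0, 0): 2},
}

def exact_bayes_posterior(x):
    """Return P(k | x) for each class k via Bayes' theorem."""
    class_totals = {k: sum(t.values()) for k, t in joint_counts.items()}
    n = sum(class_totals.values())
    # Numerators P(k) * P(x | k); their sum is the shared denominator
    # P(x), by the total probability law.
    scores = {k: (class_totals[k] / n) * (joint_counts[k][x] / class_totals[k])
              for k in joint_counts}
    z = sum(scores.values())
    return {k: s / z for k, s in scores.items()}

print(exact_bayes_posterior((1, 1, 1)))  # x = (A, B, C all present)
```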

Therefore, we introduce the naive Bayes algorithm to reduce both the storage required and the computation time.

1.2 Naive Bayes Classifier

We assume class conditional independence, so that

$P(A,B,C|k_1)$ is equal to $P(A|k_1)P(B|k_1)P(C|k_1)$

$P(A,B,C|k_2)$ is equal to $P(A|k_2)P(B|k_2)P(C|k_2)$

And now we need only $2 \cdot i \cdot n$ records, where $i$ is the number of attributes and $n$ is the number of categorical outputs we will predict.

| Frequency | A | -A | B | -B | C | -C |
| --- | --- | --- | --- | --- | --- | --- |
| k1 | 1 | 2 | 3 | 4 | 5 | 6 |
| k2 | 7 | 6 | 5 | 4 | 3 | 2 |

Therefore, our problem becomes

$$P(k_1|A, B, C)$$
= $$\frac{P(k_1)P(A,B,C|k_1)}{P(k_1)P(A,B,C|k_1)+P(k_2)P(A,B,C|k_2)}$$
$\Longrightarrow$
$$\frac{P(k_1)[P(A|k_1)P(B|k_1)P(C|k_1)]}{P(k_1)[P(A|k_1)P(B|k_1)P(C|k_1)]+P(k_2)[P(A|k_2)P(B|k_2)P(C|k_2)]} \ (i)$$

$$P(k_2|A, B, C)$$
= $$\frac{P(k_2)P(A,B,C|k_2)}{P(k_1)P(A,B,C|k_1)+P(k_2)P(A,B,C|k_2)}$$
$\Longrightarrow$
$$\frac{P(k_2)[P(A|k_2)P(B|k_2)P(C|k_2)]}{P(k_1)[P(A|k_1)P(B|k_1)P(C|k_1)]+P(k_2)[P(A|k_2)P(B|k_2)P(C|k_2)]} \ (ii)$$

We notice that (i) and (ii) share the same denominator, so we can focus only on the numerators:

$$P(k_1|A, B, C)$$
= $$\frac{P(k_1)P(A,B,C|k_1)}{P(k_1)P(A,B,C|k_1)+P(k_2)P(A,B,C|k_2)}$$
$\propto$
$$P(k_1)[P(A|k_1)P(B|k_1)P(C|k_1)]$$

$$P(k_2|A, B, C)$$
= $$\frac{P(k_2)P(A,B,C|k_2)}{P(k_1)P(A,B,C|k_1)+P(k_2)P(A,B,C|k_2)}$$
$\propto$
$$P(k_2)[P(A|k_2)P(B|k_2)P(C|k_2)]$$

If $P(k_1)[P(A|k_1)P(B|k_1)P(C|k_1)] > P(k_2)[P(A|k_2)P(B|k_2)P(C|k_2)]$, we say that an instance with attributes $A, B, C$ is more likely to belong to $k_1$, and vice versa.
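
As a minimal sketch of this decision rule (using the illustrative counts from the smaller per-attribute table above; all names are ours), the following compares the class propensities directly:

```python
# A minimal sketch of the naive Bayes decision rule, using the smaller
# per-attribute table above: 2*i*n entries instead of 2**i per class.

attribute_counts = {
    # class -> attribute -> (count present, count absent)
    "k1": {"A": (1, 2), "B": (3, 4), "C": (5, 6)},
    "k2": {"A": (7, 6), "B": (5, 4), "C": (3, 2)},
}
priors = {"k1": 0.5, "k2": 0.5}  # equal priors, assumed for illustration

def naive_bayes_scores(x):
    """Return P(k) * prod_j P(x_j | k), proportional to the posterior P(k | x)."""
    scores = {}
    for k, table in attribute_counts.items():
        score = priors[k]
        for attr, present in x.items():
            n_present, n_absent = table[attr]
            # Class-conditional independence: multiply per-attribute likelihoods.
            score *= (n_present if present else n_absent) / (n_present + n_absent)
        scores[k] = score
    return scores

scores = naive_bayes_scores({"A": True, "B": True, "C": False})
print(max(scores, key=scores.get))  # the predicted class
```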

1.3 Why not P(C | A, B) = P(C | A) * P(C | B)?

From 1.2, we know that the naive Bayes algorithm assumes class conditional independence, so that

$P(A,B | C)$ = $P( A | C) * P(B | C)$

but why not directly say that

$P(C | A, B)$ = $P(C | A) * P(C | B)$

This is because the equality holds only when $P(C) = 0$ or $P(C) = 1$, which is meaningless:

$$P(C | A, B)=P(C | A) * P(C | B)$$

$\Longrightarrow$

$$\frac{P(C,(A,B))}{P(A,B)} = \frac{P(C,A)}{P(A)}*\frac{P(C,B)}{P(B)} $$

$\Longrightarrow$ assuming $A$, $B$ and $C$ are mutually independent,

$$\frac{P(C)*P(A)*P(B)}{P(A)*P(B)} = \frac{P(C)*P(A)}{P(A)}*\frac{P(C)*P(B)}{P(B)} $$

$\Longrightarrow$

$$P(C) = P(C)*P(C)$$

which is possible only if $P(C) = 0$ or $P(C) = 1$.

Therefore, we use Bayes' Theorem to convert $P(C | A, B)$ into $P(A, B | C)$ before applying the naive Bayes algorithm.
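
A quick numeric check makes the degenerate case concrete; the numbers below are hypothetical, chosen only to instantiate the derivation above.

```python
# A quick numeric check (hypothetical numbers, ours) that
# P(C | A, B) = P(C | A) * P(C | B) fails for a non-degenerate P(C).
# Following the derivation above, take A, B, C mutually independent,
# so every conditional probability of C collapses to P(C).

p_c = 0.3                    # any value strictly between 0 and 1
lhs = p_c                    # P(C | A, B) = P(C) under independence
rhs = p_c * p_c              # P(C | A) * P(C | B) = P(C)**2
print(lhs, rhs, lhs == rhs)  # 0.3 vs 0.09 -- equal only if P(C) is 0 or 1
```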

2. Example

Consider the following 4 SMS messages:

| Message | Label |
| --- | --- |
| I am not coming | ham |
| Good work | ham |
| Do you need viagra | spam |
| win an IMac | spam |

2.1 Compute the prior probabilities of a new SMS message being ‘spam’ or ‘ham’.

Let $p(spam)$ be the probability of a new SMS message being “spam”

Let $p(ham)$ be the probability of a new SMS message being “ham”

Since two of the four training messages are spam and two are ham,
$$p(spam)= \frac{2}{4} = 0.5$$
$$p(ham)= \frac{2}{4} = 0.5$$

2.2 For each de-capitalised keyword that appears in your training set (that is, ‘i’, ‘am’,‘not’, ‘coming’, ‘good’, ‘work’, ‘do’, ‘you’, ‘need’, ‘viagra’, ‘win’, ‘an’ and ‘imac’), build a frequency table that records the likelihoods P(W|ham), P(-W|ham), P(W|spam) and P(-W|spam).

Each de-capitalised keyword gets two rows (a word row and a -word row):

we record the number of ham messages in which the keyword appears in the (word row, ham column);

we record the number of ham messages in which the keyword does not appear in the (-word row, ham column);

we record the number of spam messages in which the keyword appears in the (word row, spam column);

we record the number of spam messages in which the keyword does not appear in the (-word row, spam column).

We can construct the following frequency table:

| Frequency | Ham | Spam |
| --- | --- | --- |
| -am | 1 | 2 |
| -an | 2 | 1 |
| -coming | 1 | 2 |
| -do | 2 | 1 |
| -good | 1 | 2 |
| -i | 1 | 2 |
| -imac | 2 | 1 |
| -need | 2 | 1 |
| -not | 1 | 2 |
| -viagra | 2 | 1 |
| -win | 2 | 1 |
| -work | 1 | 2 |
| -you | 2 | 1 |
| am | 1 | 0 |
| an | 0 | 1 |
| coming | 1 | 0 |
| do | 0 | 1 |
| good | 1 | 0 |
| i | 1 | 0 |
| imac | 0 | 1 |
| need | 0 | 1 |
| not | 1 | 0 |
| viagra | 0 | 1 |
| win | 0 | 1 |
| work | 1 | 0 |
| you | 0 | 1 |

Then, to record the likelihoods P(W|ham), P(-W|ham), P(W|spam) and P(-W|spam), we divide each entry in the ham column by 2 (the total number of ham messages), and each entry in the spam column by 2 (the total number of spam messages).

In addition, to prevent likelihoods of exactly 0 or 1, we replace any likelihood smaller than 0.05 (larger than 0.95) with 0.05 (0.95), using one of the variants of the Laplace estimator.

Therefore, we get the following likelihood table:

| Probability of the row name given the column name | Ham | Spam |
| --- | --- | --- |
| -am | 0.50 | 0.95 |
| -an | 0.95 | 0.50 |
| -coming | 0.50 | 0.95 |
| -do | 0.95 | 0.50 |
| -good | 0.50 | 0.95 |
| -i | 0.50 | 0.95 |
| -imac | 0.95 | 0.50 |
| -need | 0.95 | 0.50 |
| -not | 0.50 | 0.95 |
| -viagra | 0.95 | 0.50 |
| -win | 0.95 | 0.50 |
| -work | 0.50 | 0.95 |
| -you | 0.95 | 0.50 |
| am | 0.50 | 0.05 |
| an | 0.05 | 0.50 |
| coming | 0.50 | 0.05 |
| do | 0.05 | 0.50 |
| good | 0.50 | 0.05 |
| i | 0.50 | 0.05 |
| imac | 0.05 | 0.50 |
| need | 0.05 | 0.50 |
| not | 0.50 | 0.05 |
| viagra | 0.05 | 0.50 |
| win | 0.05 | 0.50 |
| work | 0.50 | 0.05 |
| you | 0.05 | 0.50 |
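
The following is a minimal sketch, under the assumptions above (count word presence per message, divide by the class count, clamp to [0.05, 0.95]), of how this table could be built; all names are ours.

```python
# A sketch of building the likelihood table above from the four training
# messages, including the 0.05/0.95 clamp described in the text.

messages = [
    ("i am not coming", "ham"),
    ("good work", "ham"),
    ("do you need viagra", "spam"),
    ("win an imac", "spam"),
]

labels = ["ham", "spam"]
vocab = sorted({w for text, _ in messages for w in text.split()})

def build_likelihoods():
    """Return {(word, label): P(word present | label)}, clamped to [0.05, 0.95]."""
    label_counts = {lab: 0 for lab in labels}
    word_counts = {(w, lab): 0 for w in vocab for lab in labels}
    for text, lab in messages:
        label_counts[lab] += 1
        for w in set(text.split()):
            word_counts[(w, lab)] += 1
    # P(-word | label) is just 1 - P(word | label), so we store only the
    # 'present' likelihoods; clamping keeps both away from 0 and 1.
    return {(w, lab): min(max(n / label_counts[lab], 0.05), 0.95)
            for (w, lab), n in word_counts.items()}

likelihood = build_likelihoods()
print(likelihood[("viagra", "spam")])  # 0.5
print(likelihood[("viagra", "ham")])   # 0.05 after clamping
```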

2.3 Predict if the following two SMS messages “Coming home ?” and “Get Viagra now” are ham or spam?

2.3.1 For message “coming home”:

If

$$P(ham | -i , -am , -not , coming , -good , -work , -do , -you , -need , -viagra , -win , -an , -imac)$$

is greater than

$$P(spam | -i , -am , -not , coming , -good , -work , -do , -you , -need , -viagra , -win , -an , -imac)$$

we say that the message “coming home” is more likely to be a ham message; vice versa.

According to Bayes’ Theorem,

$$P(ham | -i , -am , -not , coming , -good , -work , -do , -you , -need , -viagra , -win , -an , -imac)$$

Bayes' Theorem $\Longrightarrow$

$$\frac{P(ham)P(-i , -am , -not , coming , -good , -work , -do , -you , -need , -viagra , -win , -an , -imac| ham)}{P( -i , -am , -not , coming , -good , -work , -do , -you , -need , -viagra , -win , -an , -imac)}$$

Total probability law $\Longrightarrow$

$$\frac{P(ham)P(-i , -am , -not , coming , -good , -work , -do , -you , -need , -viagra , -win , -an , -imac| ham)}{P(ham)P(-i , -am , -not , coming , -good , -work , -do , -you , -need , -viagra , -win , -an , -imac| ham)+P(spam)P(-i , -am , -not , coming , -good , -work , -do , -you , -need , -viagra , -win , -an , -imac| spam)}$$

Naive Bayes simplification $\Longrightarrow$

$$\frac{P(ham)[P(-i|ham)P(-am|ham)P(-not|ham)P(coming|ham)P(-good|ham)P(-work|ham)P(-do|ham)P(-you|ham)P(-need|ham)P(-viagra|ham)P(-win|ham)P(-an|ham)P(-imac|ham)]}{P(ham)[P(-i|ham)P(-am|ham)P(-not|ham)P(coming|ham)P(-good|ham)P(-work|ham)P(-do|ham)P(-you|ham)P(-need|ham)P(-viagra|ham)P(-win|ham)P(-an|ham)P(-imac|ham)]+P(spam)[P(-i|spam)P(-am|spam)P(-not|spam)P(coming|spam)P(-good|spam)P(-work|spam)P(-do|spam)P(-you|spam)P(-need|spam)P(-viagra|spam)P(-win|spam)P(-an|spam)P(-imac|spam)]}$$

=

$$\frac{0.5*(0.5*0.5*0.5*0.5*0.5*0.5*0.95*0.95*0.95*0.95*0.95*0.95*0.95)}{0.5*(0.5*0.5*0.5*0.5*0.5*0.5*0.95*0.95*0.95*0.95*0.95*0.95*0.95)+0.5*(0.95*0.95*0.95*0.05*0.95*0.95*0.5*0.5*0.5*0.5*0.5*0.5*0.5)}$$

=

$$\frac{0.00545576012}{0.00545576012+0.00015112908}=0.97304582369$$

Alternatively, we can focus only on the propensities, which are proportional to the posterior probabilities.

$$P(ham | -i , -am , -not , coming , -good , -work , -do , -you , -need , -viagra , -win , -an , -imac)$$

$\propto$

$$P(ham)[P(-i|ham)P(-am|ham)P(-not|ham)P(coming|ham)P(-good|ham)P(-work|ham)P(-do|ham)P(-you|ham)P(-need|ham)P(-viagra|ham)P(-win|ham)P(-an|ham)P(-imac|ham)]$$

=

$$0.5*(0.5*0.5*0.5*0.5*0.5*0.5*0.95*0.95*0.95*0.95*0.95*0.95*0.95)$$

=

$$0.00545576012$$

Similarly, according to Bayes' Theorem,

$$P(spam | -i , -am , -not , coming , -good , -work , -do , -you , -need , -viagra , -win , -an , -imac)$$

Bayes' Theorem $\Longrightarrow$

$$\frac{P(spam)P(-i , -am , -not , coming , -good , -work , -do , -you , -need , -viagra , -win , -an , -imac| spam)}{P( -i , -am , -not , coming , -good , -work , -do , -you , -need , -viagra , -win , -an , -imac)}$$

Total probability law $\Longrightarrow$

$$\frac{P(spam)P(-i , -am , -not , coming , -good , -work , -do , -you , -need , -viagra , -win , -an , -imac| spam)}{P(ham)P(-i , -am , -not , coming , -good , -work , -do , -you , -need , -viagra , -win , -an , -imac| ham)+P(spam)P(-i , -am , -not , coming , -good , -work , -do , -you , -need , -viagra , -win , -an , -imac| spam)}$$

Naive Bayes simplification $\Longrightarrow$

$$\frac{P(spam)[P(-i|spam)P(-am|spam)P(-not|spam)P(coming|spam)P(-good|spam)P(-work|spam)P(-do|spam)P(-you|spam)P(-need|spam)P(-viagra|spam)P(-win|spam)P(-an|spam)P(-imac|spam)]}{P(ham)[P(-i|ham)P(-am|ham)P(-not|ham)P(coming|ham)P(-good|ham)P(-work|ham)P(-do|ham)P(-you|ham)P(-need|ham)P(-viagra|ham)P(-win|ham)P(-an|ham)P(-imac|ham)]+P(spam)[P(-i|spam)P(-am|spam)P(-not|spam)P(coming|spam)P(-good|spam)P(-work|spam)P(-do|spam)P(-you|spam)P(-need|spam)P(-viagra|spam)P(-win|spam)P(-an|spam)P(-imac|spam)]}$$

=

$$\frac{0.5*(0.95*0.95*0.95*0.05*0.95*0.95*0.5*0.5*0.5*0.5*0.5*0.5*0.5)}{0.5*(0.5*0.5*0.5*0.5*0.5*0.5*0.95*0.95*0.95*0.95*0.95*0.95*0.95)+0.5*(0.95*0.95*0.95*0.05*0.95*0.95*0.5*0.5*0.5*0.5*0.5*0.5*0.5)}$$

=

$$\frac{0.00015112908}{0.00545576012+0.00015112908}=0.0269541763$$

Alternatively, we can focus only on the propensities, which are proportional to the posterior probabilities.

$$P(spam | -i , -am , -not , coming , -good , -work , -do , -you , -need , -viagra , -win , -an , -imac)$$

$\propto$

$$P(spam)[P(-i|spam)P(-am|spam)P(-not|spam)P(coming|spam)P(-good|spam)P(-work|spam)
P(-do|spam)P(-you|spam)P(-need|spam)P(-viagra|spam)P(-win|spam)P(-an|spam)P(-imac|spam)]$$

=

$$ 0.5*(0.95*0.95*0.95*0.05*0.95*0.95*0.5*0.5*0.5*0.5*0.5*0.5*0.5)$$

=

$$0.00015112908$$

Since the propensity 0.00545576012 > 0.00015112908, or the probability 0.97304582369 > 0.0269541763, we conclude that the message “coming home” is more likely to be a ham message.

2.3.2 For message “Get Viagra now”:

$$P(ham | -i , -am , -not , -coming , -good , -work , -do , -you , -need , viagra , -win , -an , -imac)$$

$\propto$

$$P(ham)[P(-i|ham)P(-am|ham)P(-not|ham)P(-coming|ham)P(-good|ham)P(-work|ham)
P(-do|ham)P(-you|ham)P(-need|ham)P(viagra|ham)P(-win|ham)P(-an|ham)P(-imac|ham)]$$

=

$$0.5*(0.5*0.5*0.5*0.5*0.5*0.5*0.95*0.95*0.95*0.05*0.95*0.95*0.95)=0.00028714526$$

$$P(spam | -i , -am , -not , -coming , -good , -work , -do , -you , -need , viagra , -win , -an , -imac) $$

$\propto$

$$P(spam)[P(-i|spam)P(-am|spam)P(-not|spam)P(-coming|spam)P(-good|spam)P(-work|spam)
P(-do|spam)P(-you|spam)P(-need|spam)P(viagra|spam)P(-win|spam)P(-an|spam)P(-imac|spam)]$$

=

$$0.5*(0.95*0.95*0.95*0.95*0.95*0.95*0.5*0.5*0.5*0.5*0.5*0.5*0.5)=0.00287145269$$

To convert these propensities into probabilities, we divide 0.00028714526 and 0.00287145269 by their sum (0.00028714526 + 0.00287145269), giving probabilities of 0.09090908831 and 0.90909091168 respectively.

Since the propensity 0.00028714526 < 0.00287145269, or the probability 0.09090908831 < 0.90909091168, we conclude that the message “Get Viagra now” is more likely to be a spam message.
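
As a minimal sketch tying the example together (reusing `likelihood`, `vocab` and `labels` from the sketch after section 2.2; all names are ours), the following scores both test messages and should reproduce the numbers above:

```python
# A sketch of scoring a new message with the clamped likelihood table,
# reusing `likelihood`, `vocab` and `labels` from the sketch in section 2.2.

def propensity(text, label, priors):
    """P(label) * product over vocab of P(word or -word | label)."""
    words = set(text.lower().split())  # de-capitalise; unseen words are ignored
    score = priors[label]
    for w in vocab:
        p = likelihood[(w, label)]
        score *= p if w in words else 1.0 - p
    return score

priors = {"ham": 0.5, "spam": 0.5}
for msg in ["Coming home ?", "Get Viagra now"]:
    scores = {lab: propensity(msg, lab, priors) for lab in labels}
    total = sum(scores.values())
    posteriors = {lab: s / total for lab, s in scores.items()}
    print(msg, "->", max(scores, key=scores.get), posteriors)
```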

Original author: 陈冠东
Original source: https://segmentfault.com/a/1190000012331865