Example Calculation of Entropy


Example Calculation:

Suppose we are working with a set of examples S = {s1, s2, s3, s4}, categorised under a binary categorisation of positives and negatives, such that s1 is positive and the rest are negative. Suppose further that we want to calculate the information gain of an attribute A, which can take the values {v1, v2, v3}. Finally, assume the following assignment of values:

Example    Value of A    Category
s1         v2            positive
s2         v2            negative
s3         v3            negative
s4         v1            negative

To work out the information gain for A relative to S, we first need to calculate the entropy of S. To use our formula for binary categorisations, we need to know the proportion of positives in S and the proportion of negatives. These are given by p+ = 1/4 and p- = 3/4, so we can calculate:

Entropy(S) = -(1/4)log2(1/4) - (3/4)log2(3/4)
           = -(1/4)(-2) - (3/4)(-0.415)
           = 0.5 + 0.311 = 0.811
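
To check this on a machine, here is a minimal Python sketch that reproduces the calculation (the function name binary_entropy is our own illustrative choice, not something from the original text):

import math

def binary_entropy(p_pos):
    # Entropy of a binary categorisation with proportion p_pos of positives.
    p_neg = 1.0 - p_pos
    total = 0.0
    for p in (p_pos, p_neg):
        if p > 0:  # take 0*log2(0) to be zero, as is standard
            total -= p * math.log2(p)
    return total

print(binary_entropy(1/4))  # prints 0.8112781244591328, i.e. Entropy(S) = 0.811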

Note that to do this calculation on your calculator, you may need to remember that log2(x) = ln(x)/ln(2), where ln(2) is the natural logarithm of 2. Next, we need to calculate the weighted Entropy(Sv) for each value v = v1, v2, v3, noting that the weighting involves multiplying by (|Sv|/|S|). Remember also that Sv is the set of examples from S which have value v for attribute A. This means that: Sv1 = {s4}, Sv2 = {s1, s2}, Sv3 = {s3}.

We now need to carry out the following calculations:

(|Sv1|/|S|) * Entropy(Sv1) = (1/4) * (-(0/1)log2(0/1) - (1/1)log2(1/1))
                           = (1/4) * (-0 - (1)log2(1)) = (1/4) * (-0 - 0) = 0

(|Sv2|/|S|) * Entropy(Sv2) = (2/4) * (-(1/2)log2(1/2) - (1/2)log2(1/2))
                           = (1/2) * (-(1/2)(-1) - (1/2)(-1)) = (1/2) * (1) = 1/2

(|Sv3|/|S|) * Entropy(Sv3) = (1/4) * (-(0/1)log2(0/1) - (1/1)log2(1/1))
                           = (1/4) * (-0 - (1)log2(1)) = (1/4) * (-0 - 0) = 0
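
These three weighted terms can also be checked with a short continuation of the binary_entropy sketch above (the positive counts and subset sizes are read off the sets Sv1, Sv2, Sv3):

# (positives, size) for each subset of S, with |S| = 4
subsets = {"v1": (0, 1), "v2": (1, 2), "v3": (0, 1)}

for v, (n_pos, size) in subsets.items():
    term = (size / 4) * binary_entropy(n_pos / size)  # weight is |Sv|/|S|
    print(v, term)  # v1 0.0, v2 0.5, v3 0.0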

Note that we have taken 0 log2(0) to be zero, which is standard. In our calculation we only required log2(1) = 0 and log2(1/2) = -1. We now add these three values together and subtract the total from our calculation of Entropy(S) to give the final result:

Gain(S,A) = 0.811 - (0 + 1/2 + 0) = 0.311 
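
Putting the pieces together, here is a self-contained Python sketch of the whole Gain(S, A) computation (information_gain and the (value, positive?) data layout are our own illustrative choices):

import math
from collections import defaultdict

def entropy(labels):
    # Entropy of a list of boolean labels (True = positive).
    total = 0.0
    for cls in (True, False):
        p = labels.count(cls) / len(labels)
        if p > 0:  # 0*log2(0) taken as zero
            total -= p * math.log2(p)
    return total

def information_gain(examples):
    # examples: list of (attribute value, positive?) pairs for one attribute A.
    labels = [pos for _, pos in examples]
    by_value = defaultdict(list)
    for value, pos in examples:
        by_value[value].append(pos)
    weighted = sum(len(sub) / len(examples) * entropy(sub)
                   for sub in by_value.values())
    return entropy(labels) - weighted

S = [("v2", True), ("v2", False), ("v3", False), ("v1", False)]
print(information_gain(S))  # prints 0.31127812445913283, i.e. Gain(S, A) = 0.311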

Next, we look at how information gain can be used in practice in an algorithm for constructing decision trees.

