Example calculation of entropy, Computer Engineering


Example Calculation:

Suppose we are working with a set of examples S = {s1, s2, s3, s4}, categorised with a binary categorisation of positives and negatives such that s1 is positive and the rest are negative. Suppose further that we want to calculate the information gain of an attribute A, and that A can take the values {v1, v2, v3}. Finally, assume that the examples take the following values for A:

[Figure: the value of attribute A for each example - s1 and s2 take value v2, s3 takes value v3, and s4 takes value v1.]

To work out the information gain for A relative to S, we first need to calculate the entropy of S. To use our formula for binary categorisations, we need to know the proportion of positives in S and the proportion of negatives. These are given by p+ = 1/4 and p- = 3/4, so we can calculate:

Entropy(S) = -(1/4)log2(1/4) - (3/4)log2(3/4) = -(1/4)(-2) - (3/4)(-0.415) = 0.5 + 0.311 = 0.811
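As a quick check of this arithmetic, here is a minimal Python sketch (the function name binary_entropy is my own choice, not from the original text) that reproduces the value:

import math

def binary_entropy(p_pos, p_neg):
    """Entropy of a binary categorisation, taking 0*log2(0) to be 0."""
    total = 0.0
    for p in (p_pos, p_neg):
        if p > 0:
            total -= p * math.log2(p)
    return total

print(binary_entropy(1/4, 3/4))  # approximately 0.811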

Note that to do this calculation on your calculator, you may need to remember that log2(x) = ln(x)/ln(2), where ln(2) is the natural logarithm of 2. Next, we need to calculate the weighted entropy Entropy(Sv) for each value v = v1, v2, v3, where the weighting involves multiplying by (|Sv|/|S|). Remember also that Sv is the set of examples from S which have value v for attribute A. This means that: Sv1 = {s4}, Sv2 = {s1, s2}, Sv3 = {s3}.
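As a small illustrative sketch in Python (the dictionary encoding of the attribute values is my own assumption, taken from the figure above), both the calculator tip and the partition of S into the sets Sv can be checked like this:

import math

# log2(x) can be computed from natural logs: log2(x) = ln(x) / ln(2)
print(math.log(0.25) / math.log(2))   # -2.0, the same as math.log2(0.25)

# Partition S by the value each example takes for attribute A
values = {"s1": "v2", "s2": "v2", "s3": "v3", "s4": "v1"}
partitions = {}
for example, v in values.items():
    partitions.setdefault(v, []).append(example)
print(partitions)  # {'v2': ['s1', 's2'], 'v3': ['s3'], 'v1': ['s4']}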

We now need to carry out these calculations:

(|Sv1|/|S|) * Entropy(Sv1) = (1/4) * (-(0/1)log2(0/1) - (1/1)log2(1/1)) = (1/4)(-0 - (1)log2(1)) = (1/4)(-0 - 0) = 0

(|Sv2|/|S|) * Entropy(Sv2) = (2/4) * (-(1/2)log2(1/2) - (1/2)log2(1/2)) = (1/2) * (-(1/2)(-1) - (1/2)(-1)) = (1/2) * (1) = 1/2

(|Sv3|/|S|) * Entropy(Sv3) = (1/4) * (-(0/1)log2(0/1) - (1/1)log2(1/1)) = (1/4)(-0 - (1)log2(1)) = (1/4)(-0 - 0) = 0

Note that we have taken 0 log2(0) to be zero, which is standard. In our calculation, we only required log2(1) = 0 and log2(1/2) = -1. We now add these three values together and subtract their sum from our earlier calculation of Entropy(S) to give the final result:

Gain(S,A) = 0.811 - (0 + 1/2 + 0) = 0.311 
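To tie the whole calculation together, here is a hedged Python sketch of Gain(S, A) for this example; the data layout (dictionaries mapping example names to their labels and attribute values) and the function names are my own assumptions rather than anything prescribed by the original text:

import math

def entropy(labels):
    """Entropy of a list of category labels, with 0*log2(0) treated as 0."""
    n = len(labels)
    result = 0.0
    for label in set(labels):
        p = labels.count(label) / n
        if p > 0:
            result -= p * math.log2(p)
    return result

def information_gain(examples, labels, values):
    """Gain(S, A) = Entropy(S) - sum over v of (|Sv|/|S|) * Entropy(Sv)."""
    n = len(examples)
    gain = entropy([labels[s] for s in examples])
    for v in set(values[s] for s in examples):
        subset = [s for s in examples if values[s] == v]
        gain -= (len(subset) / n) * entropy([labels[s] for s in subset])
    return gain

# The example from the text: s1 is positive, the rest are negative;
# s4 takes value v1, s1 and s2 take value v2, s3 takes value v3.
S = ["s1", "s2", "s3", "s4"]
labels = {"s1": "+", "s2": "-", "s3": "-", "s4": "-"}
values = {"s1": "v2", "s2": "v2", "s3": "v3", "s4": "v1"}

print(information_gain(S, labels, values))  # approximately 0.311

Running this sketch computes the three weighted entropies (0, 1/2, 0) internally and prints roughly 0.311, matching the hand calculation above.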

Next, we look at how information gain can be used in practice in an algorithm to construct decision trees.

