Example calculation of entropy, Computer Engineering


Example Calculation:

Suppose we are working with a set of examples S = {s1, s2, s3, s4}, categorised by a binary categorisation of positives and negatives, such that s1 is positive and the rest are negative. Suppose further that we want to calculate the information gain of an attribute A, which can take the values {v1, v2, v3}. Finally, assume the following:

[Figure: each example's value for attribute A - s4 takes value v1, s1 and s2 take value v2, and s3 takes value v3.]

To work out the information gain for A relative to S, we first need to calculate the entropy of S. To use our formula for binary categorisations, we need to know the proportion of positives and the proportion of negatives in S. These are: p+ = 1/4 and p- = 3/4. We can then calculate:

Entropy(S) = -(1/4)log2(1/4) - (3/4)log2(3/4) = -(1/4)(-2) - (3/4)(-0.415) = 0.5 + 0.311 = 0.811
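As a quick check, the same value can be computed programmatically. Below is a minimal Python sketch (the function name binary_entropy is purely illustrative, not part of the original text) that applies the binary entropy formula, taking 0*log2(0) to be zero:

    import math

    def binary_entropy(p_pos, p_neg):
        # Entropy of a binary categorisation; 0*log2(0) is taken to be 0.
        return sum(-p * math.log2(p) for p in (p_pos, p_neg) if p > 0)

    print(binary_entropy(1/4, 3/4))   # prints ~0.8113, matching the 0.811 above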

Note that, to do this calculation on a calculator, you may need to remember that log2(x) = ln(x)/ln(2), where ln(2) is the natural logarithm of 2. Next, we need to calculate the weighted Entropy(Sv) for each value v = v1, v2, v3, noting that the weighting involves multiplying by (|Sv|/|S|). Remember also that Sv is the set of examples from S which take value v for attribute A. This means that: Sv1 = {s4}, Sv2 = {s1, s2}, Sv3 = {s3}.

We now need to carry out these calculations:

(|Sv1|/|S|) * Entropy(Sv1) = (1/4) * (-(0/1)log2(0/1) - (1/1)log2(1/1)) = (1/4)(-0 - (1)log2(1)) = (1/4)(-0 - 0) = 0

(|Sv2|/|S|) * Entropy(Sv2) = (2/4) * (-(1/2)log2(1/2) - (1/2)log2(1/2)) = (1/2) * (-(1/2)*(-1) - (1/2)*(-1)) = (1/2) * (1) = 1/2

(|Sv3|/|S|) * Entropy(Sv3) = (1/4) * (-(0/1)log2(0/1) - (1/1)log2(1/1)) = (1/4)(-0 - (1)log2(1)) = (1/4)(-0 - 0) = 0

Note that we have taken 0*log2(0) to be zero, which is standard. In our calculation, we only required log2(1) = 0 and log2(1/2) = -1. We now add these three weighted values together and subtract the sum from our earlier value of Entropy(S) to give the final result:

Gain(S,A) = 0.811 - (0 + 1/2 + 0) = 0.311 
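Putting the whole calculation together, here is a minimal Python sketch of Gain(S, A) for this example. The dictionary encoding of the examples and the helper names entropy and gain are assumptions made for illustration; the value-per-example mapping is the one given above (s4 takes v1, s1 and s2 take v2, s3 takes v3):

    import math

    # Each example maps to (its value for attribute A, whether it is positive).
    S = {'s1': ('v2', True), 's2': ('v2', False), 's3': ('v3', False), 's4': ('v1', False)}

    def entropy(examples):
        # Binary entropy of a list of (value, is_positive) pairs; 0*log2(0) is taken as 0.
        p_pos = sum(1 for _, positive in examples if positive) / len(examples)
        return sum(-p * math.log2(p) for p in (p_pos, 1 - p_pos) if p > 0)

    def gain(examples, values):
        # Gain(S, A) = Entropy(S) - sum over v of (|Sv|/|S|) * Entropy(Sv).
        result = entropy(list(examples.values()))
        for v in values:
            subset = [e for e in examples.values() if e[0] == v]
            result -= (len(subset) / len(examples)) * entropy(subset)
        return result

    print(gain(S, ['v1', 'v2', 'v3']))   # prints ~0.3113, matching the 0.311 above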

Next, we look at how information gain can be used in practice in an algorithm to construct decision trees.

