Example calculation of entropy, Computer Engineering


Example Calculation:

Suppose we are working with a set of examples S = {s1, s2, s3, s4} that has a binary categorisation into positives and negatives, such that s1 is positive and the rest are negative. Suppose further that we want to calculate the information gain of an attribute A, which can take the values {v1, v2, v3}. Finally, assume the following assignment of values:

[Figure: a table giving, for each example, its category and its value for A: s1 is positive with A = v2; s2 is negative with A = v2; s3 is negative with A = v3; s4 is negative with A = v1.]
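For concreteness, this example set could be encoded in Python roughly as follows (a minimal sketch; the dictionary name examples and the (category, value) tuple layout are illustrative choices, not part of the original example):

# Each example maps to (category, value of attribute A), matching the table above.
examples = {
    "s1": ("+", "v2"),
    "s2": ("-", "v2"),
    "s3": ("-", "v3"),
    "s4": ("-", "v1"),
}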

To work out the information gain for A relative to S, we first need to calculate the entropy of S. To use our formula for binary categorisations, we need to know the proportion of positives and the proportion of negatives in S. These are given by p+ = 1/4 and p- = 3/4, so we can calculate:

Entropy(S) = -(1/4)log2(1/4) - (3/4)log2(3/4) = -(1/4)(-2) - (3/4)(-0.415) = 0.5 + 0.311 = 0.811
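As a quick check, this value can be reproduced with a short Python snippet (a sketch; the function name binary_entropy is an illustrative assumption):

import math

def binary_entropy(p_pos, p_neg):
    # Entropy of a binary categorisation; 0 log2(0) is taken to be 0.
    return -sum(p * math.log2(p) for p in (p_pos, p_neg) if p > 0)

print(binary_entropy(1/4, 3/4))  # prints roughly 0.811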

Note that to do this calculation on your calculator you may need to remember that log2(x) = ln(x)/ln(2), where ln(2) is the natural log of 2. Next, we need to calculate the weighted Entropy(Sv) for each value v = v1, v2, v3, noting that the weighting involves multiplying by (|Sv|/|S|). Remember also that Sv is the set of examples from S which have value v for attribute A. This means that: Sv1 = {s4}, Sv2 = {s1, s2}, Sv3 = {s3}.
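These subsets can be computed from the encoding sketched earlier by grouping examples on their value for A (again, the names used here are illustrative assumptions):

from collections import defaultdict

examples = {
    "s1": ("+", "v2"),
    "s2": ("-", "v2"),
    "s3": ("-", "v3"),
    "s4": ("-", "v1"),
}

# Group example names by their value for attribute A.
subsets = defaultdict(list)
for name, (category, value) in examples.items():
    subsets[value].append(name)

print(dict(subsets))  # {'v2': ['s1', 's2'], 'v3': ['s3'], 'v1': ['s4']}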

We now need to carry out these calculations:

(|Sv1|/|S|) * Entropy(Sv1) = (1/4) * (-(0/1)log2(0/1) - (1/1)log2(1/1)) = (1/4)(-0 - (1)log2(1)) = (1/4)(-0 - 0) = 0

(|Sv2|/|S|) * Entropy(Sv2) = (2/4) * (-(1/2)log2(1/2) - (1/2)log2(1/2)) = (1/2) * (-(1/2)(-1) - (1/2)(-1)) = (1/2) * (1) = 1/2

(|Sv3|/|S|) * Entropy(Sv3) = (1/4) * (-(0/1)log2(0/1) - (1/1)log2(1/1)) = (1/4)(-0 - (1)log2(1)) = (1/4)(-0 - 0) = 0
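These three weighted terms can also be checked in code (a sketch under the same assumptions as before; encoding each subset as (size, number of positives) is just for brevity):

import math

def binary_entropy(p_pos, p_neg):
    # 0 log2(0) is taken to be 0, as noted below.
    return -sum(p * math.log2(p) for p in (p_pos, p_neg) if p > 0)

# Each value of A maps to (|Sv|, number of positives in Sv):
# Sv1 = {s4} has 0 positives, Sv2 = {s1, s2} has 1, Sv3 = {s3} has 0.
subsets = {"v1": (1, 0), "v2": (2, 1), "v3": (1, 0)}
size_of_s = 4

for value, (size, positives) in subsets.items():
    weighted = (size / size_of_s) * binary_entropy(positives / size, (size - positives) / size)
    print(value, weighted)  # v1 0.0, v2 0.5, v3 0.0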

Note that we have taken 0 log2(0) to be zero, which is standard. In our calculation we only needed log2(1) = 0 and log2(1/2) = -1. We now add these three values together and subtract the total from our earlier calculation of Entropy(S) to give the final result:

Gain(S,A) = 0.811 - (0 + 1/2 + 0) = 0.311 
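Putting the whole calculation together, a self-contained sketch of Gain(S, A) for this example might look as follows (the function names information_gain and binary_entropy, and the dictionary layout, are illustrative assumptions rather than a fixed interface):

import math
from collections import defaultdict

def binary_entropy(p_pos, p_neg):
    # 0 log2(0) is taken to be 0.
    return -sum(p * math.log2(p) for p in (p_pos, p_neg) if p > 0)

def information_gain(examples):
    # Gain(S, A) = Entropy(S) - sum over v of (|Sv|/|S|) * Entropy(Sv).
    n = len(examples)
    positives = sum(1 for category, _ in examples.values() if category == "+")
    entropy_s = binary_entropy(positives / n, (n - positives) / n)

    # Group the categories of the examples by their value for A.
    subsets = defaultdict(list)
    for category, value in examples.values():
        subsets[value].append(category)

    weighted_sum = 0.0
    for categories in subsets.values():
        p_pos = categories.count("+") / len(categories)
        weighted_sum += (len(categories) / n) * binary_entropy(p_pos, 1 - p_pos)

    return entropy_s - weighted_sum

examples = {
    "s1": ("+", "v2"),
    "s2": ("-", "v2"),
    "s3": ("-", "v3"),
    "s4": ("-", "v1"),
}
print(round(information_gain(examples), 3))  # prints 0.311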

Next we will look at how information gain can be used in practice in an algorithm to construct decision trees.

