Example calculation of entropy, Computer Engineering


Example Calculation:

As an example, suppose we are working with a set of examples S = {s1, s2, s3, s4} with a binary categorisation into positives and negatives, such that s1 is positive and the rest are negative. Suppose further that we want to calculate the information gain of an attribute A, which can take the values {v1, v2, v3}. Finally, assume that:

[Figure: Example Calculation of Entropy - the value of A for each example: A(s1) = v2, A(s2) = v2, A(s3) = v3, A(s4) = v1]

To work out the information gain for A relative to S, we first have to calculate the entropy of S. To use our formula for binary categorisations, we need to know the proportion of positives and the proportion of negatives in S. These are given as: p+ = 1/4 and p- = 3/4. We can then calculate:

Entropy(S) = -(1/4)log2(1/4) - (3/4)log2(3/4) = -(1/4)(-2) - (3/4)(-0.415) = 0.5 + 0.311 = 0.811

Note that to do this calculation on your calculator, you may need to remember that log2(x) = ln(x)/ln(2), where ln(2) is the natural log of 2. Next, we need to calculate the weighted Entropy(Sv) for each value v = v1, v2, v3, noting that the weighting involves multiplying by (|Svi|/|S|). Remember also that Sv is the set of examples from S which have value v for attribute A. This means that: Sv1 = {s4}, Sv2 = {s1, s2}, Sv3 = {s3}.
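As a quick check of this step, here is a minimal Python sketch (the function name binary_entropy and the printed value are ours, added for illustration) that computes the entropy of a binary categorisation using the log2(x) = ln(x)/ln(2) identity:

import math

def binary_entropy(p_pos, p_neg):
    # Entropy of a binary categorisation, taking 0*log2(0) to be 0 (as noted below).
    total = 0.0
    for p in (p_pos, p_neg):
        if p > 0:
            total -= p * (math.log(p) / math.log(2))  # log2(p) = ln(p)/ln(2)
    return total

# S has one positive (s1) and three negatives (s2, s3, s4).
print(binary_entropy(1/4, 3/4))  # prints roughly 0.811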

We now need to carry out these calculations:

(|Sv1|/|S|) * Entropy(Sv1) = (1/4) * (-(0/1)log2(0/1) - (1/1)log2(1/1)) = (1/4)(-0 - (1)log2(1)) = (1/4)(-0 - 0) = 0

(|Sv2|/|S|) * Entropy(Sv2) = (2/4) * (-(1/2)log2(1/2) - (1/2)log2(1/2)) = (1/2) * (-(1/2)*(-1) - (1/2)*(-1)) = (1/2) * (1) = 1/2

(|Sv3|/|S|) * Entropy(Sv3) = (1/4) * (-(0/1)log2(0/1) - (1/1)log2(1/1)) = (1/4)(-0 - (1)log2(1)) = (1/4)(-0 - 0) = 0

Note that we have taken 0 log2(0) to be zero, which is standard. In our calculation, we only required log2(1) = 0 and log2(1/2) = -1. We now add these three values together and subtract the total from our calculation for Entropy(S) to give the final result:

Gain(S,A) = 0.811 - (0 + 1/2 + 0) = 0.311 
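To tie the whole calculation together, the following Python sketch recomputes Gain(S, A) for this example end to end. The list of (label, value) pairs and the helper function entropy are assumptions made for illustration; they simply encode the example set described above.

import math

def entropy(labels):
    # Entropy of a list of +/- labels, with 0*log2(0) taken to be 0.
    n = len(labels)
    result = 0.0
    for cls in set(labels):
        p = labels.count(cls) / n
        result -= p * math.log(p, 2)
    return result

# Example set S: s1 is positive, the rest negative; A(s1) = v2, A(s2) = v2, A(s3) = v3, A(s4) = v1.
examples = [("+", "v2"), ("-", "v2"), ("-", "v3"), ("-", "v1")]

labels = [label for label, _ in examples]
gain = entropy(labels)  # Entropy(S) = 0.811
for v in ("v1", "v2", "v3"):
    subset = [label for label, value in examples if value == v]
    gain -= (len(subset) / len(examples)) * entropy(subset)  # weighted Entropy(Sv)

print(round(gain, 3))  # prints 0.311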

Next, we will look at how information gain can be used in practice in an algorithm for constructing decision trees.

