Example calculation of entropy, Computer Engineering


Example Calculation:

Suppose we are working with a set of examples S = {s1, s2, s3, s4} under a binary categorisation into positives and negatives, such that s1 is positive and the rest are negative. Suppose further that we want to calculate the information gain of an attribute A, which can take the values {v1, v2, v3}. Finally, assume the following assignment of attribute values:

[Figure: Example Calculation of Entropy. The attribute values are A(s1) = v2, A(s2) = v2, A(s3) = v3, A(s4) = v1.]

To work out the information gain for A relative to S, we first need to calculate the entropy of S. To use our formula for binary categorisations, we need the proportion of positives in S and the proportion of negatives. These are given by p+ = 1/4 and p- = 3/4, so we can calculate:

Entropy(S) = -(1/4)log2(1/4) - (3/4)log2(3/4) = -(1/4)(-2) - (3/4)(-0.415) = 0.5 + 0.311 = 0.811
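As a quick cross-check, the following is a minimal Python sketch (illustrative only, not part of the original assignment; the function name binary_entropy is our own choice) that reproduces this entropy calculation:

import math

def binary_entropy(p_pos, p_neg):
    # Entropy of a binary categorisation; 0*log2(0) is taken to be 0.
    # Note that math.log2(p) is the same as math.log(p) / math.log(2).
    return sum(-p * math.log2(p) for p in (p_pos, p_neg) if p > 0)

# Entropy(S) with p+ = 1/4 and p- = 3/4
print(binary_entropy(1/4, 3/4))  # prints approximately 0.811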

Note that to do this calculation on a calculator you may need to remember that log2(x) = ln(x)/ln(2), where ln(2) is the natural logarithm of 2. Next, we need to calculate the weighted Entropy(Sv) for each value v = v1, v2, v3, noting that the weighting involves multiplying by (|Svi|/|S|). Remember that Sv is the set of examples from S which have value v for attribute A. This means that Sv1 = {s4}, Sv2 = {s1, s2}, Sv3 = {s3}.

We now need to carry out these calculations:

(|Sv1|/|S|) * Entropy(Sv1) = (1/4) * (-(0/1)log2(0/1) - (1/1)log2(1/1)) = (1/4)(-0 - (1)log2(1)) = (1/4)(-0 - 0) = 0

(|Sv2|/|S|) * Entropy(Sv2) = (2/4) * (-(1/2)log2(1/2) - (1/2)log2(1/2)) = (1/2) * (-(1/2)(-1) - (1/2)(-1)) = (1/2) * (1) = 1/2

(|Sv3|/|S|) * Entropy(Sv3) = (1/4) * (-(0/1)log2(0/1) - (1/1)log2(1/1)) = (1/4)(-0 - (1)log2(1)) = (1/4)(-0 - 0) = 0
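These three weighted terms can be confirmed with a short sketch under the same assumptions (the dictionaries labels and attr_A below are illustrative names encoding the example data given above):

import math

def binary_entropy(p_pos, p_neg):
    # 0*log2(0) is taken to be 0, as in the text
    return sum(-p * math.log2(p) for p in (p_pos, p_neg) if p > 0)

labels = {'s1': '+', 's2': '-', 's3': '-', 's4': '-'}      # s1 positive, the rest negative
attr_A = {'s1': 'v2', 's2': 'v2', 's3': 'v3', 's4': 'v1'}  # value of A for each example

for v in ('v1', 'v2', 'v3'):
    Sv = [s for s in labels if attr_A[s] == v]
    p_pos = sum(labels[s] == '+' for s in Sv) / len(Sv)
    weighted = (len(Sv) / len(labels)) * binary_entropy(p_pos, 1 - p_pos)
    print(v, weighted)  # v1 -> 0.0, v2 -> 0.5, v3 -> 0.0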

Note that we have taken 0 log2(0) to be zero, which is standard. In our calculation, we only required log2(1) = 0 and log2(1/2) = -1. We now add these three values together and subtract the total from our earlier result for Entropy(S) to give the final answer:

Gain(S,A) = 0.811 - (0 + 1/2 + 0) = 0.311 
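Putting the pieces together, a sketch of the whole Gain(S, A) computation on this example (again using illustrative dictionaries for the example data) might look like this:

import math

def binary_entropy(p_pos, p_neg):
    return sum(-p * math.log2(p) for p in (p_pos, p_neg) if p > 0)

def information_gain(labels, attr_values):
    # Gain(S, A) = Entropy(S) - sum over v of (|Sv|/|S|) * Entropy(Sv)
    n = len(labels)
    p_pos = sum(lab == '+' for lab in labels.values()) / n
    gain = binary_entropy(p_pos, 1 - p_pos)
    for v in set(attr_values.values()):
        Sv = [s for s in labels if attr_values[s] == v]
        pv = sum(labels[s] == '+' for s in Sv) / len(Sv)
        gain -= (len(Sv) / n) * binary_entropy(pv, 1 - pv)
    return gain

labels = {'s1': '+', 's2': '-', 's3': '-', 's4': '-'}
attr_A = {'s1': 'v2', 's2': 'v2', 's3': 'v3', 's4': 'v1'}
print(information_gain(labels, attr_A))  # prints approximately 0.311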

Next we look at how information gain can be used in practice in an algorithm to construct decision trees.

