Example calculation of entropy, Computer Engineering


Example Calculation:

Suppose we are working with a set of examples S = {s1, s2, s3, s4} under a binary categorisation into positives and negatives, such that s1 is positive and the rest are negative. Suppose further that we want to calculate the information gain of an attribute A, which can take the values {v1, v2, v3}. Finally, assume that:

[Figure: Example Calculation of Entropy - the value of attribute A for each example: s1 = v2, s2 = v2, s3 = v3, s4 = v1]
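To make this set-up concrete, here is a minimal Python sketch of the data. The variable names (examples, labels, attr_A) are our own illustrative choices, not part of the worked example; the value assignments follow the figure and the partitions Sv1 = {s4}, Sv2 = {s1, s2}, Sv3 = {s3} used later.

# Minimal representation of the worked example (illustrative names).
examples = ["s1", "s2", "s3", "s4"]

# Binary categorisation: s1 is positive, the rest are negative.
labels = {"s1": "+", "s2": "-", "s3": "-", "s4": "-"}

# Value of attribute A for each example.
attr_A = {"s1": "v2", "s2": "v2", "s3": "v3", "s4": "v1"}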

To work out the information gain for A relative to S, we first need to calculate the entropy of S. To use our formula for binary categorisations, we need to know the proportion of positives in S and the proportion of negatives. These are given by: p+ = 1/4 and p- = 3/4. We can then calculate:

Entropy(S) = -(1/4)log2(1/4) - (3/4)log2(3/4) = -(1/4)(-2) - (3/4)(-0.415) = 0.5 + 0.311 = 0.811
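As a check on this arithmetic, the following Python sketch computes Entropy(S) directly from the two proportions. The function name binary_entropy is our own choice for illustration, not something from the original text.

import math

def binary_entropy(p_pos, p_neg):
    """Entropy of a binary categorisation, with 0*log2(0) taken as 0."""
    total = 0.0
    for p in (p_pos, p_neg):
        if p > 0:
            total -= p * math.log2(p)
    return total

print(binary_entropy(1/4, 3/4))  # prints approximately 0.811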

Note that to do this calculation on your calculator you may need to remember that log2(x) = ln(x)/ln(2), where ln(2) is the natural log of 2. Next, we need to calculate the weighted entropy Entropy(Sv) for each value v = v1, v2, v3, noting that the weighting involves multiplying by (|Sv|/|S|). Remember that Sv is the set of examples from S which have value v for attribute A. This means that: Sv1 = {s4}, Sv2 = {s1, s2}, Sv3 = {s3}.
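As a quick check of the log conversion mentioned above, the following lines (assuming Python's standard math module) show that dividing a natural log by ln(2) agrees with the built-in base-2 logarithm:

import math

x = 3/4
print(math.log(x) / math.log(2))  # log2(x) = ln(x)/ln(2), about -0.415
print(math.log2(x))               # same value via the built-in base-2 log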

We now need to carry out these calculations:

(|Sv1|/|S|) * Entropy(Sv1) = (1/4) * (-(0/1)log2(0/1) - (1/1)log2(1/1)) = (1/4)(-0 - (1)log2(1)) = (1/4)(-0 - 0) = 0

(|Sv2|/|S|) * Entropy(Sv2) = (2/4) * (-(1/2)log2(1/2) - (1/2)log2(1/2)) = (1/2) * (-(1/2)(-1) - (1/2)(-1)) = (1/2) * (1) = 1/2

(|Sv3|/|S|) * Entropy(Sv3) = (1/4) * (-(0/1)log2(0/1) - (1/1)log2(1/1)) = (1/4)(-0 - (1)log2(1)) = (1/4)(-0 - 0) = 0
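These weighted terms can likewise be checked in Python. This sketch reuses the illustrative examples, labels and attr_A dictionaries and the binary_entropy function introduced above; the helper name weighted_entropy is again our own.

def weighted_entropy(value):
    """(|Sv|/|S|) * Entropy(Sv) for one value v of attribute A."""
    subset = [s for s in examples if attr_A[s] == value]
    pos = sum(1 for s in subset if labels[s] == "+")
    p_pos = pos / len(subset)
    return (len(subset) / len(examples)) * binary_entropy(p_pos, 1 - p_pos)

for v in ("v1", "v2", "v3"):
    print(v, weighted_entropy(v))  # prints 0, 0.5 and 0 respectively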

Note that we have taken 0 log2(0) to be zero, which is standard. In our calculation, we only required log2(1) = 0 and log2(1/2) = -1. We now add these three values together and subtract the sum from our earlier result for Entropy(S) to give the final answer:

Gain(S,A) = 0.811 - (0 + 1/2 + 0) = 0.311 
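Putting the pieces together, here is a short sketch of that final subtraction, again reusing the illustrative helpers defined above:

entropy_S = binary_entropy(1/4, 3/4)                                    # 0.811
gain = entropy_S - sum(weighted_entropy(v) for v in ("v1", "v2", "v3"))
print(gain)                                                             # approximately 0.311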

Next, we look at how information gain can be used in practice in an algorithm for constructing decision trees.

