Example calculation of entropy, Computer Engineering

Assignment Help:

Example Calculation:

Suppose we are working with a set of examples S = {s1, s2, s3, s4}, categorised with a binary categorisation of positives and negatives, such that s1 is positive and the rest are negative. Suppose further that we want to calculate the information gain of an attribute A, which can take the values {v1, v2, v3}. Finally, assume that the values of A for the examples are as follows:

[Figure: Example Calculation of Entropy - A(s1) = v2, A(s2) = v2, A(s3) = v3, A(s4) = v1]

To work out the information gain for A relative to S, we first need to calculate the entropy of S. To use our formula for binary categorisations, we need to know the proportion of positives and the proportion of negatives in S. These are given by p+ = 1/4 and p- = 3/4, so we can calculate:

Entropy(S) = -(1/4)log2(1/4) - (3/4)log2(3/4) = -(1/4)(-2) - (3/4)(-0.415) = 0.5 + 0.311 = 0.811
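
If you prefer to check this figure with a few lines of code rather than a calculator, the following minimal Python sketch reproduces it (the helper name binary_entropy is ours, not part of the original text):

# Minimal sketch: binary entropy of S with p+ = 1/4 and p- = 3/4.
import math

def binary_entropy(p_pos, p_neg):
    """Entropy of a binary categorisation, taking 0*log2(0) = 0 as standard."""
    total = 0.0
    for p in (p_pos, p_neg):
        if p > 0:
            total -= p * math.log2(p)   # log2(x) = ln(x)/ln(2)
    return total

print(binary_entropy(1/4, 3/4))   # prints roughly 0.811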

Note that, to do this calculation on your calculator, you may need to remember that log2(x) = ln(x)/ln(2), where ln(2) is the natural log of 2. Next, we need to calculate the weighted Entropy(Sv) for each value v = v1, v2, v3, noting that the weighting involves multiplying by (|Sv|/|S|). Remember that Sv is the set of examples from S which have value v for attribute A. This means that: Sv1 = {s4}, Sv2 = {s1, s2}, Sv3 = {s3}.

We now need to carry out these calculations:

(|Sv1|/|S|) * Entropy(Sv1) = (1/4) * (-(0/1)log2(0/1) - (1/1)log2(1/1)) = (1/4)(-0 - (1)log2(1)) = (1/4)(-0 - 0) = 0

(|Sv2|/|S|) * Entropy(Sv2) = (2/4) * (-(1/2)log2(1/2) - (1/2)log2(1/2)) = (1/2) * (-(1/2)*(-1) - (1/2)*(-1)) = (1/2) * (1) = 1/2

(|Sv3|/|S|) * Entropy(Sv3) = (1/4) * (-(0/1)log2(0/1) - (1/1)log2(1/1)) = (1/4)(-0 - (1)log2(1)) = (1/4)(-0 - 0) = 0

Note that we have taken 0 log2(0) to be zero, which is standard. In our calculation, we only required log2(1) = 0 and log2(1/2) = -1. We now add these three values together and subtract the sum from our calculation of Entropy(S) to give us the final result:

Gain(S,A) = 0.811 - (0 + 1/2 + 0) = 0.311 
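
To tie the whole example together, here is a minimal Python sketch of the same computation (the data layout and the helper names entropy and information_gain are ours, chosen only to reproduce the figures above):

# Minimal sketch: Gain(S, A) for the example above. Each example is stored as a
# (label, attribute value) pair, matching Sv1 = {s4}, Sv2 = {s1, s2}, Sv3 = {s3}.
import math
from collections import defaultdict

def entropy(labels):
    """Entropy of a list of +/- labels, with 0*log2(0) taken as 0."""
    n = len(labels)
    result = 0.0
    for c in set(labels):
        p = labels.count(c) / n
        result -= p * math.log2(p)
    return result

def information_gain(examples):
    """examples: list of (label, attribute value) pairs."""
    labels = [label for label, _ in examples]
    by_value = defaultdict(list)
    for label, value in examples:
        by_value[value].append(label)
    weighted = sum(len(subset) / len(examples) * entropy(subset)
                   for subset in by_value.values())
    return entropy(labels) - weighted

S = [('+', 'v2'),   # s1
     ('-', 'v2'),   # s2
     ('-', 'v3'),   # s3
     ('-', 'v1')]   # s4
print(information_gain(S))   # prints roughly 0.311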

Next, we look at how information gain can be used in practice in an algorithm to construct decision trees.

