Example calculation of entropy, Computer Engineering

Assignment Help:

Example Calculation:

Suppose we are working with a set of examples S = {s1, s2, s3, s4}, categorised with a binary categorisation of positives and negatives, such that s1 is positive and the rest are negative. Suppose further that we want to calculate the information gain of an attribute A, and that A can take the values {v1, v2, v3}. Finally, assume that the values of A for each example are as follows:

[Figure: Example Calculation of Entropy - A(s1) = v2, A(s2) = v2, A(s3) = v3, A(s4) = v1]

To work out the information gain for A relative to S, we first need to calculate the entropy of S. To use our formula for binary categorisations, we need to know the proportion of positives and the proportion of negatives in S. These are given as p+ = 1/4 and p- = 3/4, so we can calculate:

Entropy(S) = -(1/4)log2(1/4) - (3/4)log2(3/4) = -(1/4)(-2) - (3/4)(-0.415) = 0.5 + 0.311 = 0.811
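As a quick check, here is a minimal Python sketch of the same calculation (the function name binary_entropy is our own, purely illustrative, choice):

import math

def binary_entropy(p_pos, p_neg):
    # Entropy of a binary categorisation; 0*log2(0) is taken to be 0.
    total = 0.0
    for p in (p_pos, p_neg):
        if p > 0:
            total -= p * math.log(p, 2)   # log2(x) = ln(x)/ln(2)
    return total

print(binary_entropy(1/4, 3/4))   # prints approximately 0.811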

Note that to do this calculation on a calculator, you may need to remember that log2(x) = ln(x)/ln(2), where ln(2) is the natural log of 2. Next, we need to calculate the weighted Entropy(Sv) for each value v = v1, v2, v3, noting that the weighting involves multiplying by (|Sv|/|S|). Remember that Sv is the set of examples from S which have value v for attribute A. This means that: Sv1 = {s4}, Sv2 = {s1, s2}, Sv3 = {s3}.

We now need to carry out these calculations:

(|Sv1|/|S|) * Entropy(Sv1) = (1/4) * (-(0/1)log2(0/1) - (1/1)log2(1/1)) = (1/4)(-0 - (1)log2(1)) = (1/4)(-0 - 0) = 0

(|Sv2|/|S|) * Entropy(Sv2) = (2/4) * (-(1/2)log2(1/2) - (1/2)log2(1/2)) = (1/2) * (-(1/2)(-1) - (1/2)(-1)) = (1/2) * (1) = 1/2

(|Sv3|/|S|) * Entropy(Sv3) = (1/4) * (-(0/1)log2(0/1) - (1/1)log2(1/1)) = (1/4)(-0 - (1)log2(1)) = (1/4)(-0 - 0) = 0

Note that we have taken 0*log2(0) to be zero, which is standard. In our calculation, we only required log2(1) = 0 and log2(1/2) = -1. We now add these three values together and subtract the total from our calculation of Entropy(S) to give the final result:

Gain(S,A) = 0.811 - (0 + 1/2 + 0) = 0.311 
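To make the whole example reproducible, here is a short Python sketch (the dictionary layout for S and the helper name entropy are our own, purely illustrative, choices) that computes the weighted entropies and the gain shown above:

import math

def entropy(labels):
    # Entropy of a list of '+'/'-' labels; 0*log2(0) is taken to be 0.
    n = len(labels)
    result = 0.0
    for c in set(labels):
        p = labels.count(c) / n
        result -= p * math.log(p, 2)
    return result

# S = {s1, s2, s3, s4}: s1 is positive, the rest are negative;
# A takes the value v2 on s1 and s2, v3 on s3 and v1 on s4.
S = {'s1': ('+', 'v2'), 's2': ('-', 'v2'), 's3': ('-', 'v3'), 's4': ('-', 'v1')}

labels = [cls for cls, _ in S.values()]
gain = entropy(labels)                              # Entropy(S) = 0.811
for v in ('v1', 'v2', 'v3'):
    Sv = [cls for cls, val in S.values() if val == v]
    gain -= (len(Sv) / len(S)) * entropy(Sv)        # subtract weighted Entropy(Sv)

print(round(gain, 3))                               # prints 0.311

The loop mirrors the hand calculation: each Sv contributes its entropy weighted by |Sv|/|S|, and the weighted sum is subtracted from Entropy(S).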

Next we look at how information gain can be used in practice in an algorithm to construct decision trees.

