Example calculation of entropy, Computer Engineering


Example Calculation:

Suppose we are working with a set of examples S = {s1, s2, s3, s4}, categorised with a binary categorisation of positives and negatives, such that s1 is positive and the rest are negative. Suppose further that we want to calculate the information gain of an attribute A, where A can take the values {v1, v2, v3}. Finally, assume that the examples take the following values for A:

s1: A = v2 (positive)
s2: A = v2 (negative)
s3: A = v3 (negative)
s4: A = v1 (negative)

To work out the information gain for A relative to S, we first need to calculate the entropy of S. To use our formula for binary categorisations, we need to know the proportion of positives and the proportion of negatives in S. These are p+ = 1/4 and p- = 3/4, so we can calculate:

Entropy(S) = -(1/4)log2(1/4) - (3/4)log2(3/4) = -(1/4)(-2) - (3/4)(-0.415) = 0.5 + 0.311 = 0.811
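As a quick sanity check, here is a minimal Python sketch of the same binary-entropy formula (the function name binary_entropy and the use of Python's math.log2 are our own choices for illustration, not part of the original exercise):

from math import log2

def binary_entropy(p_pos, p_neg):
    # Entropy of a binary categorisation; 0*log2(0) is taken to be 0.
    total = 0.0
    for p in (p_pos, p_neg):
        if p > 0:
            total -= p * log2(p)
    return total

print(binary_entropy(1/4, 3/4))   # prints 0.8112781..., i.e. roughly 0.811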

Note that to do this calculation on a calculator you may need to remember that log2(x) = ln(x)/ln(2), where ln(2) is the natural log of 2. Next, we need to calculate the weighted Entropy(Sv) for each value v = v1, v2, v3, noting that the weighting involves multiplying by (|Sv|/|S|). Remember also that Sv is the set of examples from S which have value v for attribute A. This means that: Sv1 = {s4}, Sv2 = {s1, s2}, Sv3 = {s3}, as the sketch below illustrates.
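Here is a small Python sketch showing how the subsets Sv arise; the dictionaries A_value and category are our own assumed representation of the example data, not something specified in the original:

from collections import defaultdict

A_value  = {'s1': 'v2', 's2': 'v2', 's3': 'v3', 's4': 'v1'}    # value of A for each example
category = {'s1': True, 's2': False, 's3': False, 's4': False}  # True = positive

subsets = defaultdict(list)
for example, value in A_value.items():
    subsets[value].append(example)

print(dict(subsets))   # {'v2': ['s1', 's2'], 'v3': ['s3'], 'v1': ['s4']}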

We now need to carry out these calculations:

(|Sv1|/|S|) * Entropy(Sv1) = (1/4) * (-(0/1)log2(0/1) - (1/1)log2(1/1)) = (1/4)(-0 - (1)log2(1)) = (1/4)(-0 - 0) = 0

(|Sv2|/|S|) * Entropy(Sv2) = (2/4) * (-(1/2)log2(1/2) - (1/2)log2(1/2)) = (1/2) * (-(1/2)*(-1) - (1/2)*(-1)) = (1/2) * (1) = 1/2

(|Sv3|/|S|) * Entropy(Sv3) = (1/4) * (-(0/1)log2(0/1) - (1/1)log2(1/1)) = (1/4)(-0 - (1)log2(1)) = (1/4)(-0 - 0) = 0

Note that we have taken 0*log2(0) to be zero, which is standard. In our calculation, we only required log2(1) = 0 and log2(1/2) = -1. We now add these three weighted values together and subtract the total from our earlier calculation of Entropy(S) to give the final result:

Gain(S,A) = 0.811 - (0 + 1/2 + 0) = 0.311 
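Putting the steps together, the following Python sketch reproduces the whole calculation; again, the data layout (A_value, category) and the helper function entropy are our own assumptions, used purely for illustration:

from math import log2

def entropy(labels):
    # Binary entropy of a list of True/False labels, with 0*log2(0) = 0.
    p_pos = sum(labels) / len(labels)
    result = 0.0
    for p in (p_pos, 1 - p_pos):
        if p > 0:
            result -= p * log2(p)
    return result

A_value  = {'s1': 'v2', 's2': 'v2', 's3': 'v3', 's4': 'v1'}
category = {'s1': True, 's2': False, 's3': False, 's4': False}

S = list(category)
gain = entropy([category[s] for s in S])                  # Entropy(S), roughly 0.811
for v in ('v1', 'v2', 'v3'):
    Sv = [s for s in S if A_value[s] == v]
    gain -= (len(Sv) / len(S)) * entropy([category[s] for s in Sv])

print(round(gain, 3))                                     # 0.311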

Next we look at how information gain can be used in practice in an algorithm to construct decision trees.

