Example calculation of entropy, Computer Engineering


Example Calculation:

Suppose we are working with a set of examples S = {s1, s2, s3, s4}, categorised with a binary categorisation of positives and negatives such that s1 is positive and the rest are negative. Suppose further that we want to calculate the information gain of an attribute A, and that A can take the values {v1, v2, v3}. Finally, assume the values of A for each example are as follows:

[Figure: 1745_Example Calculation of Entropy.png - the attribute values for each example: A(s1) = v2, A(s2) = v2, A(s3) = v3, A(s4) = v1.]

To work out the information gain for A relative to S, we first need to calculate the entropy of S. To use our formula for binary categorisations, we need to know the proportion of positives and the proportion of negatives in S. These are given by: p+ = 1/4 and p- = 3/4. We can then calculate:

Entropy(S) = -(1/4)log2(1/4) - (3/4)log2(3/4) = -(1/4)(-2) - (3/4)(-0.415) = 0.5 + 0.311 = 0.811
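As a quick check, this entropy value can be reproduced with a few lines of Python; the function name binary_entropy below is just an illustrative sketch, not part of the original assignment.

import math

def binary_entropy(p_pos, p_neg):
    # Entropy of a binary categorisation; 0*log2(0) is taken as 0 (see the note below).
    return sum(-p * math.log(p, 2) for p in (p_pos, p_neg) if p > 0)

print(binary_entropy(1/4, 3/4))   # prints approximately 0.811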

Note that to do this calculation on a calculator you may need to remember that log2(x) = ln(x)/ln(2), where ln is the natural logarithm; for example, log2(3/4) = ln(0.75)/ln(2) ≈ -0.415. Next, we need to calculate the weighted Entropy(Sv) for each value v = v1, v2, v3, noting that the weighting involves multiplying by (|Svi|/|S|). Remember that Sv is the set of examples from S which have value v for attribute A. This means that: Sv1 = {s4}, Sv2 = {s1, s2}, Sv3 = {s3}.

We now need to carry out these calculations:

(|Sv1|/|S|) * Entropy(Sv1) = (1/4) * (-(0/1)log2(0/1) - (1/1)log2(1/1)) = (1/4)(-0 - (1)log2(1)) = (1/4)(-0 - 0) = 0

(|Sv2|/|S|) * Entropy(Sv2) = (2/4) * (-(1/2)log2(1/2) - (1/2)log2(1/2)) = (1/2) * (-(1/2)(-1) - (1/2)(-1)) = (1/2) * (1) = 1/2

(|Sv3|/|S|) * Entropy(Sv3) = (1/4) * (-(0/1)log2(0/1) - (1/1)log2(1/1)) = (1/4)(-0 - (1)log2(1)) = (1/4)(-0 - 0) = 0
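The same arithmetic can be sketched in Python; the subset counts below are simply read off the sets Sv1, Sv2, Sv3 above, with s1 the only positive example.

import math

def binary_entropy(p_pos, p_neg):
    # Entropy of a binary categorisation; 0*log2(0) is taken as 0.
    return sum(-p * math.log(p, 2) for p in (p_pos, p_neg) if p > 0)

# Each Svi is written as (number of positives, number of negatives).
subsets = {"v1": (0, 1), "v2": (1, 1), "v3": (0, 1)}
size_S = 4

for value, (pos, neg) in subsets.items():
    size_Sv = pos + neg
    term = (size_Sv / size_S) * binary_entropy(pos / size_Sv, neg / size_Sv)
    print(value, term)   # v1 0.0, v2 0.5, v3 0.0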

Note that we have taken 0 log2(0) to be zero, which is standard. In our calculation we only required log2(1) = 0 and log2(1/2) = -1. We now add these three weighted values together and subtract the sum from our calculation of Entropy(S) to give the final result:

Gain(S,A) = 0.811 - (0 + 1/2 + 0) = 0.311 
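Putting the whole calculation together, a short information-gain function might look as follows. This is only an illustrative sketch: the function names (entropy, information_gain) and the representation of examples are chosen here, not taken from the original text.

import math

def entropy(labels):
    # Entropy of a list of boolean labels; 0*log2(0) is taken as 0.
    n = len(labels)
    result = 0.0
    for cls in (True, False):
        p = sum(1 for x in labels if x == cls) / n
        if p > 0:
            result -= p * math.log(p, 2)
    return result

def information_gain(examples, attribute):
    # Gain(S, A) = Entropy(S) - sum over v of (|Sv|/|S|) * Entropy(Sv).
    # examples: list of (attribute_values_dict, is_positive) pairs.
    labels = [label for _, label in examples]
    gain = entropy(labels)
    values = {attrs[attribute] for attrs, _ in examples}
    for v in values:
        subset = [label for attrs, label in examples if attrs[attribute] == v]
        gain -= (len(subset) / len(examples)) * entropy(subset)
    return gain

# The running example: s1 is positive; A(s1)=v2, A(s2)=v2, A(s3)=v3, A(s4)=v1.
S = [({"A": "v2"}, True), ({"A": "v2"}, False),
     ({"A": "v3"}, False), ({"A": "v1"}, False)]
print(information_gain(S, "A"))   # prints approximately 0.311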

Next, we look at how information gain can be used in practice in an algorithm to construct decision trees.

