Example Calculation of Entropy

Example Calculation:

As an example, suppose we are working with a set of examples S = {s1, s2, s3, s4}, categorised with a binary categorisation into positives and negatives, such that s1 is positive and the rest are negative. Suppose further that we want to calculate the information gain of an attribute A, and that A can take the values {v1, v2, v3}. Finally, assume the following assignment of values to the examples:

[Figure: Example Calculation of Entropy - the categories and attribute-A values assigned to s1, s2, s3 and s4]

To work out the information gain for A relative to S, we first have to calculate the entropy of S. To use our formula for binary categorisations, we need to know the proportion of positives and the proportion of negatives in S. These are p+ = 1/4 and p- = 3/4. We can then calculate:

Entropy(S) = -(1/4)log2(1/4) - (3/4)log2(3/4)
           = -(1/4)(-2) - (3/4)(-0.415)
           = 0.5 + 0.311
           = 0.811
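As a quick sanity check on this result, here is a minimal Python sketch (not from the original text; the helper name binary_entropy is illustrative) that evaluates the binary entropy formula for these proportions:

import math

def binary_entropy(p_pos, p_neg):
    """Entropy of a binary categorisation with proportions p_pos and p_neg."""
    total = 0.0
    for p in (p_pos, p_neg):
        if p > 0:                      # convention: 0 * log2(0) = 0
            total -= p * math.log2(p)
    return total

print(binary_entropy(1/4, 3/4))        # approximately 0.811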

Note that, to do this calculation on your calculator, you may need to remember that log2(x) = ln(x)/ln(2), where ln(2) is the natural log of 2. Next, we need to calculate the weighted entropy Entropy(Sv) for each value v = v1, v2, v3, noting that the weighting involves multiplying by (|Svi|/|S|). Remember also that Sv is the set of examples from S which have value v for attribute A. This means that: Sv1 = {s4}, Sv2 = {s1, s2}, Sv3 = {s3}.
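If your calculator or programming environment only provides a natural-log function, the same identity can be used directly. A minimal sketch follows (the function name log2 is just illustrative; Python's math module in fact provides math.log2 already):

import math

# log base 2 computed via natural logs: log2(x) = ln(x) / ln(2)
def log2(x):
    return math.log(x) / math.log(2)

print(log2(3/4))   # approximately -0.415, as used above
print(log2(1/2))   # -1.0
print(log2(1))     # 0.0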

We now need to carry out the following calculations:

(|Sv1|/|S|) * Entropy(Sv1) = (1/4) * (-(0/1)log2(0/1) - (1/1)log2(1/1))
                           = (1/4) * (-0 - (1)log2(1)) = (1/4) * (-0 - 0) = 0

(|Sv2|/|S|) * Entropy(Sv2) = (2/4) * (-(1/2)log2(1/2) - (1/2)log2(1/2))
                           = (1/2) * (-(1/2)*(-1) - (1/2)*(-1)) = (1/2) * (1) = 1/2

(|Sv3|/|S|) * Entropy(Sv3) = (1/4) * (-(0/1)log2(0/1) - (1/1)log2(1/1))
                           = (1/4) * (-0 - (1)log2(1)) = (1/4) * (-0 - 0) = 0

Note that we have taken 0 * log2(0) to be zero, which is the standard convention. In our calculation, we only required log2(1) = 0 and log2(1/2) = -1. We now add these three values together and subtract the total from our earlier result for Entropy(S) to give the final answer:

Gain(S,A) = 0.811 - (0 + 1/2 + 0) = 0.311 
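To tie the whole example together, here is a short Python sketch of the complete Gain(S, A) computation. The dictionaries category and attribute_A encode the assumed example data above, and entropy and gain are illustrative helper names rather than functions defined in the text:

import math

def entropy(examples, category):
    """Entropy of a binary categorisation over the given set of examples."""
    if not examples:
        return 0.0
    p_pos = sum(1 for e in examples if category[e] == "+") / len(examples)
    total = 0.0
    for p in (p_pos, 1 - p_pos):
        if p > 0:                      # convention: 0 * log2(0) = 0
            total -= p * math.log2(p)
    return total

def gain(S, category, attribute):
    """Information gain of splitting S on the given attribute assignment."""
    subsets = {}
    for e in S:
        subsets.setdefault(attribute[e], set()).add(e)
    weighted = sum(len(Sv) / len(S) * entropy(Sv, category)
                   for Sv in subsets.values())
    return entropy(S, category) - weighted

category    = {"s1": "+", "s2": "-", "s3": "-", "s4": "-"}
attribute_A = {"s1": "v2", "s2": "v2", "s3": "v3", "s4": "v1"}
S = set(category)

print(gain(S, category, attribute_A))  # approximately 0.311, as above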

Next, we look at how information gain can be used in practice in an algorithm to construct decision trees.

