Example calculation of entropy, Computer Engineering


Example Calculation:

Suppose we are working with a set of examples S = {s1, s2, s3, s4}, categorised with a binary categorisation of positives and negatives, such that s1 is positive and the rest are negative. Suppose further that we want to calculate the information gain of an attribute A, which can take the values {v1, v2, v3}. Finally, assume that:

[Figure: Example Calculation of Entropy - the value of attribute A for each example, i.e. A(s1) = v2, A(s2) = v2, A(s3) = v3 and A(s4) = v1, as used in the calculations below.]

To work out the information gain for A relative to S, we first need to calculate the entropy of S. To use our formula for binary categorisations, we need to know the proportion of positives and the proportion of negatives in S. These are: p+ = 1/4 and p- = 3/4. We can then calculate:

Entropy(S) = -(1/4)log2(1/4) - (3/4)log2(3/4) = -(1/4)(-2) - (3/4)(-0.415) = 0.5 + 0.311 = 0.811
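As a quick cross-check, here is a minimal Python sketch of the same binary entropy calculation (the function name binary_entropy is our own, not from any standard library):

import math

def binary_entropy(p_pos, p_neg):
    # Entropy = -p+ log2(p+) - p- log2(p-), taking any 0*log2(0) term to be 0.
    return -sum(p * math.log2(p) for p in (p_pos, p_neg) if p > 0)

print(round(binary_entropy(1/4, 3/4), 3))  # 0.811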

Note that to do this calculation on a calculator, you may need to remember that log2(x) = ln(x)/ln(2), where ln(2) is the natural log of 2. Next, we need to calculate the weighted Entropy(Sv) for each value v = v1, v2, v3, noting that the weighting involves multiplying by (|Svi|/|S|). Remember also that Sv is the set of examples from S which have value v for attribute A. This means that: Sv1 = {s4}, Sv2 = {s1, s2}, Sv3 = {s3}.
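To make the partitioning concrete, the following Python sketch groups S by the value of A; the encoding of the examples (which value of A each si takes, and its label) is our own, inferred from the sets above:

from collections import defaultdict

# Assumed encoding: each si maps to (its value for attribute A, its label),
# with s1 positive and the rest negative, so that Sv1 = {s4}, Sv2 = {s1, s2}, Sv3 = {s3}.
S = {"s1": ("v2", "+"), "s2": ("v2", "-"), "s3": ("v3", "-"), "s4": ("v1", "-")}

partition = defaultdict(set)
for name, (value, _label) in S.items():
    partition[value].add(name)

print(dict(partition))  # e.g. {'v2': {'s1', 's2'}, 'v3': {'s3'}, 'v1': {'s4'}}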

We now need to carry out the following calculations:

(|Sv1|/|S|) * Entropy(Sv1) = (1/4) * (-(0/1)log2(0/1) - (1/1)log2(1/1)) = (1/4)(-0 - (1)log2(1)) = (1/4)(-0 - 0) = 0

(|Sv2|/|S|) * Entropy(Sv2) = (2/4) * (-(1/2)log2(1/2) - (1/2)log2(1/2)) = (1/2) * (-(1/2)(-1) - (1/2)(-1)) = (1/2) * (1) = 1/2

(|Sv3|/|S|) * Entropy(Sv3) = (1/4) * (-(0/1)log2(0/1) - (1/1)log2(1/1)) = (1/4)(-0 - (1)log2(1)) = (1/4)(-0 - 0) = 0

Note that we have taken 0 log2(0) to be zero, which is standard. In our calculation, we only required log2(1) = 0 and log2(1/2) = -1. We now add these three values together and subtract the sum from our earlier result for Entropy(S) to give the final answer:

Gain(S,A) = 0.811 - (0 + 1/2 + 0) = 0.311 
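Putting the steps together, a short Python sketch (again using our assumed encoding of the examples and of attribute A) reproduces Gain(S, A) = 0.311:

import math
from collections import Counter, defaultdict

# Assumed encoding: (value of A, label) for s1, s2, s3, s4, consistent with the sets above.
examples = [("v2", "+"), ("v2", "-"), ("v3", "-"), ("v1", "-")]

def entropy(labels):
    # Entropy of a collection of labels; only labels that actually occur contribute,
    # so the 0*log2(0) convention never needs special handling here.
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(examples):
    # Gain(S, A) = Entropy(S) - sum over v of (|Sv|/|S|) * Entropy(Sv).
    by_value = defaultdict(list)
    for value, label in examples:
        by_value[value].append(label)
    weighted = sum(len(ls) / len(examples) * entropy(ls) for ls in by_value.values())
    return entropy([label for _, label in examples]) - weighted

print(round(information_gain(examples), 3))  # 0.311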

Next we look at how information gain can be used in practice in an algorithm to construct decision trees.

