Example calculation of entropy, Computer Engineering


Example Calculation:

Suppose we are working with a set of examples S = {s1, s2, s3, s4}, categorised with a binary categorisation of positives and negatives, such that s1 is positive and the rest are negative. Suppose further that we want to calculate the information gain of an attribute A, and that A can take the values {v1, v2, v3}. Finally, assume the following assignment of attribute values:

[Figure: Example Calculation of Entropy - attribute A takes value v2 on s1 and s2, value v3 on s3, and value v1 on s4.]

To work out the information gain for A relative to S, we first need to calculate the entropy of S. To use our formula for binary categorisations, we need to know the proportion of positives in S and the proportion of negatives. These are given by p+ = 1/4 and p- = 3/4, so we can calculate:

Entropy(S) = -(1/4)log2(1/4) - (3/4)log2(3/4) = -(1/4)(-2) - (3/4)(-0.415) = 0.5 + 0.311 = 0.811
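As a quick check, here is a minimal Python sketch of the same calculation (the function name binary_entropy is ours, chosen for illustration, not part of any library); it follows the standard convention that 0*log2(0) is treated as zero:

import math

def binary_entropy(p_pos, p_neg):
    """Entropy of a set given its proportions of positives and negatives."""
    total = 0.0
    for p in (p_pos, p_neg):
        if p > 0:                  # skip zero terms: 0*log2(0) is taken as 0
            total -= p * math.log2(p)
    return total

print(binary_entropy(1/4, 3/4))    # prints roughly 0.811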

Note that to do this calculation on a calculator, you may need to remember that log2(x) = ln(x)/ln(2), where ln(2) is the natural logarithm of 2. Next, we need to calculate the weighted Entropy(Sv) for each value v = v1, v2, v3, noting that the weighting involves multiplying by (|Svi|/|S|). Remember also that Sv is the set of examples from S which have value v for attribute A. This means that: Sv1 = {s4}, Sv2 = {s1, s2}, Sv3 = {s3}.

We now need to carry out these calculations:

(|Sv1|/|S|) * Entropy(Sv1) = (1/4) * (-(0/1)log2(0/1) - (1/1)log2(1/1)) = (1/4)(-0 - (1)log2(1)) = (1/4)(-0 - 0) = 0

(|Sv2|/|S|) * Entropy(Sv2) = (2/4) * (-(1/2)log2(1/2) - (1/2)log2(1/2)) = (1/2) * (-(1/2)*(-1) - (1/2)*(-1)) = (1/2) * (1) = 1/2

(|Sv3|/|S|) * Entropy(Sv3) = (1/4) * (-(0/1)log2(0/1) - (1/1)log2(1/1)) = (1/4)(-0 - (1)log2(1)) = (1/4)(-0 - 0) = 0

Note that we have taken 0*log2(0) to be zero, which is standard. In our calculation, we only required log2(1) = 0 and log2(1/2) = -1. We now add these three values together and subtract the sum from our result for Entropy(S) to give the final answer:

Gain(S,A) = 0.811 - (0 + 1/2 + 0) = 0.311 
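The whole calculation can also be verified with a short Python sketch. The data layout below (a list of (label, value-of-A) pairs) and the helper names entropy and information_gain are our own choices for illustration; the attribute values follow Sv1 = {s4}, Sv2 = {s1, s2}, Sv3 = {s3} from above:

import math
from collections import defaultdict

def entropy(labels):
    """Entropy of a list of category labels."""
    counts = defaultdict(int)
    for label in labels:
        counts[label] += 1
    result = 0.0
    for count in counts.values():
        p = count / len(labels)
        result -= p * math.log2(p)     # only non-zero counts appear here
    return result

def information_gain(examples):
    """examples: list of (label, value-of-A) pairs; returns Gain(S, A)."""
    labels = [label for label, _ in examples]
    by_value = defaultdict(list)
    for label, value in examples:
        by_value[value].append(label)
    weighted = sum(len(subset) / len(examples) * entropy(subset)
                   for subset in by_value.values())
    return entropy(labels) - weighted

S = [("+", "v2"),   # s1: positive, A = v2
     ("-", "v2"),   # s2: negative, A = v2
     ("-", "v3"),   # s3: negative, A = v3
     ("-", "v1")]   # s4: negative, A = v1
print(information_gain(S))             # prints roughly 0.311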

Next, we look at how information gain can be used in practice in an algorithm to construct decision trees.
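As a preview, here is a hedged sketch of how a decision-tree learner such as ID3 typically uses this quantity: at each node it computes the gain of every remaining attribute and splits on the one with the largest gain. The gain argument below is assumed to be a function built on the information_gain sketch above, not a library call:

def choose_attribute(examples, attributes, gain):
    """Pick the attribute with the highest information gain at this node.
    examples: the training examples reaching the node.
    attributes: the candidate attribute names still available.
    gain: a function gain(examples, attribute) -> float (assumed)."""
    return max(attributes, key=lambda a: gain(examples, a))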

