Example calculation of entropy, Computer Engineering


Example Calculation:

Suppose we are working with a set of examples S = {s1, s2, s3, s4}, categorised with a binary categorisation of positives and negatives, such that s1 is positive and the rest are negative. Suppose further that we want to calculate the information gain of an attribute A, which can take the values {v1, v2, v3}. Finally, assume that:

[Figure: the value attribute A takes for each example, namely A(s1) = v2, A(s2) = v2, A(s3) = v3, A(s4) = v1]

To work out the information gain for A relative to S, we first need to calculate the entropy of S. To use our formula for binary categorisations, we need to know the proportion of positives in S and the proportion of negatives. These are given as: p+ = 1/4 and p- = 3/4. We can then calculate:

Entropy(S) = -(1/4)log2(1/4) - (3/4)log2(3/4) = -(1/4)(-2) - (3/4)(-0.415) = 0.5 + 0.311 = 0.811
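As a quick sanity check, the same number can be reproduced in a few lines of Python. This is a minimal sketch; the helper name binary_entropy is our own, not from any library:

```python
import math

def binary_entropy(p_pos, p_neg):
    """Entropy of a binary categorisation; 0*log2(0) is taken to be 0."""
    return sum(-p * math.log2(p) for p in (p_pos, p_neg) if p > 0)

print(binary_entropy(1/4, 3/4))  # ~0.811
```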

Note that to do this calculation on your calculator, you may need to remember that log2(x) = ln(x)/ln(2), where ln(2) is the natural log of 2. Next, we need to calculate the weighted entropy Entropy(Sv) for each value v = v1, v2, v3, noting that the weighting involves multiplying by (|Sv|/|S|). Remember that Sv is the set of examples from S which have value v for attribute A. This means that: Sv1 = {s4}, Sv2 = {s1, s2}, Sv3 = {s3}.
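This partition can also be built mechanically. In the sketch below, the tuple encoding of the examples as (name, value of A, category) is our own assumption, chosen to match the sets above:

```python
from collections import defaultdict

# Hypothetical encoding of the four examples as (name, value of A, category),
# chosen so that Sv1 = {s4}, Sv2 = {s1, s2}, Sv3 = {s3} as in the text.
S = [("s1", "v2", "+"), ("s2", "v2", "-"), ("s3", "v3", "-"), ("s4", "v1", "-")]

subsets = defaultdict(list)
for name, value, label in S:
    subsets[value].append(name)

print(dict(subsets))  # {'v2': ['s1', 's2'], 'v3': ['s3'], 'v1': ['s4']}
```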

We now need to carry out these calculations:

(|Sv1|/|S|) * Entropy(Sv1) = (1/4) * (-(0/1)log2(0/1) - (1/1)log2(1/1)) = (1/4)(-0 - (1)log2(1)) = (1/4)(-0 - 0) = 0

(|Sv2|/|S|) * Entropy(Sv2) = (2/4) * (-(1/2)log2(1/2) - (1/2)log2(1/2)) = (1/2) * (-(1/2)(-1) - (1/2)(-1)) = (1/2) * (1) = 1/2

(|Sv3|/|S|) * Entropy(Sv3) = (1/4) * (-(0/1)log2(0/1) - (1/1)log2(1/1)) = (1/4)(-0 - (1)log2(1)) = (1/4)(-0 - 0) = 0
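These three weighted terms can be checked in code as well; the sketch below repeats the binary_entropy helper from the earlier snippet so it runs on its own:

```python
import math

def binary_entropy(p_pos, p_neg):  # same helper as in the earlier sketch
    return sum(-p * math.log2(p) for p in (p_pos, p_neg) if p > 0)

# One weighted term per value of A: (|Sv|/|S|) * Entropy(Sv)
weighted = [
    (1/4) * binary_entropy(0/1, 1/1),  # Sv1 = {s4}: all negative
    (2/4) * binary_entropy(1/2, 1/2),  # Sv2 = {s1, s2}: one positive, one negative
    (1/4) * binary_entropy(0/1, 1/1),  # Sv3 = {s3}: all negative
]
print(weighted)  # [0.0, 0.5, 0.0]
```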

Note that we have taken 0 log2(0) to be zero, which is standard. In our calculation, we only required log2(1) = 0 and log2(1/2) = -1. We now add these three values together and subtract the total from our calculation for Entropy(S) to give the final result:

Gain(S,A) = 0.811 - (0 + 1/2 + 0) = 0.311 
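Putting the whole calculation together, here is a self-contained sketch of the gain computation. The function names and the tuple encoding of the examples are our own illustrative choices, not part of the original presentation:

```python
import math
from collections import Counter, defaultdict

def entropy(labels):
    """Entropy of a list of category labels; 0*log2(0) is treated as 0."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in Counter(labels).values())

def information_gain(examples, value_of, label_of):
    """Gain(S, A): Entropy(S) minus the summed weighted entropies of each subset Sv."""
    subsets = defaultdict(list)
    for e in examples:
        subsets[value_of(e)].append(label_of(e))
    weighted = sum((len(ls) / len(examples)) * entropy(ls) for ls in subsets.values())
    return entropy([label_of(e) for e in examples]) - weighted

# The four examples from the text: s1 is positive, the rest are negative.
S = [("s1", "v2", "+"), ("s2", "v2", "-"), ("s3", "v3", "-"), ("s4", "v1", "-")]
print(information_gain(S, value_of=lambda e: e[1], label_of=lambda e: e[2]))  # ~0.311
```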

Next, we look at how information gain can be used in practice in an algorithm to construct decision trees.


Related Discussions:- Example calculation of entropy

What is base register addressing, Q. What is Base Register Addressing ? ...

Q. What is Base Register Addressing ? An addressing technique in which content of an instruction specifies base register is added to address field or displacement field of the

What is the significance timescale directive, What is the significance Time...

What is the significance Timescale directive? Defines time units and simulation precision (smallest increment). Syntax 'timescale TimeUnit / PrecisionUnit TimeUnit =

What is refactoring, What is refactoring? Refactoring is explained as t...

What is refactoring? Refactoring is explained as the changes to the internal structure of software to improve its design without changing its external functionality. It is an e

Posix threads and mutex, The objective of this practical assignment is to u...

The objective of this practical assignment is to use the POSIX environment to write a program that simulates the supply and demand between three processes: warehouse, factory and r

Processor technology , Processor Technology: Computers consist of elec...

Processor Technology: Computers consist of electronic components assembled in a design or "architecture" that will perform necessary functions of input, output, computation an

Explain half-adder with truth-table and logic diagram, What is a half-adder...

What is a half-adder? Explain a half-adder with the help of truth-table and logic diagram. Ans. Half Adder: It is a logic circuit for the addition of two 1-bit numbers is term

Functions to remove common walls, We must also be able to remove common wal...

We must also be able to remove common walls between two cells. Write the function removewalls that accepts two cells and removes the wall that is common between the two (hint: any

Illustrate characteristic tables of flip-flops, Q. Illustrate Characteristi...

Q. Illustrate Characteristic tables of flip-flops? Excitation Tables Characteristic tables of flip-flops present the subsequent state when inputs and present state are kno

What are the categories of radio communication, What are the categories of ...

What are the categories of Radio communication 1.  Sky wave or ionosphere communication 2.  Line-of-sight (LOS) microwave communication limited by horizon 3.  Troposphere

C++, At a shop of marbles, packs of marbles are prepared. Packets are named...

At a shop of marbles, packs of marbles are prepared. Packets are named A, B, C, D, E …….. All packets are kept in a VERTICAL SHELF in random order. Any numbers of packets with thes

Write Your Message!

Captcha
Free Assignment Quote

Assured A++ Grade

Get guaranteed satisfaction & time on delivery in every assignment order you paid with us! We ensure premium quality solution document along with free turntin report!

All rights reserved! Copyrights ©2019-2020 ExpertsMind IT Educational Pvt Ltd