Example calculation of entropy, Computer Engineering


Example Calculation:

Suppose we are working with a set of examples S = {s1, s2, s3, s4}, categorised with a binary categorisation of positives and negatives such that s1 is positive and the rest are negative. Suppose further that we want to calculate the information gain of an attribute A, and that A can take the values {v1, v2, v3}. Finally, assume the values of A for each example are as follows:

s1: A = v2
s2: A = v2
s3: A = v3
s4: A = v1

To work out the information gain for A relative to S, we first need to calculate the entropy of S. To use our formula for binary categorisations, we need to know the proportion of positives in S and the proportion of negatives. These are given by p+ = 1/4 and p- = 3/4, so we can calculate:

Entropy(S) = -(1/4)log2(1/4) - (3/4)log2(3/4)
           = -(1/4)(-2) - (3/4)(-0.415)
           = 0.5 + 0.311
           = 0.811
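As a quick check of this arithmetic, the same entropy can be computed numerically. The sketch below is purely illustrative (the function name is our own, not part of the original exercise) and assumes only Python's standard math module:

import math

def binary_entropy(p_pos, p_neg):
    """Entropy of a binary categorisation, taking 0*log2(0) to be 0."""
    total = 0.0
    for p in (p_pos, p_neg):
        if p > 0:
            total -= p * math.log2(p)
    return total

# S = {s1, s2, s3, s4} with one positive (s1) and three negatives
print(binary_entropy(1/4, 3/4))   # approximately 0.811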

Note that to do this calculation on your calculator you may need to remember that log2(x) = ln(x)/ln(2), where ln(2) is the natural log of 2. Next, we need to calculate the weighted Entropy(Sv) for each value v = v1, v2, v3, noting that the weighting involves multiplying by (|Sv|/|S|). Remember also that Sv is the set of examples from S which have value v for attribute A. This means that: Sv1 = {s4}, Sv2 = {s1, s2}, Sv3 = {s3}.
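If your calculator or programming environment only provides the natural logarithm, this identity is all you need. The short Python sketch below (purely illustrative, again using only the standard math module) checks it against the values that appear in this example:

import math

for x in (1/4, 3/4, 1/2):
    via_ln = math.log(x) / math.log(2)   # log2(x) = ln(x) / ln(2)
    print(x, via_ln, math.log2(x))       # both expressions give the same value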

We now need to carry out these calculations:

(|Sv1|/|S|) * Entropy(Sv1) = (1/4) * (-(0/1)log2(0/1) - (1/1)log2(1/1))
                           = (1/4)(-0 - (1)log2(1)) = (1/4)(-0 - 0) = 0

(|Sv2|/|S|) * Entropy(Sv2) = (2/4) * (-(1/2)log2(1/2) - (1/2)log2(1/2))
                           = (1/2) * (-(1/2)*(-1) - (1/2)*(-1)) = (1/2) * (1) = 1/2

(|Sv3|/|S|) * Entropy(Sv3) = (1/4) * (-(0/1)log2(0/1) - (1/1)log2(1/1))
                           = (1/4)(-0 - (1)log2(1)) = (1/4)(-0 - 0) = 0

Note that we have taken 0 log2(0) to be zero, which is standard. In our calculation, we only required log2(1) = 0 and log2(1/2) = -1. We now add these three values together and subtract the total from our earlier result for Entropy(S) to give the final answer:

Gain(S,A) = 0.811 - (0 + 1/2 + 0) = 0.311 
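Putting the steps together, the whole calculation of Gain(S, A) can be reproduced programmatically. The following is a minimal sketch under our own naming conventions (the dictionaries encoding the categorisations and attribute values simply mirror the sets given above); it is not part of the original assignment:

import math
from collections import Counter

def entropy(labels):
    """Entropy of a list of category labels, with 0*log2(0) taken as 0."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(examples, attribute_of, label_of):
    """Gain(S, A) = Entropy(S) - sum over v of (|Sv|/|S|) * Entropy(Sv)."""
    n = len(examples)
    base = entropy([label_of[s] for s in examples])
    # Partition S into the subsets Sv, one per attribute value v
    partition = {}
    for s in examples:
        partition.setdefault(attribute_of[s], []).append(s)
    weighted = sum((len(sv) / n) * entropy([label_of[s] for s in sv])
                   for sv in partition.values())
    return base - weighted

# The example from the text: s1 is positive, the rest are negative;
# A(s1) = v2, A(s2) = v2, A(s3) = v3, A(s4) = v1
label_of = {"s1": "+", "s2": "-", "s3": "-", "s4": "-"}
attribute_of = {"s1": "v2", "s2": "v2", "s3": "v3", "s4": "v1"}
print(information_gain(["s1", "s2", "s3", "s4"], attribute_of, label_of))  # ~0.311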

Next, we will look at how information gain can be used in practice in an algorithm for constructing decision trees.

