Arbitrary categorisation - learning decision trees:

To motivate the entropy measure, visualise a set of boxes with some balls in them. If all the balls were in a single box, then this would be nicely ordered, and it would be extremely easy to find a particular ball. If, however, the balls were distributed evenly amongst the boxes, then this would not be so nicely ordered, and it might take rather a while to find a particular ball. So, if we were going to define a measure based on this notion of purity, we would want to be able to calculate a value for each box based on the number of balls in it, then take the sum of these as the overall measure. We would want to reward two situations: nearly empty boxes (very neat), and boxes with nearly all the balls in them (also very neat). This is the basis for the general entropy measure, which is defined as follows:

Given an arbitrary categorisation, C, into categories c1, ..., cn, and a set of examples, S, for which the proportion of examples in ci is pi, the entropy of S is:

Entropy(S) = -p1*log2(p1) - p2*log2(p2) - ... - pn*log2(pn)
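
For illustration, here is a short Python sketch (the function name entropy and the example labels are our own, not part of the original notes) that computes this measure for a set of examples, each represented by its category label:

    import math
    from collections import Counter

    def entropy(examples):
        # Entropy of a set of examples, each given by its category label.
        total = len(examples)
        ent = 0.0
        for count in Counter(examples).values():
            p = count / total          # proportion pi of category ci
            ent -= p * math.log2(p)    # the -pi*log2(pi) term; p > 0 here
        return ent

    # A pure set has entropy 0; an even two-way split has entropy 1 bit.
    print(entropy(["yes"] * 8))               # 0.0
    print(entropy(["yes"] * 4 + ["no"] * 4))  # 1.0

Since Counter only yields categories that actually appear, the loop never sees p = 0, which is consistent with the 0*log2(0) = 0 convention noted below.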

This measure satisfies our criteria because of the -p*log2(p) construction: when p gets close to zero (i.e., the category has only a few examples in it), log2(p) becomes a large negative number, but the p part dominates the calculation, so the entropy works out to be nearly zero. Remember that entropy measures the disorder in the data, so a low score is good: it reflects our desire to reward categories with few examples in them. Similarly, if p gets close to 1 (i.e., the category has most of the examples in it), then the log2(p) part gets very close to zero, and it is this that dominates the calculation, so the overall value again gets close to zero. Hence we see that both when the category is nearly (or completely) empty, and when the category nearly (or completely) contains all the examples, the score for the category gets close to zero, which models exactly what we wanted. Note that 0*log2(0) is taken to be zero by convention.
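
To see this behaviour numerically (the sample proportions below are our own illustrative choices), we can tabulate the -p*log2(p) contribution of a single category at a few values of p:

    import math

    def contribution(p):
        # The -p*log2(p) term for one category; 0 by convention when p = 0.
        if p == 0.0:
            return 0.0
        return -p * math.log2(p) + 0.0  # "+ 0.0" normalises -0.0 at p = 1

    for p in (0.0, 0.01, 0.5, 0.99, 1.0):
        print(f"p = {p:<4}  -p*log2(p) = {contribution(p):.4f}")

    # The terms at p = 0.01 (0.0664) and p = 0.99 (0.0144) are close to
    # zero, and the extremes p = 0.0 and p = 1.0 give exactly zero, while
    # the middling p = 0.5 contributes the largest value here (0.5000).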
