
Arbitrary categorisation - learning decision trees:

Imagine a set of boxes with some balls in them. If all the balls were in a single box, then this would be nicely ordered, and it would be extremely easy to find a particular ball. If, however, the balls were distributed amongst the boxes, then this would not be so nicely ordered, and it might take rather a while to find a particular ball. If we were going to define a measure based on this notion of purity, we would want to calculate a value for each box based on the number of balls in it, and then take the sum of these as the overall measure. We would want to reward two situations: nearly empty boxes (very neat), and boxes with nearly all the balls in them (also very neat). This is the basis for the general entropy measure, which is defined as follows:

Given an arbitrary categorisation, C, into categories c1, ..., cn, and a set of examples, S, for which the proportion of examples in ci is pi, the entropy of S is:

Entropy(S) = -p1*log2(p1) - p2*log2(p2) - ... - pn*log2(pn)
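
To make the definition concrete, here is a minimal Python sketch of the entropy calculation (the function name entropy and the example label lists are illustrative choices, not part of the original notes):

import math
from collections import Counter

def entropy(examples):
    # Entropy(S) = -p1*log2(p1) - ... - pn*log2(pn), where pi is the
    # proportion of examples falling into category ci. Empty categories
    # are skipped, matching the convention that 0*log2(0) is zero.
    total = len(examples)
    counts = Counter(examples)
    return sum(-(c / total) * math.log2(c / total)
               for c in counts.values() if c > 0)

# A set with all examples in one category is perfectly "neat": entropy 0.
print(entropy(["yes"] * 8))               # 0.0
# An even 50/50 split is maximally disordered for two categories: entropy 1.
print(entropy(["yes"] * 4 + ["no"] * 4))  # 1.0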

This measure satisfies our criteria because of the -p*log2(p) construction: when p gets close to zero (i.e., the category has only a few examples in it), log(p) becomes a large negative number, but the p part dominates the calculation, so the entropy works out to be nearly zero. Remembering that entropy measures the disorder in the data, this low score is good: it reflects our desire to reward categories with few examples in them. Similarly, if p gets close to 1 (i.e., the category contains most of the examples), the log(p) part gets very close to zero, and it is this that dominates the calculation, so the overall value again gets close to zero. Hence we see that both when the category is nearly - or completely - empty, and when the category nearly - or completely - contains all the examples, the score for the category gets close to zero, which models what we wanted. Note that 0*log2(0) is taken to be zero by convention.
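
A quick numeric check of this behaviour (again an illustrative Python sketch; the helper name term is our own, not from the original notes):

import math

def term(p):
    # Contribution -p*log2(p) of one category holding proportion p of the
    # examples; the 0*log2(0) case is taken as zero by convention.
    return -p * math.log2(p) if 0 < p < 1 else 0.0

# Nearly empty (p close to 0) and nearly full (p close to 1) categories
# both contribute almost nothing, while p around 0.5 contributes the most.
for p in (0.0, 0.001, 0.5, 0.999, 1.0):
    print(p, round(term(p), 4))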
