Chapter 1 Decision Trees
Section 3 Efficient Decision Tree Construction
Page 4 Attribute Selection Measures

Objectives:

The objectives of this section are:
to introduce you to the various attribute types and the ways they can be split
to present you with an algorithm for creating a decision tree classifier
to show you how to determine the best split for a given node
to inform you of the problems that arise with the algorithm and how they are addressed.

Outcomes:

By the time you have completed this section you will be able to:
compare decision trees and decide whether or not they are efficient
explain, at a high level, Hunt's algorithm and the difficulties encountered
calculate the impurity of a node by using the measures outlined in this section
compare and contrast the measures of impurity

Attribute Selection Measures

Entropy and Classification Errors

The slideshow presentation below presents the other attribute selection measures addressed in this course. The first is Entropy, a word you may be familiar with if you have a background in chemistry, and the second is Classification Error.
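As a concrete illustration to accompany the slideshow, the two measures can be sketched in Python. The class counts used here are made-up examples, not taken from the course dataset:

```python
import math

def entropy(counts):
    """Entropy of a node from its class counts: -sum of p_i * log2(p_i)."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

def classification_error(counts):
    """Classification error of a node: 1 - max p_i."""
    total = sum(counts)
    return 1 - max(counts) / total

# A node with 5 records of one class and 5 of the other is maximally impure:
print(entropy([5, 5]))               # 1.0
print(classification_error([5, 5]))  # 0.5

# A pure node (all 10 records in one class) has zero impurity:
print(classification_error([10, 0]))  # 0.0
```

Note that both measures reach their maximum when the classes are evenly mixed and drop to zero for a pure node, which is what makes them usable for comparing candidate splits.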


Efficient Construction Application

The calculations may seem easy and the concept simple, but before moving on, see whether you can construct a decision tree for a particular dataset using the attribute selection measures described above. Click the link below to download the Decision Tree Builder, which will put you in the driver's seat. For instructions on how to use the Decision Tree Builder application, consult Help Section 4.

Launches the Decision Tree Builder
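Before working through the application, it may help to see how a candidate split is scored. This sketch (with hypothetical class labels, not the application's dataset) computes the gain of a split as the parent's entropy minus the weighted entropy of the child nodes, and shows that a split separating the classes perfectly scores highest:

```python
import math
from collections import Counter

def entropy(labels):
    """Entropy of a node from a list of class labels."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def split_gain(labels, groups):
    """Gain = parent entropy minus the weighted entropy of the children."""
    total = len(labels)
    weighted = sum(len(g) / total * entropy(g) for g in groups)
    return entropy(labels) - weighted

# Hypothetical parent node: 4 'yes' and 4 'no' records.
parent = ['yes'] * 4 + ['no'] * 4

# Candidate split A separates the two classes perfectly:
gain_a = split_gain(parent, [['yes'] * 4, ['no'] * 4])

# Candidate split B leaves both children evenly mixed:
gain_b = split_gain(parent, [['yes', 'yes', 'no', 'no'],
                             ['yes', 'yes', 'no', 'no']])

print(gain_a)  # 1.0 -- the best possible split here
print(gain_b)  # 0.0 -- no improvement over the parent
```

Choosing the attribute whose split maximizes this gain at each node is exactly the greedy step the tree-construction algorithm repeats until the nodes are pure (or another stopping condition is met).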