This paper discusses a dynamic path-based method for constructing conjunctions as new attributes for decision tree learning. It searches the conditions (attribute-value pairs) along tree paths to form new attributes. CAT, a constructive decision tree learning algorithm that adopts this dynamic path-based method, is described. It employs a hypothesis-driven strategy for constructing new attributes, and uses conjunction and (implicit) negation as its constructive operators. Compared with other hypothesis-driven constructive decision tree learning algorithms, such as those of the FRINGE family, the novelty of CAT is that it carries out a systematic search with pruning over each path of a tree to select the conditions for a conjunction. Thus, in CAT, the conditions for constructing new attributes are determined dynamically during the search. Empirical investigation in a set of artificial and real-world domains shows that CAT can improve the performance of selective decision tree learning in terms of both higher prediction accuracy and lower theory complexity. In addition, it shows performance advantages over constructive decision tree learning algorithms that use a fixed path-based method or a fixed rule-based method to construct conjunctions as new attributes.
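To make the core idea concrete, the following is a minimal sketch (not the paper's actual algorithm) of searching the conditions along one root-to-leaf path to form candidate conjunctions. The `score` function and `min_score` threshold are illustrative placeholders for whatever quality measure and pruning criterion the learner would use; all names here are hypothetical.

```python
from itertools import combinations

def conjunctions_from_path(path_conditions, score, min_score):
    """Enumerate candidate conjunctions (subsets of the attribute-value
    conditions found on one tree path), pruning those whose score falls
    below min_score -- a crude stand-in for systematic search with pruning."""
    kept = []
    for r in range(2, len(path_conditions) + 1):
        for subset in combinations(path_conditions, r):
            s = score(subset)
            if s >= min_score:  # prune low-scoring candidate conjunctions
                kept.append((s, subset))
    kept.sort(reverse=True)  # best-scoring conjunctions first
    return [subset for _, subset in kept]

# Example: conditions collected from a single path of a decision tree.
path = [("outlook", "sunny"), ("humidity", "high"), ("wind", "weak")]
# Placeholder score: prefer longer conjunctions, purely for illustration.
new_attrs = conjunctions_from_path(path, score=len, min_score=2)
```

Each returned subset would then be turned into a new boolean attribute (the conjunction of its conditions) for the next round of tree induction; a fixed path-based method would instead always use a predetermined portion of the path.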