Fusing complete monotonic decision trees

Authors: Hang Xu, Wenjian Wang, Yuhua Qian

Abstract:

Monotonic classification is a classification task in which a monotonicity constraint exists between the features and the class: if a sample x_i has values at least as high as those of a sample x_j in every feature, then x_i should be assigned to a class whose level is not lower than that of x_j. Several methods have been proposed for this task, but they have limitations, such as handling only certain kinds of data or achieving limited classification accuracy. In our previous work, classification accuracy on monotonic classification was improved by fusing monotonic decision trees, but the resulting classification model is complex. This work aims to build a monotonic classifier that can process both nominal and numeric data by fusing complete monotonic decision trees. By finding complete feature subsets based on the discernibility matrix of an ordinal dataset, a set of monotonic decision trees can be obtained directly and automatically, on which the rank order is still preserved. Fewer decision trees are needed, and they serve as base classifiers to construct a decision forest that fuses complete monotonic decision trees. Experimental results on ten datasets demonstrate that the proposed method effectively reduces the number of base classifiers, thereby simplifying the classification model, while achieving good classification performance.
Index Terms—Monotonic classification, decision tree, ensemble learning, feature selection, discernibility matrix.
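
As a minimal illustration of the monotonicity constraint described in the abstract (this is not code from the paper; the toy data and function names are hypothetical), the following Python sketch checks whether a labeled dataset satisfies the constraint that a sample dominating another in every feature never receives a lower class label:

```python
import numpy as np

def dominates(x_i, x_j):
    """True if x_i has values at least as high as x_j in every feature."""
    return np.all(x_i >= x_j)

def violates_monotonicity(X, y):
    """Return pairs (i, j) where x_i dominates x_j but y[i] < y[j].

    X : (n_samples, n_features) array of ordinal/numeric feature values
    y : (n_samples,) array of ordered class labels
    """
    violations = []
    n = len(y)
    for i in range(n):
        for j in range(n):
            if i != j and dominates(X[i], X[j]) and y[i] < y[j]:
                violations.append((i, j))
    return violations

# Hypothetical toy data: three samples, two ordinal features
X = np.array([[3, 2],
              [2, 2],
              [1, 1]])
y = np.array([2, 1, 1])  # higher label = higher class level

print(violates_monotonicity(X, y))  # [] -> this toy data is monotonically consistent
```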
