We describe the Hierarchical Classifier (HC), a hybrid architecture [1] built by combining supervised training with unsupervised problem clustering. We prove a theorem giving an estimate R of the HC risk. The proof relies on an improved method for computing cluster weights, introduced in this paper. Experiments show that R is correlated with the actual error of the HC, which allows us to use R as an approximation of the HC risk without evaluating HC subclusters. We also show how R can be used in efficient clustering algorithms by comparing HC architectures built with different clustering methods.