IEEE Transactions on Neural Networks > Voronoi networks and their probability of misclassification

Voronoi networks and their probability of misclassification


Abstract

To reduce memory requirements and computation cost, many algorithms have been developed that perform nearest neighbor classification using only a small number of representative samples obtained from the training set. We call the classification model underlying all these algorithms Voronoi networks (Vnets). We analyze the generalization capabilities of these networks by bounding the generalization error. The class of problems that Vnets can solve is characterized by the extent to which the set of points on the decision boundaries fills the feature space. We show that Vnets converge asymptotically to the Bayes classifier with arbitrarily high probability, provided the number of representative samples grows more slowly than the square root of the number of training samples, and we also give the optimal growth rate for the number of representative samples. We repeat the analysis for decision tree (DT) classifiers and compare them with Vnets. The bias/variance dilemma and the curse of dimensionality with respect to Vnets and DTs are also discussed.

