International Joint Conference on Neural Networks

Do Fractional Norms and Quasinorms Help to Overcome the Curse of Dimensionality?

Abstract

The curse of dimensionality causes well-known and widely discussed problems for machine learning methods. There is a hypothesis that using the Manhattan distance, or even fractional quasinorms lp (for p less than 1), can help to overcome the curse of dimensionality in classification problems. In this study, we systematically test this hypothesis on 37 binary classification problems from 25 databases. We confirm that fractional quasinorms have greater relative contrast and coefficient of variation than the Euclidean norm l2, but we also demonstrate that distance concentration shows qualitatively the same behaviour for all tested norms and quasinorms, and that the difference between them decays as the dimension tends to infinity. Evaluation of kNN classification quality with different norms and quasinorms shows that greater relative contrast does not imply better classifier performance, and the worst performance on different databases was shown by different norms (quasinorms). A systematic comparison shows that the difference in performance of kNN based on lp for p = 2, 1, and 0.5 is statistically insignificant.
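To make the quantities in the abstract concrete, here is a minimal NumPy sketch (not the authors' code; the uniform data and sample sizes are assumptions chosen for illustration) that estimates the relative contrast RC = (d_max - d_min) / d_min and the coefficient of variation CV = std / mean of l_p distances from a reference point to random points, using the fractional quasinorm (sum_i |x_i - y_i|^p)^(1/p) for p < 1:

```python
import numpy as np

def lp_dist(x0, X, p):
    """l_p distance from x0 to each row of X.
    For p < 1 this is the fractional quasinorm (sum |x_i - y_i|^p)^(1/p)."""
    return (np.abs(X - x0) ** p).sum(axis=1) ** (1.0 / p)

rng = np.random.default_rng(0)
n = 1000
for d in (2, 10, 100, 1000):
    X = rng.uniform(size=(n, d))          # uniform points in the d-cube
    for p in (0.5, 1.0, 2.0):
        dists = lp_dist(X[0], X[1:], p)   # distances from one reference point
        rc = (dists.max() - dists.min()) / dists.min()  # relative contrast
        cv = dists.std() / dists.mean()                 # coefficient of variation
        print(f"d={d:4d}  p={p:3.1f}  RC={rc:8.3f}  CV={cv:5.3f}")
```

Qualitatively, smaller p yields larger RC and CV at any fixed dimension, yet both measures shrink for every p as the dimension grows, which is the concentration behaviour described in the abstract.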
