Neurocomputing

Local Partial Least Square classifier in high dimensionality classification

Abstract

A central idea in distance-based machine learning algorithms such as k-nearest neighbors and manifold learning is to choose a set of references, or a neighborhood, based on a distance function to represent the local structure around a query point, and to use these local structures as the basis for constructing models. Local Partial Least Square (local PLS), the result of applying this neighborhood-based idea to Partial Least Square (PLS), has been shown to perform very well on the regression of small-sample-size and multicollinear data, but it is seldom used in high-dimensionality classification. Furthermore, the difference between PLS and local PLS with respect to their optimal intrinsic dimensions is unclear. In this paper we combine local PLS with non-Euclidean distances in order to find out which measures are better suited for high-dimensionality classification. Experimental results obtained on 8 UCI and spectroscopy datasets show that the Euclidean distance is not a good distance function for use in local PLS classification, especially in high-dimensionality cases; instead, the Manhattan distance and fractional distances are preferred. Experimental results further show that the optimal intrinsic dimension of local PLS is smaller than that of standard PLS.
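The scheme the abstract describes can be sketched in a few lines: for each query point, rank the training samples under an L_p metric (p=2 Euclidean, p=1 Manhattan, 0&lt;p&lt;1 fractional), fit a PLS model on the k nearest neighbors only, and predict at the query. The sketch below is a minimal, hedged illustration of this idea, not the authors' implementation; the function names, the choice of single-response PLS1 via NIPALS deflation, and all parameter defaults are assumptions for the example.

```python
import numpy as np

def minkowski_distance(x, X, p):
    """Distance from query x to each row of X under an L_p measure.
    p=2 is Euclidean, p=1 Manhattan; 0 < p < 1 gives the fractional
    distance (no longer a true metric, but usable for neighbor ranking)."""
    return (np.abs(X - x) ** p).sum(axis=1) ** (1.0 / p)

def local_pls1_predict(x, X, y, k=10, p=1.0, n_comp=2):
    """Sketch of local PLS: pick the k nearest neighbors of x under L_p,
    fit a PLS1 model (NIPALS-style deflation) on that neighborhood only,
    and return the prediction at x."""
    idx = np.argsort(minkowski_distance(x, X, p))[:k]
    Xn, yn = X[idx].astype(float), y[idx].astype(float)
    xm, ym = Xn.mean(axis=0), yn.mean()
    E, f = Xn - xm, yn - ym          # centered neighborhood
    q = x - xm                        # centered query
    pred = ym
    for _ in range(n_comp):
        w = E.T @ f                   # weight vector for this component
        nw = np.linalg.norm(w)
        if nw == 0.0:
            break
        w /= nw
        t = E @ w                     # scores
        tt = t @ t
        if tt == 0.0:
            break
        b = (t @ f) / tt              # inner regression coefficient
        pload = (E.T @ t) / tt        # X loadings
        pred += (q @ w) * b           # accumulate query prediction
        q = q - (q @ w) * pload       # deflate the query
        E = E - np.outer(t, pload)    # deflate X
        f = f - b * t                 # deflate y
    return pred
```

Swapping the neighborhood measure is then just a matter of changing `p`: `local_pls1_predict(x, X, y, p=0.5)` ranks neighbors by a fractional distance, which the paper reports as better suited than Euclidean in high dimensions.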
