On Optimizing Locally Linear Nearest Neighbour Reconstructions Using Prototype Reduction Schemes

Abstract

This paper concerns the use of Prototype Reduction Schemes (PRS) to optimize the computations involved in typical k-Nearest Neighbour (k-NN) rules. These rules have been used successfully for decades in statistical Pattern Recognition (PR) applications, and have numerous applications because of their known error bounds. For a given data point of unknown identity, the k-NN rule combines the information about the a priori target classes (values) of the selected neighbours to, for example, predict the target class of the tested sample. Recently, an implementation of the k-NN, named the Locally Linear Reconstruction (LLR) [11], has been proposed. The salient feature of the latter is that by invoking a quadratic optimization process, it is capable of systematically setting model parameters, such as the number of neighbours (specified by the parameter k) and the weights. However, the LLR takes more time than other conventional methods when it is applied to classification tasks. To overcome this problem, we propose a strategy of using a PRS to compute the optimization problem efficiently. In this paper, we demonstrate, first of all, that by completely discarding the points not included by the PRS, we can obtain a reduced set of sample points, using which, in turn, the quadratic optimization problem can be computed far more expediently. The values of the corresponding indices are comparable to those obtained with the original training set (i.e., the one which considers all the data points), even though the computations required to obtain the prototypes and the corresponding classification accuracies are noticeably less. The proposed method has been tested on artificial and real-life data sets, and the results obtained are very promising, demonstrating its potential in PR applications.
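To make the pipeline described in the abstract concrete, the following is a minimal, illustrative sketch in Python/NumPy, not the authors' implementation: it pairs an LLR-style weighted k-NN classifier, whose sum-to-one reconstruction weights come from a regularized linear solve standing in for the paper's quadratic optimization, with Hart's Condensed Nearest Neighbour rule as one possible PRS used to shrink the reference set before the per-query solve. The function names, the regularization constant, and the choice of CNN as the reduction scheme are assumptions made for this example.

import numpy as np

def condense(X, y, seed=0):
    """Hart's Condensed Nearest Neighbour (one possible PRS): keep only the
    prototypes needed to classify every training point correctly with 1-NN."""
    order = np.random.default_rng(seed).permutation(len(X))
    keep = [int(order[0])]
    changed = True
    while changed:
        changed = False
        for i in order:
            dists = np.linalg.norm(X[keep] - X[i], axis=1)
            nearest = keep[int(np.argmin(dists))]
            if y[nearest] != y[i]:
                keep.append(int(i))
                changed = True
    return X[keep], y[keep]

def llr_weights(x, neighbours, reg=1e-3):
    """Reconstruction weights of the query x from its neighbours:
    minimise ||x - sum_i w_i n_i||^2 subject to sum(w) = 1, via a
    regularised linear solve standing in for the LLR quadratic programme."""
    D = neighbours - x                           # neighbour offsets from the query
    G = D @ D.T                                  # local Gram matrix
    G = G + reg * np.trace(G) * np.eye(len(G))   # ridge term for numerical stability
    w = np.linalg.solve(G, np.ones(len(G)))
    return w / w.sum()                           # enforce the sum-to-one constraint

def llr_classify(x, X, y, k=5):
    """Weighted vote over the k nearest prototypes using LLR-style weights."""
    idx = np.argsort(np.linalg.norm(X - x, axis=1))[:k]
    w = llr_weights(x, X[idx])
    classes = np.unique(y[idx])
    votes = [w[y[idx] == c].sum() for c in classes]
    return classes[int(np.argmax(votes))]

if __name__ == "__main__":
    # Two Gaussian classes; condense once, then classify with the reduced set.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0.0, 1.0, (200, 2)), rng.normal(3.0, 1.0, (200, 2))])
    y = np.array([0] * 200 + [1] * 200)
    Xr, yr = condense(X, y)
    query = np.array([2.5, 2.5])
    print(len(Xr), "prototypes kept out of", len(X))
    print("predicted class:", llr_classify(query, Xr, yr))

Reducing the reference set once up front is what makes the per-query step cheaper in this sketch: both the nearest-neighbour search and the k-by-k local system are then built over far fewer candidate prototypes.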