Published in: Computer Vision and Pattern Recognition (CVPR), 2012 IEEE Conference on

Iterative Nearest Neighbors for classification and dimensionality reduction



Abstract

Representing data in terms of a set of selected samples is of interest for various machine learning applications, e.g. dimensionality reduction and classification. The best-known techniques probably still are k-Nearest Neighbors (kNN) and its variants. Recently, richer representations have become popular. Examples are methods based on l1-regularized least squares (Sparse Representation (SR)), on l2-regularized least squares (Collaborative Representation (CR)), or on l1-constrained least squares (Local Linear Embedding (LLE)). We propose Iterative Nearest Neighbors (INN), a novel sparse representation that combines the power of SR and LLE with the computational simplicity of kNN. We test our method on dimensionality reduction and classification, using standard benchmarks such as faces (AR), traffic signs (GTSRB), and PASCAL VOC 2007. INN performs better than NN and comparably to CR and SR, while being orders of magnitude faster than the latter.


