International Conference on Brain Informatics

Improving SNR and Reducing Training Time of Classifiers in Large Datasets via Kernel Averaging



Abstract

Kernel methods are of growing importance in neuroscience research. As an elegant extension of linear methods, they can model complex non-linear relationships. However, since the kernel matrix grows with the number of instances, training classifiers on large datasets is computationally demanding. Here, a technique developed for linear classifiers is extended to kernel methods: in linearly separable data, replacing sets of instances by their averages improves the signal-to-noise ratio (SNR) and reduces data size. In kernel methods, data that is not linearly separable in input space becomes linearly separable in the high-dimensional feature space that kernel methods implicitly operate in. It is shown that a classifier can be trained efficiently on instances averaged in feature space by averaging the corresponding entries of the kernel matrix. Experiments on artificial and publicly available data show that kernel averaging substantially improves classification performance and reduces training time, even for non-linearly separable data.
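The core idea of the abstract can be sketched in a few lines: because the kernel of two feature-space means equals the mean of the pairwise kernel entries, a classifier can be trained on averaged instances without ever forming feature-space vectors. The following is a minimal illustration, not the paper's implementation; the group size `k = 5`, the RBF kernel, the synthetic two-moons data, and the SVM classifier are all assumptions chosen for the sketch.

```python
# Sketch of kernel averaging: train an SVM on a block-averaged kernel matrix.
# Assumptions (not from the paper): RBF kernel, groups of k=5 same-class
# instances, two-moons toy data, sklearn's SVC with a precomputed kernel.
import numpy as np
from sklearn.datasets import make_moons
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def average_kernel(K, groups):
    """Entry (i, j) of the reduced kernel is the mean of K over
    group i x group j, i.e. the kernel between the feature-space
    means of the two groups."""
    G = len(groups)
    Kavg = np.empty((G, G))
    for i, gi in enumerate(groups):
        for j, gj in enumerate(groups):
            Kavg[i, j] = K[np.ix_(gi, gj)].mean()
    return Kavg

X, y = make_moons(n_samples=400, noise=0.3, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

# Build same-class groups of k instances each (leftovers are dropped).
k = 5
groups, ygrp = [], []
for c in np.unique(ytr):
    idx = np.flatnonzero(ytr == c)
    for start in range(0, len(idx) - k + 1, k):
        groups.append(idx[start:start + k])
        ygrp.append(c)

# Full training kernel, then block-averaged to one row/column per group.
K = rbf_kernel(Xtr, Xtr)
Ktrain = average_kernel(K, groups)
clf = SVC(kernel="precomputed").fit(Ktrain, ygrp)

# Prediction: average the test-vs-train kernel over each training group.
Kte = rbf_kernel(Xte, Xtr)
Ktest = np.stack([Kte[:, g].mean(axis=1) for g in groups], axis=1)
acc = clf.score(Ktest, yte)
```

The reduced kernel here is `len(groups) x len(groups)` instead of `300 x 300`, so the quadratic-programming step of SVM training operates on a much smaller matrix, which is the source of the training-time reduction the abstract describes.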
