Knowledge and Information Systems

Comparison of different weighting schemes for the kNN classifier on time-series data


Abstract

Many well-known machine learning algorithms have been applied to the task of time-series classification, including decision trees, neural networks, support vector machines and others. However, it was shown that the simple 1-nearest neighbor (1NN) classifier, coupled with an elastic distance measure like Dynamic Time Warping (DTW), often produces better results than more complex classifiers on time-series data, including k-nearest neighbor (kNN) for values of k > 1. In this article, we revisit the kNN classifier on time-series data by considering ten classic distance-based vote weighting schemes in the context of Euclidean distance, as well as four commonly used elastic distance measures: DTW, Longest Common Subsequence, Edit Distance with Real Penalty and Edit Distance on Real sequence. Through experiments on the complete collection of UCR time-series datasets, we confirm the view that the 1NN classifier is very hard to beat. Overall, for all considered distance measures, we found that variants of the Dudani weighting scheme produced the best results.
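The abstract names the Dudani vote weighting scheme and DTW but does not spell them out. As a minimal illustrative sketch (not the paper's own code; the function names and the squared pointwise cost inside DTW are assumptions), a distance-weighted kNN using the classic linear Dudani (1976) weighting w_i = (d_k - d_i) / (d_k - d_1) could look like this:

```python
import numpy as np

def dtw_distance(x, y):
    """Classic dynamic-programming DTW distance between two 1-D series."""
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (x[i - 1] - y[j - 1]) ** 2
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return np.sqrt(D[n, m])

def dudani_weights(sorted_dists):
    """Linear Dudani weighting: nearest neighbor gets weight 1, the k-th
    neighbor gets weight 0; falls back to equal weights if all distances tie."""
    d1, dk = sorted_dists[0], sorted_dists[-1]
    if dk == d1:
        return np.ones_like(sorted_dists)
    return (dk - sorted_dists) / (dk - d1)

def knn_predict(query, train_X, train_y, k=5, dist=dtw_distance):
    """Weighted majority vote over the k nearest training series."""
    train_y = np.asarray(train_y)
    dists = np.array([dist(query, x) for x in train_X])
    order = np.argsort(dists)[:k]            # indices of the k nearest neighbors
    weights = dudani_weights(dists[order])   # distances are sorted ascending here
    votes = {}
    for label, w in zip(train_y[order], weights):
        votes[label] = votes.get(label, 0.0) + w
    return max(votes, key=votes.get)
```

Setting k = 1 reduces this to the 1NN-DTW baseline that the authors report as very hard to beat; swapping `dist` for Euclidean distance or another elastic measure reproduces the other configurations compared in the study.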

