JMLR: Workshop and Conference Proceedings

Soft-DTW: a Differentiable Loss Function for Time-Series



Abstract

We propose in this paper a differentiable learning loss between time series, building upon the celebrated dynamic time warping (DTW) discrepancy. Unlike the Euclidean distance, DTW can compare time series of variable size and is robust to shifts or dilatations across the time dimension. To compute DTW, one typically solves a minimal-cost alignment problem between two time series using dynamic programming. Our work takes advantage of a smoothed formulation of DTW, called soft-DTW, that computes the soft-minimum of all alignment costs. We show in this paper that soft-DTW is a differentiable loss function, and that both its value and gradient can be computed with quadratic time/space complexity (DTW has quadratic time but linear space complexity). We show that this regularization is particularly well suited to average and cluster time series under the DTW geometry, a task for which our proposal significantly outperforms existing baselines (Petitjean et al., 2011). Next, we propose to tune the parameters of a machine that outputs time series by minimizing its fit with ground-truth labels in a soft-DTW sense. Source code is available at https://github.com/mblondel/soft-dtw.
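The abstract's core idea, replacing the hard minimum in the DTW recursion with a soft-minimum, can be sketched as a small dynamic program. The following is a minimal illustration, not the paper's reference implementation: it assumes a 1-D squared Euclidean ground cost and a smoothing parameter `gamma > 0`, and computes the soft-DTW value in O(nm) time and space.

```python
import numpy as np

def softmin3(a, b, c, gamma):
    """Soft-minimum of three values: -gamma * log(sum_i exp(-x_i / gamma)).

    Computed in a numerically stable way (log-sum-exp trick); as gamma -> 0
    this tends to the ordinary min, recovering classical DTW.
    """
    vals = np.array([a, b, c]) / -gamma
    m = vals.max()
    return -gamma * (m + np.log(np.exp(vals - m).sum()))

def soft_dtw(x, y, gamma=1.0):
    """Soft-DTW discrepancy between 1-D series x and y.

    Same Bellman recursion as DTW, with min replaced by softmin3, so the
    resulting value is a differentiable function of the inputs.
    """
    n, m = len(x), len(y)
    R = np.full((n + 1, m + 1), np.inf)  # R[i, j]: soft cost of aligning prefixes
    R[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (x[i - 1] - y[j - 1]) ** 2  # ground cost delta(x_i, y_j)
            R[i, j] = cost + softmin3(R[i - 1, j],      # insertion
                                      R[i, j - 1],      # deletion
                                      R[i - 1, j - 1],  # match
                                      gamma)
    return R[n, m]
```

For small `gamma` the value approaches the classical DTW cost, e.g. `soft_dtw([1, 2, 3], [1, 2, 2, 3], gamma=0.01)` is close to 0 because a zero-cost warping path exists; the gradient mentioned in the abstract is obtained by backpropagating through this same table (or, as in the paper, via a quadratic-time backward recursion).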


