Venue: Conference on Neural Information Processing Systems

Enhancing the Locality and Breaking the Memory Bottleneck of Transformer on Time Series Forecasting



Abstract

Time series forecasting is an important problem across many domains, including predictions of solar plant energy output, electricity consumption, and traffic jams. In this paper, we propose to tackle such forecasting problems with the Transformer [1]. Although impressed by its performance in our preliminary study, we found two major weaknesses: (1) locality-agnostic: the point-wise dot-product self-attention in the canonical Transformer architecture is insensitive to local context, which can make the model prone to anomalies in time series; (2) memory bottleneck: the space complexity of the canonical Transformer grows quadratically with the sequence length L, making directly modeling long time series infeasible. To solve these two issues, we first propose convolutional self-attention, which produces queries and keys with causal convolution so that local context can be better incorporated into the attention mechanism. Then, we propose the LogSparse Transformer with only O(L(log L)^2) memory cost, improving forecasting accuracy for time series with fine granularity and strong long-term dependencies under a constrained memory budget. Our experiments on both synthetic data and real-world datasets show that it compares favorably to the state-of-the-art.
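The two mechanisms in the abstract can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the function names and shapes are my own, and the causal convolution here stands in for the convolutional layer that produces queries and keys, while `logsparse_indices` enumerates an exponentially spaced attention pattern of the kind the LogSparse Transformer uses (each cell attends to O(log L) earlier cells instead of all L).

```python
import numpy as np

def causal_conv1d(x, w):
    """Causal 1-D convolution: the output at step t depends only on
    x[t-k+1 .. t], never on future steps.
    x: (L, d) input sequence; w: (k, d, d) kernel. Left-pads with zeros."""
    k = w.shape[0]
    L, d = x.shape
    xp = np.concatenate([np.zeros((k - 1, d)), x], axis=0)  # left pad only
    out = np.zeros((L, d))
    for t in range(L):
        for j in range(k):
            # xp[t + j] corresponds to input position t - (k - 1) + j <= t
            out[t] += xp[t + j] @ w[j]
    return out

def logsparse_indices(t):
    """Cells that position t may attend to under an exponentially spaced
    (log-sparse) pattern: itself, t-1, t-2, t-4, t-8, ... down to 0.
    Only O(log t) indices per cell, vs. O(t) for full self-attention."""
    idx = {t}
    step = 1
    while t - step >= 0:
        idx.add(t - step)
        step *= 2
    return sorted(idx)
```

With O(log L) attended cells per position, each layer costs O(L log L) memory; stacking O(log L) such layers so that information can still flow between any pair of positions yields the O(L(log L)^2) total cost quoted in the abstract.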
