Data Compression Conference

An Empirical Analysis of Recurrent Learning Algorithms in Neural Lossy Image Compression Systems



Abstract

Recent advances in deep learning have produced image compression algorithms that outperform JPEG and JPEG 2000 on the standard Kodak benchmark. However, these models are slow to train (due to backpropagation through time) and, to the best of our knowledge, have not been systematically evaluated across a wide variety of datasets. In this paper, we perform the first large-scale comparison of recent state-of-the-art hybrid neural compression algorithms, while exploring the effects of alternative training strategies (where applicable). The hybrid recurrent neural decoder is a former state-of-the-art model (recently overtaken by a Google model) that can be trained using backpropagation through time (BPTT) or with alternative algorithms such as sparse attentive backtracking (SAB), unbiased online recurrent optimization (UORO), and real-time recurrent learning (RTRL). We compare these training alternatives, along with the Google models (GOOG and E2E), on six benchmark datasets. Surprisingly, we find that the model trained with SAB performs best (outperforming even BPTT), yielding faster convergence and a better peak signal-to-noise ratio.
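The abstract's quality metric, peak signal-to-noise ratio (PSNR), has a standard closed form that is independent of the compression models being compared. As a minimal sketch (not the paper's own evaluation code), PSNR between an original image and its lossy reconstruction can be computed as:

```python
import numpy as np

def psnr(original, reconstruction, max_val=255.0):
    """Peak signal-to-noise ratio in dB: 10 * log10(max_val^2 / MSE)."""
    mse = np.mean((original.astype(np.float64) -
                   reconstruction.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images: no distortion
    return 10.0 * np.log10(max_val ** 2 / mse)

# A uniform pixel error of 1 on 8-bit images gives
# 10 * log10(255^2 / 1) ≈ 48.13 dB.
a = np.full((8, 8), 100, dtype=np.uint8)
b = a + 1
print(round(psnr(a, b), 2))  # 48.13
```

Higher PSNR means lower mean-squared distortion at a given bit rate, which is the sense in which the SAB-trained model is reported to outperform the BPTT-trained one.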

