IAPR International Conference on Document Analysis and Recognition

A Compact CNN-DBLSTM Based Character Model for Offline Handwriting Recognition with Tucker Decomposition



Abstract

Recently, character models based on an integrated convolutional neural network (CNN) and deep bidirectional long short-term memory (DBLSTM) have achieved excellent performance for offline handwriting recognition (HWR). To deploy a CNN-DBLSTM model in products, it is necessary to reduce its memory footprint and runtime latency as much as possible. In this paper, we study two methods to compress the CNN part: (1) use Tucker decomposition to factor the pre-trained weights into a low-rank approximation, followed by fine-tuning; (2) use grouped convolution to construct sparse connections in the channel domain. Experiments have been conducted on a large-scale offline English HWR task to compare the effectiveness of the two techniques. Our results show that Tucker decomposition alone offers a good solution for building a compact CNN-DBLSTM model, significantly reducing both footprint and latency without degrading recognition accuracy.
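The second compression method, grouped convolution, splits the channels into groups so each output channel only connects to one group of input channels, cutting the weight count by the group factor. A naive numpy sketch (stride 1, no padding; the function name and tensor layout are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def grouped_conv2d(x, W, groups):
    """Naive grouped 2-D convolution, stride 1, no padding.
    x: input of shape (C_in, H, W_img)
    W: weights of shape (C_out, C_in // groups, kH, kW)
    Each group of output channels sees only its own slice of
    input channels, so W has groups-times fewer parameters than
    a dense conv of the same channel counts."""
    C_in, H, Wd = x.shape
    C_out, cpg_in, kH, kW = W.shape
    assert C_in % groups == 0 and C_out % groups == 0
    assert cpg_in == C_in // groups
    out_per_g = C_out // groups
    Ho, Wo = H - kH + 1, Wd - kW + 1
    y = np.zeros((C_out, Ho, Wo))
    for g in range(groups):
        xg = x[g * cpg_in:(g + 1) * cpg_in]   # this group's input slice
        for oc in range(out_per_g):
            w = W[g * out_per_g + oc]          # (cpg_in, kH, kW)
            for i in range(Ho):
                for j in range(Wo):
                    y[g * out_per_g + oc, i, j] = np.sum(
                        xg[:, i:i + kH, j:j + kW] * w)
    return y
```

With `groups=1` this reduces to an ordinary dense convolution; with `groups=C_in` it becomes a depthwise convolution. Frameworks expose the same idea directly (e.g. a `groups` argument on their 2-D convolution layers) with far more efficient kernels than this reference loop.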
