International Conference on Pattern Recognition

Self-Attention Based Network for Punctuation Restoration



Abstract

Inserting proper punctuation into Automatic Speech Recognizer (ASR) transcriptions is a challenging and promising task in real-time Spoken Language Translation (SLT). Traditional methods built on the sequence-labelling framework handle joint punctuation poorly. To tackle this problem, we propose a novel self-attention based network. In this work, a lightweight neural network is proposed that extracts hidden features based solely on self-attention, without any Recurrent Neural Networks (RNNs) or Convolutional Neural Networks (CNNs). We conduct extensive experiments on complex punctuation tasks. The experimental results show that the proposed model achieves significant improvements on the joint punctuation task, while also outperforming traditional methods on the simple punctuation task.
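The abstract frames punctuation restoration as per-token labelling over hidden features produced purely by self-attention. As a minimal illustrative sketch (not the paper's architecture — the layer sizes, label set, and single-head attention here are assumptions for demonstration), a scaled dot-product self-attention pass followed by a per-token classifier looks like this:

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Single-head scaled dot-product self-attention:
    every token position attends to all positions in the sequence."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.T / np.sqrt(k.shape[-1])
    # row-wise softmax over the key positions
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

rng = np.random.default_rng(0)
seq_len, d_model, n_labels = 5, 16, 4  # hypothetical labels: O, COMMA, PERIOD, QUESTION
x = rng.standard_normal((seq_len, d_model))          # token embeddings for one ASR segment
w_q, w_k, w_v = (rng.standard_normal((d_model, d_model)) for _ in range(3))
w_out = rng.standard_normal((d_model, n_labels))     # per-token punctuation classifier

hidden = self_attention(x, w_q, w_k, w_v)            # no RNN/CNN in the feature extractor
logits = hidden @ w_out                              # one label distribution per token
print(logits.shape)
```

Because attention connects every pair of positions directly, a token's punctuation decision can condition on distant context in one step, which is the property the paper exploits in place of recurrent or convolutional feature extractors.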
