Proceedings of the 21st International Conference on Pattern Recognition

Long-short term memory neural networks language modeling for handwriting recognition



Abstract

Unconstrained handwritten text recognition systems maximize the combination of two separate probability scores. The first is the observation probability, which indicates how well the returned word sequence matches the input image. The second is the probability that reflects how likely a word sequence is according to a language model. Current state-of-the-art recognition systems use statistical language models in the form of bigram word probabilities. This paper proposes to model the target language by means of a recurrent neural network with long short-term memory cells. Because the network is recurrent, the considered context is not limited to a fixed size, especially as the memory cells are designed to deal with long-term dependencies. In a set of experiments conducted on the IAM off-line database, we show the superiority of the proposed language model over statistical n-gram models.
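The scoring scheme described above can be illustrated with a minimal sketch. The names and toy counts below are our own, not from the paper: a recognizer ranks candidate word sequences by a log-linear combination of the optical observation score and the language-model score, and a bigram model (the fixed-context baseline the paper compares against) conditions each word only on its single predecessor.

```python
import math

def bigram_logprob(words, bigram_counts, unigram_counts, vocab_size, alpha=1.0):
    """Add-alpha smoothed bigram log-probability.

    The context is fixed to one preceding word -- the limitation that the
    paper's recurrent LSTM language model is designed to overcome.
    """
    lp = 0.0
    for prev, cur in zip(["<s>"] + words, words):
        num = bigram_counts.get((prev, cur), 0) + alpha
        den = unigram_counts.get(prev, 0) + alpha * vocab_size
        lp += math.log(num / den)
    return lp

def combined_score(obs_logprob, lm_logprob, lam=0.7):
    """Log-linear combination maximized by the recognizer:
    log P(obs | words) + lam * log P(words)."""
    return obs_logprob + lam * lm_logprob

# Toy counts standing in for a training corpus (illustrative only).
unigrams = {"<s>": 2, "the": 2, "cat": 1, "sat": 1}
bigrams = {("<s>", "the"): 2, ("the", "cat"): 1, ("cat", "sat"): 1}

good = bigram_logprob(["the", "cat", "sat"], bigrams, unigrams, vocab_size=4)
bad = bigram_logprob(["sat", "cat", "the"], bigrams, unigrams, vocab_size=4)
assert good > bad  # the fluent ordering receives the higher LM score
```

With equal observation scores, the language model alone breaks the tie; an LSTM-based model would play the same role in `combined_score`, but its recurrent state lets it condition each word on the entire preceding history rather than a single word.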


