Annual Meeting of the Association for Computational Linguistics

Language Production Dynamics with Recurrent Neural Networks



Abstract

We present an analysis of the internal mechanism of the recurrent neural model of sentence production presented by Calvillo et al. (2016). The results show clear patterns of computation related to each layer in the network, allowing us to infer an algorithmic account in which the semantics activates the semantically related words, each word generated at a given time step activates syntactic and semantic constraints on possible continuations, and the recurrence preserves information through time. We propose that such insights could generalize to other models with a similar architecture, including some used in computational linguistics for language modeling, machine translation, and image caption generation.
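The architecture described in the abstract can be sketched minimally as a simple recurrent network that receives a fixed semantic input at every time step, feeds back the previously produced word, and carries context through a recurrent hidden layer. All dimensions, weights, and the greedy decoding loop below are illustrative assumptions, not details taken from Calvillo et al. (2016):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative layer sizes (hypothetical, not from the paper)
SEM, HID, VOCAB = 10, 16, 8

# Weight matrices: semantics -> hidden, hidden -> hidden (recurrence),
# previous word -> hidden, and hidden -> word output
W_sem = rng.normal(0, 0.1, (HID, SEM))
W_rec = rng.normal(0, 0.1, (HID, HID))
W_prev = rng.normal(0, 0.1, (HID, VOCAB))
W_out = rng.normal(0, 0.1, (VOCAB, HID))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def produce(semantics, max_len=5):
    """Greedy word-by-word production. The semantics vector stays on as
    input at every step, while the recurrent state preserves information
    through time."""
    h = np.zeros(HID)
    prev = np.zeros(VOCAB)  # one-hot vector of the previously produced word
    sentence = []
    for _ in range(max_len):
        # Each term mirrors one part of the algorithmic account: semantics
        # activates related words; the previous word and the recurrent
        # state constrain possible continuations.
        h = np.tanh(W_sem @ semantics + W_rec @ h + W_prev @ prev)
        p = softmax(W_out @ h)  # distribution over the next word
        w = int(p.argmax())
        sentence.append(w)
        prev = np.zeros(VOCAB)
        prev[w] = 1.0
    return sentence

print(produce(rng.normal(size=SEM)))
```

The same input-feeding pattern (a conditioning vector injected at each decoding step plus a recurrent state) is what makes the authors' proposed generalization to machine-translation decoders and image captioning models plausible.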
