
Words Are Not Temporal Sequences of Characters

International Joint Conference on Neural Networks


Abstract

Language modeling is a valuable component of generative natural language processing (NLP) tasks, and benefits from explicit representations of the inherent hierarchies in language. We investigate a commonly used architecture that captures the concept that words are built from characters, and modify the word encoding mechanism to use a feed-forward neural network rather than a recurrent neural network (RNN). This feed-forward architecture yields increased performance and a reduction in the number of parameters compared with models that use common RNN implementations. We investigate whether word representations benefit from position-invariant features in the characters, and find that fixed-position representations are sufficient.
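The core idea of the abstract, replacing a character-level RNN word encoder with a feed-forward layer over fixed-position character embeddings, can be sketched as follows. This is a minimal illustration and not the authors' implementation; the module name, dimensions, and padding scheme are all assumptions made for the example.

# Minimal sketch (assumed details, not the paper's code): build a word
# embedding from its characters with a single feed-forward layer over
# fixed-position character embeddings, instead of running a character RNN.
import torch
import torch.nn as nn

class FeedForwardWordEncoder(nn.Module):
    def __init__(self, n_chars=256, char_dim=16, max_word_len=20, word_dim=128):
        super().__init__()
        self.max_word_len = max_word_len
        # Index 0 is reserved as padding for words shorter than max_word_len.
        self.char_emb = nn.Embedding(n_chars, char_dim, padding_idx=0)
        # Character embeddings keep their absolute positions: they are
        # concatenated in order and mixed by one feed-forward layer, so no
        # position-invariant (recurrent or convolutional) machinery is used.
        self.proj = nn.Sequential(
            nn.Linear(max_word_len * char_dim, word_dim),
            nn.ReLU(),
        )

    def forward(self, char_ids):
        # char_ids: (batch, max_word_len) integer character indices,
        # right-padded with 0.
        e = self.char_emb(char_ids)       # (batch, max_word_len, char_dim)
        flat = e.flatten(start_dim=1)     # fixed-position concatenation
        return self.proj(flat)            # (batch, word_dim)

# Toy usage: encode a batch of two padded "words".
enc = FeedForwardWordEncoder()
ids = torch.randint(1, 256, (2, 20))
print(enc(ids).shape)  # torch.Size([2, 128])

Because each character contributes through a fixed slot in the concatenation, this encoder has no sequential state to carry between characters, which is where the parameter and speed savings over an RNN encoder come from.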
