IEEE International Conference of Safety Produce Informatization

An Improved LSTM Structure for Natural Language Processing



Abstract

Natural language processing (NLP) technology is widely used in artificial intelligence fields such as machine translation, human-computer interaction, and speech recognition. NLP is a daunting task due to the variability, ambiguity, and context-dependent interpretation of human language. Current deep learning techniques have made great progress on NLP tasks. However, many NLP systems still face practical problems, such as high training complexity, computational difficulties in large-scale content scenarios, high retrieval complexity, and a lack of probabilistic significance. This paper proposes an improved NLP method based on the long short-term memory (LSTM) structure, in which parameters are randomly discarded as they are passed backwards through the recurrent projection layer. Compared with the baseline and other LSTM variants, the improved method achieves better F1 scores on the Wall Street Journal dataset with both word2vec and one-hot word vectors, indicating that our method is better suited to NLP under limited computing resources and large amounts of data.
