IEEE International Conference on Safety Produce Informatization

An Improved LSTM Structure for Natural Language Processing

Abstract

Natural language processing (NLP) technology is widely used in artificial intelligence fields such as machine translation, human-computer interaction and speech recognition. NLP is a daunting task due to the variability, ambiguity and context-dependent interpretation of human language. Current deep learning techniques have made great progress in NLP. However, many NLP systems still face practical problems, such as high training complexity, computational difficulties in large-scale content scenarios, high retrieval complexity and a lack of probabilistic significance. This paper proposes an improved NLP method based on the long short-term memory (LSTM) structure, in which parameters are randomly discarded as they are passed backwards through the recurrent projection layer. Compared with the baseline and other LSTM variants, the improved method achieves better F1 scores on the Wall Street Journal dataset with both word2vec and one-hot word vectors, indicating that the method is better suited to NLP under limited computing resources and large amounts of data.
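The core mechanism described in the abstract, an LSTM with a recurrent projection layer whose outputs are randomly discarded during training, can be sketched as follows. This is a minimal PyTorch illustration, not the authors' implementation: the class name ProjectedLSTMCell, the dropout rate and all dimensions are assumptions made for the example, and dropout on the projection output is one plausible reading of "parameters are randomly discarded when passed backwards".

    # Minimal sketch (assumed, not the paper's code) of an LSTM cell with a
    # recurrent projection layer and dropout applied to the projected state.
    import torch
    import torch.nn as nn

    class ProjectedLSTMCell(nn.Module):
        def __init__(self, input_size: int, hidden_size: int,
                     proj_size: int, drop_p: float = 0.2):
            super().__init__()
            # Gates are computed from the input and the *projected* recurrent state.
            self.gates = nn.Linear(input_size + proj_size, 4 * hidden_size)
            # Recurrent projection layer: hidden state -> lower-dimensional output.
            self.proj = nn.Linear(hidden_size, proj_size, bias=False)
            # Dropout on the projection output; only active in training mode.
            self.drop = nn.Dropout(drop_p)

        def forward(self, x, state):
            r, c = state  # r: projected recurrent state, c: cell state
            z = self.gates(torch.cat([x, r], dim=-1))
            i, f, g, o = z.chunk(4, dim=-1)
            c = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g)
            h = torch.sigmoid(o) * torch.tanh(c)
            # Project the hidden state and randomly discard components in training.
            r = self.drop(self.proj(h))
            return r, (r, c)

    # Usage: run one step on a toy batch (sizes are arbitrary).
    cell = ProjectedLSTMCell(input_size=100, hidden_size=256, proj_size=128)
    x = torch.randn(8, 100)
    state = (torch.zeros(8, 128), torch.zeros(8, 256))
    out, state = cell(x, state)

The projection layer reduces the number of recurrent parameters compared with a plain LSTM of the same hidden size, which fits the paper's stated goal of limited computing resources; note that PyTorch's built-in nn.LSTM also exposes a proj_size argument, although its dropout option is applied between stacked layers rather than on the projection output.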
