International Conference on Artificial Intelligence: Applications and Innovations

Domain Specific word Embedding Matrix for Training Neural Networks

Abstract

Text is one of the most widespread forms of sequential data and is therefore well suited to deep learning models designed for sequences. Deep learning for natural language processing is pattern recognition applied to words, sentences, and paragraphs. This study describes the process of creating a pre-trained word embedding matrix and its subsequent use in various neural network models for domain-specific text classification. Word embedding is one of the most popular ways of associating vectors with words. The mapping encoded in a word embedding matrix is meant to capture the semantic relationships between words, which can vary from task to task.
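The abstract only outlines the pipeline, so the sketch below illustrates how a domain-specific embedding matrix is commonly built and then reused to initialise an embedding layer in a text classifier. It assumes a gensim 4.x / TensorFlow (Keras) stack; the toy corpus, vector size, and network layers are placeholders for illustration, not the configuration reported in the paper.

```python
# Minimal sketch: train domain-specific word vectors, assemble an embedding
# matrix, and reuse it as a frozen embedding layer in a classification model.
# Corpus, dimensions, and architecture are illustrative assumptions only.
import numpy as np
from gensim.models import Word2Vec
import tensorflow as tf

# 1. Train word vectors on a tokenised domain-specific corpus.
domain_corpus = [
    ["patient", "reported", "acute", "chest", "pain"],
    ["dosage", "adjusted", "after", "renal", "function", "test"],
]  # placeholder sentences; a real domain corpus would be far larger
w2v = Word2Vec(domain_corpus, vector_size=100, window=5, min_count=1, epochs=50)

# 2. Build the pre-trained embedding matrix: one row per vocabulary word,
#    with row 0 reserved for padding / out-of-vocabulary tokens.
vocab = w2v.wv.key_to_index                     # word -> integer index
embedding_dim = w2v.vector_size
embedding_matrix = np.zeros((len(vocab) + 1, embedding_dim))
for word, idx in vocab.items():
    embedding_matrix[idx + 1] = w2v.wv[word]

# 3. Reuse the matrix as a frozen embedding layer in a neural classifier.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(
        input_dim=embedding_matrix.shape[0],
        output_dim=embedding_dim,
        embeddings_initializer=tf.keras.initializers.Constant(embedding_matrix),
        trainable=False,                        # keep the domain vectors fixed
    ),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```

Freezing the embedding layer (trainable=False) keeps the domain semantics learned from the unlabelled corpus intact; making it trainable instead lets the vectors adapt to the specific classification task at the risk of overfitting on small labelled sets.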
