
Towards Shallow Semantics: The OntoNotes Project

Abstract

Many natural language processing (NLP) applications could benefit from a richer model of text meaning than the bag-of-words and n-gram models that currently predominate. Despite theoretical interest since the 1960s, however, no large-scale model exists; in fact, it is not even clear what such a model should minimally include. Nevertheless, the introduction of large-scale public resources such as the Penn TreeBank and WordNet has generated a great deal of progress in the NLP community, and so it seems increasingly important to create some kind of meaning-oriented model and build a corresponding corpus that is large enough to support adequate machine learning.
