Conference on Empirical Methods in Natural Language Processing

Numerically Grounded Language Models for Semantic Error Correction

Abstract

Semantic error detection and correction is an important task for applications such as fact checking, speech-to-text or grammatical error correction. Current approaches generally focus on relatively shallow semantics and do not account for numeric quantities. Our approach uses language models grounded in numbers within the text. Such groundings are easily achieved for recurrent neural language model architectures, which can be further conditioned on incomplete background knowledge bases. Our evaluation on clinical reports shows that numerical grounding improves perplexity by 33% and F1 for semantic error correction by 5 points when compared to ungrounded approaches. Conditioning on a knowledge base yields further improvements.
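The abstract describes grounding a recurrent neural language model in the numeric quantities that appear in the text. Below is a minimal sketch of that idea, assuming a PyTorch LSTM language model in which each token embedding is concatenated with the scalar value the token denotes (0.0 for non-numeric tokens); the class name, toy vocabulary, and dimensions are illustrative and not the authors' implementation.

import torch
import torch.nn as nn

class NumericallyGroundedLM(nn.Module):
    # Sketch: an LSTM language model whose input at each step is the word
    # embedding concatenated with the numeric value that token denotes
    # (0.0 for non-numeric tokens), so predictions are conditioned on the
    # magnitudes occurring in the report.
    def __init__(self, vocab_size, embed_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.LSTM(embed_dim + 1, hidden_dim, batch_first=True)  # +1 for the value feature
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, token_ids, numeric_values):
        # token_ids:      (batch, seq_len) long tensor of vocabulary indices
        # numeric_values: (batch, seq_len) float tensor, 0.0 where non-numeric
        x = self.embed(token_ids)
        x = torch.cat([x, numeric_values.unsqueeze(-1)], dim=-1)
        h, _ = self.rnn(x)
        return self.out(h)  # next-token logits at every position

# Toy usage on a fragment like "age is 27", where "<num>" carries the value 27.0.
vocab = {"<unk>": 0, "age": 1, "is": 2, "<num>": 3}
model = NumericallyGroundedLM(vocab_size=len(vocab))
tokens = torch.tensor([[vocab["age"], vocab["is"], vocab["<num>"]]])
values = torch.tensor([[0.0, 0.0, 27.0]])
logits = model(tokens, values)  # shape: (1, 3, 4)

The further conditioning on an incomplete background knowledge base mentioned in the abstract could, under the same kind of simplification, be added as extra features concatenated to the same input; that part is left out of this sketch.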
