Annual Meeting of the Association for Computational Linguistics; International Joint Conference on Natural Language Processing

Don't Let Discourse Confine Your Model: Sequence Perturbations for Improved Event Language Models



Abstract

Event language models represent plausible sequences of events. Most existing approaches train autoregressive models on text, which successfully capture event co-occurrence but unfortunately constrain the model to follow the discourse order in which events are presented. Other domains may employ different discourse orders, and for many applications, we may care about different notions of ordering (e.g., temporal) or not care about ordering at all (e.g., when predicting related events in a schema). We propose a simple yet surprisingly effective strategy for improving event language models by perturbing event sequences so we can relax model dependence on text order. Despite generating completely synthetic event orderings, we show that this technique improves the performance of the event language models on both applications and out-of-domain events data.
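The core idea in the abstract, perturbing event sequences so the model cannot rely on discourse order, can be sketched as a simple data-augmentation step before language-model training. The function below is a minimal illustration, not the authors' actual implementation; the function name, parameters, and toy corpus are all assumptions for the sake of the example.

```python
import random

def perturb_event_sequence(events, p_shuffle=0.5, rng=None):
    """Illustrative augmentation: with probability p_shuffle, replace the
    discourse order of an event sequence with a random permutation, so an
    autoregressive model trained on the result sees synthetic orderings
    rather than only the order in which events appeared in text."""
    rng = rng or random.Random()
    if rng.random() < p_shuffle:
        events = events[:]  # copy so the original corpus is untouched
        rng.shuffle(events)
    return events

# Hypothetical toy corpus of event sequences (one script-like sequence).
corpus = [["enter restaurant", "order food", "eat", "pay", "leave"]]
augmented = [perturb_event_sequence(seq, p_shuffle=1.0, rng=random.Random(0))
             for seq in corpus]
```

In a real pipeline the augmented sequences would be mixed with (or substituted for) the original ones when building training batches, so the event LM learns co-occurrence without over-fitting to discourse order.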
