
Probing sentence embeddings for structure-dependent tense


Abstract

Learning universal sentence representations which accurately model sentential semantic content is a current goal of natural language processing research (Subramanian et al., 2018; Conneau et al., 2017; Wieting et al., 2016; Kiros et al., 2015). A prominent and successful approach is to train recurrent neural networks (RNNs) to encode sentences into fixed-length vectors (Conneau et al., 2018; Nie et al., 2017). Many core linguistic phenomena that one would like to model in universal sentence representations depend on syntactic structure (Chomsky, 1965; Everaert et al., 2015). Although RNNs have no explicit syntactic structural representations, beyond their widespread success in practical tasks there is some evidence that they can approximate such structure-dependent phenomena under certain conditions (Gulordava et al., 2018; McCoy et al., 2018; Linzen et al., 2016; Bowman et al., 2015).
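The core operation the abstract describes — an RNN reading a sentence token by token and emitting a fixed-length vector — can be sketched as follows. This is a hypothetical minimal Elman-style recurrence with randomly initialised toy weights and embeddings, not the authors' trained model; the vocabulary, hidden size, and weight names are all illustrative assumptions.

```python
# Minimal sketch (hypothetical, not the paper's model): an Elman-style RNN
# that reads a sentence word by word and returns its final hidden state as
# a fixed-length sentence vector.
import math
import random

random.seed(0)
HID = 4  # hidden size (illustrative)

# Toy vocabulary with random embeddings (stand-ins for learned ones).
vocab = {w: [random.uniform(-1, 1) for _ in range(HID)]
         for w in ["the", "dog", "barked", "cats", "sleep"]}

# Randomly initialised weights: W_x maps the input embedding,
# W_h maps the previous hidden state.
W_x = [[random.uniform(-0.5, 0.5) for _ in range(HID)] for _ in range(HID)]
W_h = [[random.uniform(-0.5, 0.5) for _ in range(HID)] for _ in range(HID)]

def encode(sentence):
    """Return a fixed-length vector (the final hidden state) for a sentence."""
    h = [0.0] * HID
    for word in sentence.split():
        x = vocab[word]
        # h_t = tanh(W_x x_t + W_h h_{t-1})
        h = [math.tanh(sum(W_x[i][j] * x[j] + W_h[i][j] * h[j]
                           for j in range(HID)))
             for i in range(HID)]
    return h

v1 = encode("the dog barked")
v2 = encode("cats sleep")
# Sentences of different lengths map to vectors of the same dimensionality.
print(len(v1), len(v2))  # → 4 4
```

The point of the sketch is only the interface: whatever the sentence length, the encoder compresses it into one vector of dimension `HID`, which is what lets downstream probing classifiers treat sentence representations uniformly.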
