
Speed Reading: Learning to Read ForBackward via Shuttle



Abstract

We present LSTM-Shuttle, which applies human speed-reading techniques to natural language processing tasks for accurate and efficient comprehension. In contrast to previous work, LSTM-Shuttle not only reads by shuttling forward but also goes back. Shuttling forward enables high efficiency, and going backward gives the model a chance to recover lost information, ensuring better prediction. We evaluate LSTM-Shuttle on sentiment analysis, news classification, and cloze tasks using the IMDB, Rotten Tomatoes, AG, and Children's Book Test datasets. We show that LSTM-Shuttle predicts both better and more quickly. To demonstrate how LSTM-Shuttle actually behaves, we also analyze the shuttling operation and present a case study.
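The abstract describes the shuttle mechanism only at a high level: the reader skips ahead for efficiency and occasionally jumps back to recover information it may have missed. The sketch below is one minimal, hypothetical way such a reader could look in PyTorch; the class name ShuttleReader, the window and offset hyperparameters, and the greedy argmax jump decision are all illustrative assumptions, not the paper's released implementation (a discrete jump like this is typically trained with a policy-gradient objective rather than the greedy choice shown here).

```python
# Minimal, hypothetical sketch of a shuttle-style reader (not the authors' code).
import torch
import torch.nn as nn


class ShuttleReader(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256,
                 window=10, max_offset=5, num_classes=2, max_shuttles=8):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Predicts a signed shuttle offset in [-max_offset, +max_offset]:
        # negative = go back and re-read, positive = skip ahead.
        self.offset_head = nn.Linear(hidden_dim, 2 * max_offset + 1)
        self.classifier = nn.Linear(hidden_dim, num_classes)
        self.window = window
        self.max_offset = max_offset
        self.max_shuttles = max_shuttles

    def forward(self, tokens):
        # tokens: LongTensor of shape (seq_len,) for a single document.
        seq_len = tokens.size(0)
        pos, state, h_last = 0, None, None
        for _ in range(self.max_shuttles):
            chunk = tokens[pos:pos + self.window].unsqueeze(0)  # (1, <=window)
            out, state = self.lstm(self.embed(chunk), state)
            h_last = out[:, -1, :]                              # summary of this chunk
            # Greedy jump for illustration; a discrete jump like this is
            # usually trained with policy gradient, not argmax.
            offset = self.offset_head(h_last).argmax(dim=-1).item() - self.max_offset
            pos = pos + self.window + offset * self.window      # jump in window units
            pos = max(0, min(pos, seq_len - 1))
            if pos >= seq_len - 1:
                break
        return self.classifier(h_last)


# Usage: classify a toy "document" of 200 random token ids.
model = ShuttleReader(vocab_size=1000)
doc = torch.randint(0, 1000, (200,))
logits = model(doc)
print(logits.shape)  # torch.Size([1, 2])
```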
