Natural Language Engineering

Abbreviated text input using language modeling

Abstract

We address the problem of improving the efficiency of natural language text input under degraded conditions (for instance, on mobile computing devices or by disabled users), by taking advantage of the informational redundancy in natural language. Previous approaches to this problem have been based on the idea of prediction of the text, but these require the user to take overt action to verify or select the system's predictions. We propose taking advantage of the duality between prediction and compression. We allow the user to enter text in compressed form, in particular, using a simple stipulated abbreviation method that reduces characters by 26.4%, yet is simple enough that it can be learned easily and generated relatively fluently. We decode the abbreviated text using a statistical generative model of abbreviation, with a residual word error rate of 3.3%. The chief component of this model is an n-gram language model. Because the system's operation is completely independent from the user's, the overhead from cognitive task switching and attending to the system's actions online is eliminated, opening up the possibility that the compression-based method can achieve text input efficiency improvements where the prediction-based methods have not. We report the results of a user study evaluating this method.
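
The abstract frames decoding as the inverse of a stipulated abbreviation scheme, scored by an n-gram language model. The sketch below is a minimal illustration of that noisy-channel idea: recover the word sequence that maximizes the language-model probability among candidates consistent with the abbreviated input. The vowel-dropping abbreviation rule, the toy corpus, the add-one-smoothed bigram model, and the deterministic channel (uniform over matching expansions) are all assumptions made for this example; they are not the paper's actual stipulated abbreviation method, model, or data.

```python
# Illustrative noisy-channel decoding of abbreviated text with a bigram LM.
# Hypothetical abbreviation rule and toy corpus; for illustration only.

import math
from collections import defaultdict

def abbreviate(word):
    """Hypothetical rule: keep the first letter, drop later vowels."""
    return word[0] + "".join(c for c in word[1:] if c not in "aeiou")

class BigramLM:
    """Tiny add-one-smoothed bigram language model over a toy corpus."""
    def __init__(self, sentences):
        self.unigrams = defaultdict(int)
        self.bigrams = defaultdict(int)
        self.vocab = set()
        for sent in sentences:
            tokens = ["<s>"] + sent.lower().split()
            for prev, cur in zip(tokens, tokens[1:]):
                self.unigrams[prev] += 1
                self.bigrams[(prev, cur)] += 1
                self.vocab.add(cur)

    def logprob(self, prev, cur):
        v = len(self.vocab) + 1
        return math.log((self.bigrams[(prev, cur)] + 1) /
                        (self.unigrams[prev] + v))

def decode(abbrev_tokens, lm, beam_size=10):
    """Beam search over candidate expansions of each abbreviated token."""
    # Candidates: any vocabulary word whose abbreviation matches; fall back
    # to the abbreviation itself for out-of-vocabulary tokens.
    expansions = lambda a: [w for w in lm.vocab if abbreviate(w) == a] or [a]
    beam = [(0.0, ["<s>"])]          # (cumulative log-probability, decoded words)
    for a in abbrev_tokens:
        new_beam = []
        for score, seq in beam:
            for w in expansions(a):
                new_beam.append((score + lm.logprob(seq[-1], w), seq + [w]))
        beam = sorted(new_beam, reverse=True)[:beam_size]
    best_score, best_seq = max(beam)
    return best_seq[1:]

if __name__ == "__main__":
    corpus = [
        "the quick brown fox jumps over the lazy dog",
        "the lazy dog sleeps over there",
    ]
    lm = BigramLM(corpus)
    abbreviated = [abbreviate(w) for w in "the lazy dog jumps".split()]
    print(abbreviated)              # ['th', 'lzy', 'dg', 'jmps']
    print(decode(abbreviated, lm))  # ['the', 'lazy', 'dog', 'jumps']
```

In the setting the abstract describes, a learned channel model of abbreviation and a much larger n-gram language model would take the place of the deterministic matching and toy corpus used here.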
