IEEE International Conference on Acoustics, Speech and Signal Processing

Regularization of context-dependent deep neural networks with context-independent multi-task training



Abstract

The use of context-dependent targets has become standard in hybrid DNN systems for automatic speech recognition. However, we argue that despite the use of state-tying, optimising to context-dependent targets can lead to over-fitting, and that discriminating between arbitrary tied context-dependent targets may not be optimal. We propose a multitask learning method where the network jointly predicts context-dependent and monophone targets. We evaluate the method on a large-vocabulary lecture recognition task and show that it yields relative improvements of 3–10% over baseline systems.
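The multi-task objective described above combines a context-dependent (senone) cross-entropy term with a monophone cross-entropy term computed from a second output layer on the same shared network. The sketch below illustrates that joint loss in plain Python; the interpolation weight `lam` and the toy logit vectors are illustrative assumptions, not values from the paper.

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def cross_entropy(logits, target):
    """Negative log-probability of the target class."""
    return -math.log(softmax(logits)[target])

def multitask_loss(cd_logits, cd_target, mono_logits, mono_target, lam=0.3):
    # Joint objective: context-dependent loss plus a weighted
    # monophone (context-independent) loss from a second softmax head.
    # `lam` is an assumed interpolation weight, not taken from the paper.
    return (cross_entropy(cd_logits, cd_target)
            + lam * cross_entropy(mono_logits, mono_target))

# Toy example: 3 tied CD states, 2 monophone classes (hypothetical sizes).
cd_logits = [2.0, 0.5, -1.0]
mono_logits = [1.0, 0.0]
loss = multitask_loss(cd_logits, 0, mono_logits, 1)
```

At decoding time only the context-dependent head would be used; the monophone head serves purely as a regularizer during training, so setting `lam=0` recovers the baseline CD-only objective.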

