
A preliminary study on cross-language knowledge transfer for low-resource Taiwanese Mandarin ASR



Abstract

Deep neural networks (DNNs) are the state-of-the-art technique for automatic speech recognition (ASR) today. However, the key to their success is that a large amount of speech data in the target language is required to train DNNs well. Unfortunately, only a few small Taiwanese Mandarin speech corpora are available in Taiwan. Therefore, in this paper, two cross-language knowledge transfer approaches are evaluated for building a high-performance Taiwanese Mandarin ASR system: (1) a borrowed-hidden-layer method and (2) a shared-hidden-layer method. Experimental results show that (1) the shared-hidden-layer method achieves the best performance and (2) the system is robust across different speakers and phones.
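The abstract only names the two transfer approaches, so the following is a minimal sketch of the shared-hidden-layer idea, not the paper's actual implementation: the hidden layers are trained jointly on a resource-rich source language and the low-resource target language, while each language keeps its own softmax output layer. (The borrowed-hidden-layer variant, by contrast, copies hidden layers already trained on the source language and retrains only the target-language output layer.) The sketch assumes a PyTorch-style DNN; the layer sizes, feature dimensions, senone counts, and language names are illustrative and not taken from the paper.

```python
# Hypothetical sketch of a shared-hidden-layer multilingual DNN.
# All dimensions and language names below are assumptions for illustration.
import torch
import torch.nn as nn

class SharedHiddenLayerDNN(nn.Module):
    def __init__(self, feat_dim=440, hidden_dim=1024, num_hidden=5,
                 senone_counts=None):
        super().__init__()
        # Illustrative senone counts per language (not from the paper).
        senone_counts = senone_counts or {"mandarin": 3000, "english": 3000}
        layers, in_dim = [], feat_dim
        for _ in range(num_hidden):
            layers += [nn.Linear(in_dim, hidden_dim), nn.Sigmoid()]
            in_dim = hidden_dim
        # Hidden layers shared across all training languages.
        self.shared = nn.Sequential(*layers)
        # One language-specific softmax output layer per language.
        self.outputs = nn.ModuleDict(
            {lang: nn.Linear(hidden_dim, n) for lang, n in senone_counts.items()})

    def forward(self, x, lang):
        # Route the shared hidden representation to the requested language's
        # output layer; softmax/cross-entropy is applied by the loss.
        return self.outputs[lang](self.shared(x))

# Training would alternate mini-batches from the resource-rich and the
# low-resource language: every batch updates the shared hidden layers,
# but only the matching language's output layer.
model = SharedHiddenLayerDNN()
logits = model(torch.randn(8, 440), "mandarin")  # shape: (batch, senones)
```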
