IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)

A Gaussian Mixture Model layer jointly optimized with discriminative features within a Deep Neural Network architecture



Abstract

This article proposes and evaluates a Gaussian Mixture Model (GMM) represented as the last layer of a Deep Neural Network (DNN) architecture and jointly optimized with all previous layers using Asynchronous Stochastic Gradient Descent (ASGD). The resulting “Deep GMM” architecture was investigated with special attention to the following issues: (1) The extent to which joint optimization improves over separate optimization of the DNN-based feature extraction layers and the GMM layer; (2) The extent to which depth (measured in number of layers, for a matched total number of parameters) helps a deep generative model based on the GMM layer, compared to a vanilla DNN model; (3) Head-to-head performance of Deep GMM architectures vs. equivalent DNN architectures of comparable depth, using the same optimization criterion (frame-level Cross Entropy (CE)) and optimization method (ASGD); (4) Expanded possibilities for modeling offered by the Deep GMM generative model. The proposed Deep GMMs were found to yield Word Error Rates (WERs) competitive with state-of-the-art DNN systems, at the cost of pre-training using standard DNNs to initialize the Deep GMM feature extraction layers. An extension to Deep Subspace GMMs is described, resulting in additional gains.
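To make the architecture concrete, the following is a minimal sketch (not the paper's implementation) of the forward pass of such a GMM output layer: each class is modeled by a diagonal-covariance Gaussian mixture, class posteriors are formed from the per-class log-likelihoods plus log priors, and a frame-level cross-entropy loss is taken on the true label. All function and variable names here (`gmm_layer_loglik`, `frame_ce`, `log_priors`, etc.) are illustrative assumptions; since every parameter enters the loss differentiably, the same computation could in principle be trained jointly with preceding feature-extraction layers by gradient descent, as the paper does with ASGD.

```python
import numpy as np

def gmm_layer_loglik(x, means, log_vars, log_weights):
    """Per-class log-likelihood log p(x | c) under a diagonal-covariance GMM.

    x: (D,) feature vector (e.g. output of the DNN feature layers)
    means: (C, K, D), log_vars: (C, K, D), log_weights: (C, K)
    where C = number of classes, K = mixture components per class.
    """
    diff = x - means  # broadcasts to (C, K, D)
    # Per-component diagonal-Gaussian log-density, summed over feature dims.
    log_comp = -0.5 * (np.log(2.0 * np.pi) + log_vars
                       + diff ** 2 / np.exp(log_vars)).sum(axis=-1)  # (C, K)
    # Mixture: log sum_k w_ck N_ck(x), via a numerically stable log-sum-exp.
    a = log_weights + log_comp                      # (C, K)
    m = a.max(axis=-1, keepdims=True)
    return (m + np.log(np.exp(a - m).sum(axis=-1, keepdims=True))).squeeze(-1)

def frame_ce(x, label, means, log_vars, log_weights, log_priors):
    """Frame-level cross entropy on posteriors p(c | x) ∝ p(x | c) p(c)."""
    ll = gmm_layer_loglik(x, means, log_vars, log_weights) + log_priors  # (C,)
    m = ll.max()
    log_post = ll - (m + np.log(np.exp(ll - m).sum()))  # log-softmax over classes
    return -log_post[label]
```

For example, with two classes whose single-component means sit at the origin and at (5, 5), a frame near the origin yields a much smaller cross-entropy loss for label 0 than for label 1, as expected for a discriminatively scored generative output layer.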
