To address the excessive training time of DBN-DNN models in speech recognition, this paper proposes a fast training method. Aiming to reduce the computation of error back-propagation, the method accelerates training by alternating the number of network layers updated at each parameter-update step. Two implementation strategies are designed: shrinking global update frequency (SGUF), which gradually reduces how often the full network is updated, and shrinking partial update layers (SPUL), which gradually reduces the number of layers updated. The method can be combined with a variety of fast DNN training algorithms. Experimental results show that, used independently or in combination with stochastic data sweeping (SDS) or the ASGD algorithm, the method reduces training time substantially with no loss of recognition accuracy.
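The core idea of alternating update layers can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a plain tanh MLP trained with MSE, and on alternating steps it back-propagates only through the top `k` layers and stops there, which is where the saving in back-propagation computation comes from. All function names and hyperparameters here are illustrative choices.

```python
import numpy as np

def forward(weights, x):
    """Forward pass; returns activations of every layer (input included)."""
    acts = [x]
    for W in weights:
        acts.append(np.tanh(acts[-1] @ W))
    return acts

def partial_backprop(weights, acts, grad_out, k, lr=0.01):
    """Update only the top-k layers and stop propagating below them."""
    delta = grad_out
    n = len(weights)
    for i in range(n - 1, n - 1 - k, -1):
        delta = delta * (1 - acts[i + 1] ** 2)   # tanh derivative
        grad_W = acts[i].T @ delta               # gradient for layer i
        delta = delta @ weights[i].T             # propagate one layer down
        weights[i] -= lr * grad_W

rng = np.random.default_rng(0)
weights = [rng.normal(scale=0.1, size=(8, 8)) for _ in range(4)]
x = rng.normal(size=(16, 8))
y = rng.normal(size=(16, 8))

loss_before = float(np.mean((forward(weights, x)[-1] - y) ** 2))
for step in range(50):
    acts = forward(weights, x)
    grad = 2 * (acts[-1] - y) / len(x)           # MSE gradient
    # alternate: full-depth update on even steps, top-2 layers otherwise
    k = len(weights) if step % 2 == 0 else 2
    partial_backprop(weights, acts, grad, k)
loss_after = float(np.mean((forward(weights, x)[-1] - y) ** 2))
```

The SGUF and SPUL strategies described in the abstract would replace the fixed even/odd schedule above with one that gradually lowers the full-update frequency, or gradually shrinks `k`, as training progresses.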