Conference: International Symposium on Neural Networks

Robust Recursive TLS (Total Least Square) Method Using Regularized UDU Decomposed for FNN (Feedforward Neural Network) Training



Abstract

We present a robust recursive total least squares (RRTLS) algorithm for multilayer feed-forward neural networks. Recursive least squares (RLS) has been applied successfully to training such networks; however, when the input data contains additive noise, the RLS estimates can be biased. In theory, this bias can be avoided by using a recursive total least squares (RTLS) algorithm based on the power method. In that approach, however, the power method relies on rank-1 updates and is therefore prone to ill-conditioning. In this paper we propose a robust RTLS algorithm using regularized UDU factorization. The method outperforms RLS-based training over a wide range of SNRs.
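The bias the abstract describes, and why TLS removes it, can be illustrated with a linear errors-in-variables problem: ordinary least squares attenuates the weights when the inputs are noisy, while the TLS solution, read off the smallest eigenvector of the augmented data covariance maintained by rank-1 updates, stays near the true weights. The sketch below is a minimal illustration of that recursive-TLS idea, not the authors' algorithm; in particular, the small `lam` ridge term is a hypothetical stand-in for the paper's UDU-factorized regularization, and a full batch eigen-solve replaces the power method.

```python
import numpy as np

rng = np.random.default_rng(0)

# True linear relation y = w^T x, with additive noise on BOTH x and y
# (errors-in-variables) -- the setting where LS is biased and TLS is not.
w_true = np.array([2.0, -1.0])
n, d = 2000, 2
X = rng.normal(size=(n, d))
X_noisy = X + 0.3 * rng.normal(size=X.shape)
y_noisy = X @ w_true + 0.3 * rng.normal(size=n)

# Recursive TLS sketch: accumulate the augmented covariance
# C = sum_k z_k z_k^T with z_k = [x_k; y_k] via rank-1 updates.
# `lam` is a small Tikhonov-style regularizer, a hypothetical stand-in
# for the paper's regularized UDU factorization.
lam = 1e-3
C = lam * np.eye(d + 1)
for xk, yk in zip(X_noisy, y_noisy):
    z = np.append(xk, yk)
    C += np.outer(z, z)          # rank-1 update per sample

# TLS solution: eigenvector of C with the smallest eigenvalue.
eigvals, eigvecs = np.linalg.eigh(C)
v = eigvecs[:, 0]
w_tls = -v[:d] / v[d]

# Ordinary least squares on the same noisy data, for comparison.
w_ls, *_ = np.linalg.lstsq(X_noisy, y_noisy, rcond=None)

print("true:", w_true)
print("TLS :", w_tls)   # close to w_true
print("LS  :", w_ls)    # attenuated toward zero by input noise
```

With unit-variance inputs and input-noise variance 0.09, the LS weights shrink by roughly a factor 1/(1 + 0.09), which is the bias RTLS is designed to avoid. The paper's contribution is making the recursive eigenvector tracking numerically robust via the regularized UDU factorization, since repeated rank-1 updates alone are ill-conditioned.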
