IEEE Transactions on Systems, Man, and Cybernetics, Part B

The local minima-free condition of feedforward neural networks for outer-supervised learning


Abstract

In this paper, the local minima-free conditions of outer-supervised feedforward neural networks (FNN) based on batch-style learning are studied by means of the embedded subspace method. It is proven that if the condition that the number of hidden neurons is not less than the number of training samples, which is sufficient but not necessary, is satisfied, the network will necessarily converge to the global minimum with null cost, and that the condition that the range space of the outer-supervised signal matrix is included in the range space of the hidden output matrix is a necessary and sufficient condition for the error surface to be free of local minima. In addition, when the number of hidden neurons is less than the number of training samples but greater than the number of output neurons, it is demonstrated that the error surface will likewise contain only the global minimum with null cost, provided that the first-layer weights are adequately selected.
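The range-space condition stated in the abstract can be checked numerically. The sketch below is a minimal NumPy illustration, not part of the paper; the sizes, the tanh hidden layer, and the names X, T, W1, H, W2 are assumptions made only for the example. It builds a single-hidden-layer network with h >= N hidden neurons, tests whether the range space of the target matrix T is contained in the range space of the hidden output matrix H via a rank comparison, and verifies that a least-squares output layer then attains the null-cost global minimum.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative sizes (assumptions for this sketch, not taken from the paper):
    # N training samples, n inputs, h hidden neurons, m output neurons.
    N, n, h, m = 5, 3, 8, 2      # h >= N: the sufficient condition from the abstract

    X = rng.standard_normal((N, n))     # input patterns, one row per sample
    T = rng.standard_normal((N, m))     # outer-supervised (target) signal matrix
    W1 = rng.standard_normal((n, h))    # first-layer weights, chosen at random

    H = np.tanh(X @ W1)                 # hidden output matrix (N x h)

    # Range-space inclusion test: range(T) is contained in range(H)
    # exactly when rank([H | T]) == rank(H).
    rank_H = np.linalg.matrix_rank(H)
    rank_HT = np.linalg.matrix_rank(np.hstack([H, T]))
    print("range(T) contained in range(H):", rank_HT == rank_H)

    # When the inclusion holds, a least-squares output layer reaches null cost.
    W2 = np.linalg.pinv(H) @ T
    print("training error at the least-squares solution:", np.linalg.norm(H @ W2 - T))

With h < N, the same rank comparison indicates whether a particular choice of the first-layer weights W1 still makes the inclusion hold, which is the situation treated in the last sentence of the abstract.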
