
Simple 'Linearized' Learning Algorithm Which Outperforms Back-Propagation

Abstract

A class of algorithms is presented for training multilayer perceptrons using purely linear techniques. The methods are based upon linearizations of the network obtained from error-surface analysis, followed by a contemporary least-squares estimation procedure. Specific algorithms are presented for estimating weights node-wise, layer-wise, and for the entire set of network weights simultaneously. In several experimental studies, the node-wise method is superior to back-propagation and to an alternative linearization method due to Azimi-Sadjadi et al. in terms of both the number of convergences and the convergence rate. The layer-wise and network-wise updates offer further improvement.
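The abstract describes the approach only at a high level. As a rough illustration of the general linearize-then-least-squares idea (not the paper's specific node-wise, layer-wise, or network-wise procedures, which are not detailed here), the sketch below applies a damped Gauss-Newton-style update to a tiny multilayer perceptron: the network output is linearized around the current weights via a finite-difference Jacobian, and the weight update is obtained by solving a linear least-squares problem on the output residuals. The network size, data, and damping constant are arbitrary assumptions chosen for illustration.

```python
# Hypothetical sketch of a linearize-then-least-squares weight update.
# Not the paper's algorithm; a generic damped Gauss-Newton illustration.
import numpy as np

rng = np.random.default_rng(0)

# Assumed tiny 2-4-1 network: tanh hidden layer, linear output.
def unpack(w):
    W1 = w[:8].reshape(4, 2); b1 = w[8:12]
    W2 = w[12:16].reshape(1, 4); b2 = w[16:17]
    return W1, b1, W2, b2

def forward(w, X):
    W1, b1, W2, b2 = unpack(w)
    return (np.tanh(X @ W1.T + b1) @ W2.T + b2).ravel()

def jacobian(w, X, eps=1e-6):
    # Finite-difference Jacobian of the network outputs w.r.t. all weights.
    J = np.zeros((X.shape[0], w.size))
    for i in range(w.size):
        dw = np.zeros_like(w); dw[i] = eps
        J[:, i] = (forward(w + dw, X) - forward(w - dw, X)) / (2 * eps)
    return J

# Toy XOR-style regression data (illustrative only).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
t = np.array([0., 1., 1., 0.])

w = rng.normal(scale=0.5, size=17)
for step in range(50):
    e = t - forward(w, X)          # residual at the current weights
    J = jacobian(w, X)             # local linearization: y(w + dw) ~ y(w) + J dw
    # Damped least-squares solve for the full weight update.
    dw = np.linalg.solve(J.T @ J + 1e-3 * np.eye(w.size), J.T @ e)
    w = w + dw

print("final squared error:", np.sum((t - forward(w, X)) ** 2))
```

In this whole-network form the update corresponds to the "entire set of network weights simultaneously" case; the node-wise and layer-wise variants mentioned in the abstract would restrict the linearization and least-squares solve to the weights of a single node or layer at a time.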
