Journal of Systems Science and Complexity

AN INFORMATION THEORETICAL APPROACH TO NEURAL NETWORKS

Abstract

The purpose of this paper is to present a unified theory of several different neural networks that have been proposed for solving various computation, pattern recognition, imaging, optimization, and other problems. The functioning of these networks is characterized by Lyapunov energy functions. The relationship between the deterministic and stochastic neural networks is examined. The simulated annealing methods for finding the global optimum of an objective function, as well as their generalization by injecting noise into deterministic neural networks, are discussed. A statistical interpretation of the dynamic evolution of the different neural networks is presented. The problem of training different neural networks is investigated in this general framework. It is shown how this approach can be used not only for analyzing various neural networks, but also for the choice of the proper neural network for solving any given problem and the design of a training algorithm for the particular neural network.
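The abstract gives no formulas, so the following is only a generic illustration, not the paper's own construction: a minimal Python sketch of a Lyapunov energy function for a Hopfield-type network, E(s) = -1/2 s^T W s - b^T s, together with noise-injected (Glauber/simulated-annealing style) updates that approach the deterministic threshold dynamics as the temperature is lowered. All function names, the cooling schedule, and the parameter values are illustrative assumptions.

```python
import numpy as np

def hopfield_energy(s, W, b):
    # Lyapunov energy of a Hopfield-type network: state s in {-1,+1}^n,
    # symmetric weights W with zero diagonal, biases b.
    return -0.5 * s @ W @ s - b @ s

def anneal(W, b, steps=5000, T0=2.0, Tmin=1e-3, seed=0):
    # Stochastic single-unit updates with a decreasing temperature T.
    # At high T the dynamics are noisy (stochastic network); as T -> 0 the
    # update rule approaches the deterministic sign-threshold rule that
    # monotonically decreases the energy above.
    rng = np.random.default_rng(seed)
    n = len(b)
    s = rng.choice([-1.0, 1.0], size=n)
    for t in range(steps):
        T = max(Tmin, T0 * (1.0 - t / steps))   # illustrative linear cooling
        i = rng.integers(n)
        h = W[i] @ s + b[i]                     # local field at unit i
        p_up = 0.5 * (1.0 + np.tanh(h / T))     # Boltzmann prob. of s_i = +1
        s[i] = 1.0 if rng.random() < p_up else -1.0
    return s, hopfield_energy(s, W, b)

# Toy usage with a random symmetric coupling matrix.
rng = np.random.default_rng(1)
A = rng.normal(size=(8, 8))
W = (A + A.T) / 2.0
np.fill_diagonal(W, 0.0)
b = rng.normal(size=8)
state, energy = anneal(W, b)
print(state, energy)
```

With a slow enough cooling schedule, such noise-injected dynamics can escape shallow local minima of the energy, which is the usual motivation for the simulated annealing methods and their generalizations mentioned in the abstract.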
