Solving the local minimum and flat-spot problem by modifying wrong outputs for feed-forward neural networks
Abstract
The backpropagation (BP) algorithm, a very popular supervised learning method, is extensively applied in training feed-forward neural networks. Many modifications have been proposed to speed up the convergence of the standard BP algorithm, but they seldom focus on improving the global convergence capability. This paper proposes a new algorithm called Wrong Output Modification (WOM) to improve the global convergence capability of a fast learning algorithm. When a learning process is trapped in a local minimum or a flat-spot area, this algorithm looks for outputs that lie at the opposite extreme from their target outputs, and then modifies such outputs systematically so that they move closer to their targets; some weights of the neurons are changed accordingly. It is hoped that these changes let the learning process escape from such local minima or flat-spot areas and then converge. The performance investigation shows that the proposed algorithm can be applied to different fast learning algorithms, and that their global convergence capabilities improve significantly compared with the original algorithms. Moreover, some statistical data obtained from this algorithm can be used to gauge the difficulty of a learning problem.
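The modification step the abstract describes can be sketched for a single sigmoid neuron as follows. This is a hypothetical minimal illustration, not the paper's exact procedure: the function name, the "wrong output" threshold, the clipping constant `eps`, and the minimum-norm distribution of the weight change are all assumptions made for the example.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def logit(y):
    # Inverse of the sigmoid; finite only for 0 < y < 1.
    return math.log(y / (1.0 - y))

def modify_wrong_output(weights, inputs, target, threshold=0.9, eps=0.1):
    """Illustrative sketch of the WOM idea for one sigmoid neuron.

    If the output sits at the extreme opposite its target (a "wrong
    output"), solve for the net input that would yield an output near
    the target, then spread the required change over the incoming
    weights (minimum-norm update). Returns the updated weight list;
    weights are left unchanged when the output is not "wrong".
    """
    net = sum(w * x for w, x in zip(weights, inputs))
    out = sigmoid(net)
    wrong = (target < 0.5 and out > threshold) or \
            (target > 0.5 and out < 1.0 - threshold)
    if not wrong:
        return weights
    # Pull the desired output toward the target, clipped so the logit stays finite.
    desired_out = min(max(target, eps), 1.0 - eps)
    delta_net = logit(desired_out) - net
    norm = sum(x * x for x in inputs) or 1.0
    return [w + delta_net * x / norm for w, x in zip(weights, inputs)]
```

For example, a neuron with weights `[4.0, 4.0]` and inputs `[1.0, 1.0]` produces an output near 1 even when the target is 0; one call drives the output down near the target, whereas a neuron whose output is not at the wrong extreme is left untouched.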