International Workshop on Applied Parallel Computing

A High Parallel Procedure to Initialize the Output Weights of a Radial Basis Function or BP Neural Network



Abstract

The training of a neural network can be carried out with many different procedures; all of them seek the weights that minimize the discrepancy between the targets and the actual outputs of the network. The optimal weights can be found either directly or through iterative techniques; in both cases it is sometimes necessary (or simply useful) to evaluate the pseudo-inverse of the matrix of projections of the input examples into the function space created by the network. Every operation required to do this can, however, become difficult (and sometimes impossible) when the dimension of this matrix is very large, so we describe a way to subdivide it and to reach our goal with a highly parallel algorithm.
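As a minimal sketch of the direct (non-parallel) step the abstract refers to, the snippet below initializes the output weights of an RBF network by applying the Moore-Penrose pseudo-inverse to the matrix of projections of the input examples into the RBF function space. The names (rbf_design_matrix, init_output_weights, width) and the Gaussian basis choice are illustrative assumptions, not the paper's algorithm; the paper's contribution is subdividing this matrix so the computation can be done in parallel.

```python
import numpy as np

def rbf_design_matrix(X, centers, width):
    """Project inputs into the RBF function space:
    Phi[i, j] = exp(-||x_i - c_j||^2 / (2 * width^2))."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def init_output_weights(X, T, centers, width):
    """Output weights W minimizing ||Phi W - T||^2,
    computed via the Moore-Penrose pseudo-inverse of Phi."""
    Phi = rbf_design_matrix(X, centers, width)
    return np.linalg.pinv(Phi) @ T

# Example usage with random data
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                     # 200 input examples, 3 features
T = rng.normal(size=(200, 2))                     # 2 target outputs per example
centers = X[rng.choice(200, 20, replace=False)]   # 20 RBF centers drawn from the data
W = init_output_weights(X, T, centers, width=1.0)
print(W.shape)                                    # (20, 2)
```

When the design matrix Phi has many rows and columns, forming its pseudo-inverse on a single node becomes the bottleneck; this is the case the paper addresses by splitting the matrix into blocks handled in parallel.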
