IEEJ Transactions on Electrical and Electronic Engineering

Gradient descent learning rule for complex-valued associative memories with large constant terms

Abstract

Complex-valued associative memories (CAMs) are among the most promising neural-network associative memory models. However, their low noise tolerance is often a serious problem. A projection learning rule with large constant terms improves the noise tolerance of CAMs, but it can be applied only to fully connected CAMs. In this paper, we propose a gradient descent learning rule with large constant terms that is not restricted by the network topology. We realize large constant terms through regularization of the connection weights. Computer simulations demonstrate that the proposed learning algorithm improves noise tolerance. (c) 2016 Institute of Electrical Engineers of Japan. Published by John Wiley & Sons, Inc.
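The abstract does not give the model equations, so the following is only a minimal sketch of the idea it describes, under assumptions of ours: a K-state phasor network whose neurons receive a weighted sum plus a per-neuron constant term, squared recall error minimized by (Wirtinger) gradient descent, and an L2 penalty applied to the connection weights only, leaving the constant terms unpenalized so they can grow relative to the regularized connections. All names, sizes, and hyperparameters below are illustrative, not taken from the paper.

import numpy as np

# Sketch (not the paper's method): gradient descent learning for a
# complex-valued associative memory with constant terms b that are
# exempt from the L2 penalty on the connection weights W.

K = 4        # states per phasor neuron (assumed)
N = 16       # number of neurons (assumed)
P = 3        # number of stored patterns (assumed)
LAM = 0.01   # regularization strength on connection weights (assumed)
ETA = 0.05   # learning rate (assumed)

rng = np.random.default_rng(0)

# Stored patterns: entries are K-th roots of unity (unit-magnitude phasors).
X = np.exp(2j * np.pi * rng.integers(0, K, size=(P, N)) / K)

W = np.zeros((N, N), dtype=complex)  # connection weights (regularized)
b = np.zeros(N, dtype=complex)       # constant terms (unregularized)

for epoch in range(2000):
    for x in X:
        e = W @ x + b - x                 # recall error for this pattern
        # Wirtinger gradient step on |e|^2 + LAM * ||W||^2;
        # the penalty shrinks only W, never the constant terms b.
        W -= ETA * (np.outer(e, np.conj(x)) + LAM * W)
        b -= ETA * e
        np.fill_diagonal(W, 0)            # common convention: no self-connections

def quantize(z, k):
    """Map each complex activation to the nearest k-th root of unity."""
    phase = np.round(np.angle(z) * k / (2 * np.pi)) % k
    return np.exp(2j * np.pi * phase / k)

# One-shot recall from a phase-noised probe of the first stored pattern.
probe = X[0] * np.exp(1j * rng.normal(0.0, 0.3, N))
recalled = quantize(W @ probe + b, K)
print("overlap with stored pattern:", abs(np.vdot(recalled, X[0])) / N)

Because the update touches only the weights that exist, the same loop applies to sparsely connected networks by masking W, which is the flexibility over the projection rule that the abstract claims; how strongly the constant terms end up dominating depends on the penalty strength, and the paper's quantitative results are not reproduced here.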