IEEE Transactions on Neural Networks and Learning Systems

Fractional Hopfield Neural Networks: Fractional Dynamic Associative Recurrent Neural Networks

Abstract

This paper mainly discusses a novel conceptual framework: fractional Hopfield neural networks (FHNN). As is commonly known, fractional calculus has been incorporated into artificial neural networks, mainly because of its long-term memory and nonlocality. Some researchers have made interesting attempts at fractional neural networks and gained competitive advantages over integer-order neural networks. Therefore, it naturally leads one to ponder how to generalize first-order Hopfield neural networks to fractional-order ones, and how to implement FHNN by means of fractional calculus. We propose to introduce a novel mathematical method, fractional calculus, to implement FHNN. First, we implement the fractor in the form of an analog circuit. Second, we implement FHNN by utilizing the fractor and the fractional steepest descent approach, construct its Lyapunov function, and further analyze its attractors. Third, we perform experiments to analyze the stability and convergence of FHNN, and further discuss its applications to the defense against chip cloning attacks for anticounterfeiting. The main contribution of our work is to propose FHNN in the form of an analog circuit by utilizing a fractor and the fractional steepest descent approach, construct its Lyapunov function, prove its Lyapunov stability, analyze its attractors, and apply FHNN to the defense against chip cloning attacks for anticounterfeiting. A significant advantage of FHNN is that its attractors essentially relate to the neuron's fractional order. FHNN possesses fractional-order-stability and fractional-order-sensitivity characteristics.
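
To make the fractional-order state update concrete, here is a minimal numerical sketch in Python (not taken from the paper: the toy model, the gain and step-size parameters, and the truncated Grünwald-Letnikov discretization are illustrative assumptions, not the authors' analog-circuit fractor). It replaces the first-order update of a continuous Hopfield network with a fractional-order update of order alpha, so each new state depends on the entire history of past states, which is the long-term memory property mentioned above.

import numpy as np

def gl_coefficients(alpha, n):
    # c_j = (-1)^j * binom(alpha, j), via the stable recursion
    # c_0 = 1, c_j = c_{j-1} * (1 - (1 + alpha) / j)
    c = np.empty(n + 1)
    c[0] = 1.0
    for j in range(1, n + 1):
        c[j] = c[j - 1] * (1.0 - (1.0 + alpha) / j)
    return c

def simulate_fhnn(W, b, u0, alpha=0.8, gain=4.0, h=0.01, steps=2000):
    # Explicit Grunwald-Letnikov scheme for the toy fractional Hopfield model
    #   D^alpha u = -u + W tanh(gain * u) + b,  0 < alpha <= 1.
    # Discretizing D^alpha u at step k as h^(-alpha) * sum_{j=0..k} c_j u_{k-j}
    # and solving for u_k gives:
    #   u_k = h^alpha * f(u_{k-1}) - sum_{j=1..k} c_j u_{k-j}
    c = gl_coefficients(alpha, steps)
    U = np.zeros((steps + 1, len(u0)))
    U[0] = u0
    for k in range(1, steps + 1):
        f = -U[k - 1] + W @ np.tanh(gain * U[k - 1]) + b
        history = c[1:k + 1] @ U[k - 1::-1]   # fractional memory over all past states
        U[k] = h**alpha * f - history
    return U

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 8
    pattern = np.sign(rng.standard_normal(n))             # one stored +/-1 pattern
    W = np.outer(pattern, pattern) / n                     # symmetric Hebbian weights
    np.fill_diagonal(W, 0.0)
    u0 = 0.5 * pattern + 0.3 * rng.standard_normal(n)      # corrupted copy of the pattern
    U = simulate_fhnn(W, np.zeros(n), u0, alpha=0.8)
    recovered = np.sign(np.tanh(U[-1]))
    print("overlap with stored pattern:", float(recovered @ pattern) / n)

For alpha = 1 the coefficients reduce to c_0 = 1, c_1 = -1, and c_j = 0 for j >= 2, so the scheme collapses to the ordinary Euler update of an integer-order Hopfield network; for 0 < alpha < 1 the slowly decaying history term is what gives the fractional model its long-memory behavior.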