Neural Networks, IEEE Transactions on

A Novel Recurrent Neural Network With One Neuron and Finite-Time Convergence for k-Winners-Take-All Operation



Abstract

In this paper, a novel $k$-winners-take-all ($k$-WTA) network is proposed based on a one-neuron recurrent neural network. Finite-time convergence of the proposed neural network is proved using the Lyapunov method. The $k$-WTA operation is first converted equivalently into a linear programming problem. Then, a one-neuron recurrent neural network is proposed to determine the $k$th or $(k+1)$th largest input of the $k$-WTA problem. Furthermore, a $k$-WTA network is designed based on the proposed neural network to perform the $k$-WTA operation. Compared with existing $k$-WTA networks, the proposed network has a simpler structure and finite-time convergence. In addition, simulation results on numerical examples demonstrate the effectiveness and performance of the proposed $k$-WTA network.
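As context for the abstract, the $k$-WTA operation itself can be stated simply: given $n$ inputs, output 1 for the $k$ largest and 0 for the rest. The sketch below is a minimal static illustration of that input-output map, not the paper's one-neuron recurrent dynamics or its linear-programming formulation; the explicit sort stands in for the $k$th/$(k+1)$th-largest-input threshold that the paper's network computes dynamically.

```python
import numpy as np

def k_wta(u, k):
    """Static k-winners-take-all: 1 for the k largest inputs, 0 otherwise.

    Illustrative only -- the paper computes this map with a one-neuron
    recurrent network rather than an explicit sort.
    """
    u = np.asarray(u, dtype=float)
    sorted_u = np.sort(u)[::-1]  # inputs in descending order
    # Any threshold strictly between the k-th and (k+1)-th largest inputs
    # separates winners from losers (assumes distinct input values).
    threshold = (sorted_u[k - 1] + sorted_u[k]) / 2
    return (u > threshold).astype(int)

print(k_wta([0.3, 0.9, 0.1, 0.7], 2))  # the two largest inputs win: [0 1 0 1]
```

The role of the threshold here mirrors why the paper's network targets the $k$th or $(k+1)$th largest input: once either value is known, the winner/loser decision reduces to an elementwise comparison.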
