IEEE Transactions on Neural Networks

A class of competitive learning models which avoids neuron underutilization problem


Abstract

We study a qualitative property of a class of competitive learning (CL) models, called multiplicatively biased competitive learning (MBCL) models: with probability one, they avoid neuron underutilization as time goes to infinity. In MBCL, the competition among neurons is biased by a multiplicative term, while only one weight vector is updated per learning step. This is of practical interest since instances of MBCL have computational complexities among the lowest of existing CL models. Moreover, in applications such as classification, vector quantizer design, and probability density function estimation, avoiding neuron underutilization is a necessary condition for optimal performance. Hence, it is possible to define instances of MBCL that achieve optimal performance in these applications.
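The update rule described in the abstract (a multiplicatively biased competition followed by a single-winner weight update) can be sketched as follows. This is a minimal, hypothetical instance in which each neuron's win count serves as the multiplicative bias term, in the spirit of frequency-sensitive competitive learning; the function name, learning rate, and bias choice are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def mbcl_step(x, W, counts, lr=0.05):
    """One multiplicatively biased competitive-learning step.

    The competition is biased by a multiplicative term (here each
    neuron's win count, an illustrative choice); only the winning
    weight vector is updated, as in the MBCL class.
    """
    dists = np.linalg.norm(W - x, axis=1)  # distance from x to each weight vector
    j = int(np.argmin(counts * dists))     # multiplicatively biased competition
    W[j] += lr * (x - W[j])                # update only the winner
    counts[j] += 1                         # frequent winners are penalized next time
    return j

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 2))                # 4 neurons with 2-D weight vectors
counts = np.ones(4)                        # multiplicative bias terms
data = rng.normal(size=(500, 2))
for x in data:
    mbcl_step(x, W, counts)
```

Because a neuron that never wins keeps a small bias, its biased distance eventually becomes the minimum, so every neuron wins infinitely often; this is the mechanism behind the underutilization-avoidance property, and the per-step cost is a single distance computation per neuron.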
