IEEE Transactions on Neural Networks

A Class of Complex ICA Algorithms Based on the Kurtosis Cost Function


Abstract

In this paper, we introduce a novel way of performing real-valued optimization in the complex domain. This framework enables a direct complex optimization technique when the cost function satisfies Brandwood's independent analyticity condition. In particular, this technique has been used to derive three algorithms, namely, kurtosis maximization using gradient update (KM-G), kurtosis maximization using fixed-point update (KM-F), and kurtosis maximization using Newton update (KM-N), to perform complex independent component analysis (ICA) based on the maximization of the complex kurtosis cost function. The derivation and related analysis of the three algorithms are performed in the complex domain without using any complex-to-real mapping for differentiation and optimization. A general complex Newton rule is also derived for developing the KM-N algorithm. The real conjugate gradient algorithm is extended to the complex domain in a manner similar to the derivation of the complex Newton rule. The simulation results indicate that the fixed-point version (KM-F) and gradient version (KM-G) are superior to other similar algorithms when the sources include both circular and noncircular distributions and the dimension is relatively high.
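The abstract does not give the cost function explicitly; a commonly used complex kurtosis for possibly noncircular sources (an assumption here, not taken from the paper) is

$\operatorname{kurt}(y) = E\{|y|^4\} - 2\big(E\{|y|^2\}\big)^2 - \big|E\{y^2\}\big|^2$, with $y = \mathbf{w}^{H}\mathbf{x}$

for a demixing vector $\mathbf{w}$ and a whitened, zero-mean complex mixture $\mathbf{x}$; the last term captures the noncircularity (pseudo-variance) of $y$.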
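For illustration only, the following is a minimal single-unit sketch of a gradient-update kurtosis maximization in the spirit of KM-G, assuming whitened zero-mean complex data and the kurtosis definition above, with the Wirtinger (Brandwood-style) gradient taken with respect to conj(w). It is a sketch under these assumptions, not the authors' exact KM-G, KM-F, or KM-N update.

import numpy as np

def complex_kurtosis(y):
    # Assumed definition: kurt(y) = E|y|^4 - 2(E|y|^2)^2 - |E[y^2]|^2 (real-valued)
    return (np.mean(np.abs(y) ** 4)
            - 2.0 * np.mean(np.abs(y) ** 2) ** 2
            - np.abs(np.mean(y ** 2)) ** 2)

def km_g_single_unit(X, n_iter=200, step=0.1, seed=0):
    # X: (d, N) whitened, zero-mean complex mixture; returns a unit-norm w with y = w^H x
    rng = np.random.default_rng(seed)
    d = X.shape[0]
    w = rng.standard_normal(d) + 1j * rng.standard_normal(d)
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        y = w.conj() @ X  # source estimate y = w^H x for every sample
        # Wirtinger gradient of the kurtosis cost with respect to conj(w)
        grad = (2.0 * np.mean(np.abs(y) ** 2 * y.conj() * X, axis=1)
                - 4.0 * np.mean(np.abs(y) ** 2) * np.mean(y.conj() * X, axis=1)
                - 2.0 * np.mean(y * X, axis=1) * np.conj(np.mean(y ** 2)))
        # Climb toward larger |kurtosis|; the sign handles sub-/super-Gaussian sources
        w = w + step * np.sign(complex_kurtosis(y)) * grad
        w /= np.linalg.norm(w)  # renormalize to the unit sphere each step
    return w

Applied to each column of a whitened mixture, repeated runs with deflation (removing already-found directions) would recover multiple sources; the paper's fixed-point and Newton variants replace the plain gradient step with their respective updates.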
