
Convergence and Rate Analysis of Neural Networks for Sparse Approximation

Abstract

We present an analysis of the Locally Competitive Algorithm (LCA), a Hopfield-style neural network that efficiently solves sparse approximation problems (e.g., approximating a vector from a dictionary using just a few nonzero coefficients). This class of problems plays a significant role both in theories of neural coding and in signal processing applications. However, the LCA lacks an analysis of its convergence properties, and previous results on neural networks for nonsmooth optimization do not apply to the specifics of the LCA architecture. We show that the LCA has desirable convergence properties, such as stability and global convergence to the optimum of the objective function when it is unique. Under some mild conditions, the support of the solution is also proven to be reached in finite time. Furthermore, some restrictions on the problem specifics allow us to characterize the convergence rate of the system by showing that the LCA converges exponentially fast with an analytically bounded convergence rate. We support our analysis with several illustrative simulations.
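For concreteness, below is a minimal simulation sketch of the LCA dynamics in their standard form (soft-threshold activation, lateral inhibition through the dictionary Gram matrix), discretized with forward Euler. The fixed points of these dynamics correspond to solutions of the l1-penalized sparse approximation objective. The parameter values, variable names, and the recovery example are illustrative assumptions, not taken from the paper.

import numpy as np

def soft_threshold(u, lam):
    # LCA activation a = T_lam(u): subthreshold states map to zero.
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

def lca(y, Phi, lam=0.05, tau=0.01, dt=0.001, n_steps=2000):
    # Forward-Euler simulation of the LCA dynamics
    #     tau * du/dt = Phi.T @ y - u - (Phi.T @ Phi - I) @ a,  a = T_lam(u),
    # whose fixed points solve
    #     min_a 0.5 * ||y - Phi @ a||^2 + lam * ||a||_1.
    n = Phi.shape[1]
    G = Phi.T @ Phi - np.eye(n)   # lateral inhibition between overlapping atoms
    b = Phi.T @ y                 # feed-forward driving input
    u = np.zeros(n)               # internal (membrane) states
    for _ in range(n_steps):
        a = soft_threshold(u, lam)
        u += (dt / tau) * (b - u - G @ a)
    return soft_threshold(u, lam)

# Illustrative usage: recover a 5-sparse code from a random unit-norm dictionary.
rng = np.random.default_rng(0)
Phi = rng.standard_normal((64, 256))
Phi /= np.linalg.norm(Phi, axis=0)
a_true = np.zeros(256)
a_true[rng.choice(256, size=5, replace=False)] = 1.0
y = Phi @ a_true
a_hat = lca(y, Phi)
print("active coefficients:", np.count_nonzero(a_hat))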
