International Joint Conference on Neural Networks

Convergence of a neural network for sparse approximation using the nonsmooth Łojasiewicz inequality



Abstract

Sparse approximation is an optimization program that produces state-of-the-art results in many applications in signal processing and engineering. To deploy this approach in real time, it is necessary to develop solvers faster than those currently available in digital hardware. The Locally Competitive Algorithm (LCA) is a dynamical system designed to solve this class of sparse approximation problems in continuous time. But before implementing this network in analog VLSI, it is essential to provide performance guarantees. This paper presents new results on the convergence of the LCA neural network. Using recently developed methods based on the Łojasiewicz inequality for nonsmooth functions, we prove that the output and state trajectories converge to a single fixed point. This improves on previous results by guaranteeing convergence to a singleton even when the optimization program has infinitely many, non-isolated solution points.
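As context for the abstract, the LCA in its standard form (Rozell et al.) evolves an internal state u under lateral inhibition and produces a sparse output through a soft-thresholding activation. A minimal Euler-discretized sketch is shown below; the function and variable names, parameter values, and the random test problem are illustrative assumptions, not from this paper:

```python
import numpy as np

def lca(Phi, y, lam=0.1, tau=0.01, dt=0.001, steps=5000):
    """Euler simulation of the Locally Competitive Algorithm (LCA).

    Standard state dynamics (Rozell et al.):
        tau * du/dt = -u + Phi^T y - (Phi^T Phi - I) a,
    with output a = soft_threshold(u, lam), which solves a
    LASSO-type sparse approximation problem at equilibrium.
    """
    n = Phi.shape[1]
    b = Phi.T @ y                     # constant driving input
    G = Phi.T @ Phi - np.eye(n)       # lateral inhibition, zero self-coupling
    u = np.zeros(n)
    for _ in range(steps):
        a = np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)  # soft threshold
        u += (dt / tau) * (-u + b - G @ a)
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

# Illustrative usage: recover a 3-sparse vector from noiseless measurements.
rng = np.random.default_rng(0)
Phi = rng.standard_normal((20, 50)) / np.sqrt(20)
x_true = np.zeros(50)
x_true[[3, 17, 41]] = [1.0, -0.8, 0.5]
y = Phi @ x_true
a_hat = lca(Phi, y, lam=0.02)
```

The paper's contribution concerns this dynamical system: the Łojasiewicz-inequality argument guarantees the trajectory u(t) settles to a single fixed point even when the underlying program has a continuum of minimizers, a case where classical Lyapunov arguments only give convergence to the solution set.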
