Published in: IEEE Journal of Selected Topics in Signal Processing

Minkovskian Gradient for Sparse Optimization


Abstract

Information geometry is used to elucidate convex optimization problems under $L_{1}$ constraint. A convex function induces a Riemannian metric and two dually coupled affine connections in the manifold of parameters of interest. A generalized Pythagorean theorem and projection theorem hold in such a manifold. An extended LARS algorithm, applicable to both under-determined and over-determined cases, is studied and properties of its solution path are given. The algorithm is shown to be a Minkovskian gradient-descent method, which moves in the steepest direction of a target function under the Minkovskian $L_{1}$ norm. Two dually coupled affine coordinate systems are useful for analyzing the solution path.
