Sparse Reduced Rank Regression with Nonconvex Regularization

IEEE Statistical Signal Processing Workshop

Abstract

In this paper, the estimation problem for the sparse reduced rank regression (SRRR) model is considered. The SRRR model is widely used for dimension reduction and variable selection, with applications in signal processing, econometrics, and other fields. The problem is formulated as minimizing the least squares loss with a sparsity-inducing penalty under an orthogonality constraint. Convex sparsity-inducing functions have been used for SRRR in the literature; in this work, a nonconvex function is proposed to induce sparsity more effectively. An efficient algorithm based on the alternating minimization (or projection) method is developed to solve the nonconvex optimization problem. Numerical simulations show that the proposed algorithm is much more efficient than the benchmark methods and that the nonconvex penalty can yield better estimation accuracy.
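To make the alternating scheme concrete, the following is a minimal sketch of one way such an estimator could be computed: the orthonormal factor is updated by solving an orthogonal Procrustes problem in closed form, and the sparse factor by a proximal-gradient step under a nonconvex (MCP-type, firm-thresholding) penalty. The factorization, the elementwise MCP penalty, the step-size rule, and all function names here are illustrative assumptions rather than details taken from the paper, which may use a different nonconvex penalty and a row-wise (group) sparsity pattern for variable selection.

```python
import numpy as np


def mcp_prox(z, lam, tau, gamma=3.0):
    """Firm (MCP) thresholding: proximal operator of the minimax concave
    penalty with weight lam, step size tau, and concavity gamma
    (requires gamma > tau). One possible nonconvex penalty; the paper
    does not commit to this particular choice."""
    out = z.copy()
    absz = np.abs(z)
    inner = absz <= gamma * lam
    out[inner] = (np.sign(z[inner])
                  * np.maximum(absz[inner] - tau * lam, 0.0)
                  / (1.0 - tau / gamma))
    return out


def srrr_alternating(X, Y, rank, lam, n_iter=200, seed=0):
    """Sketch of alternating minimization for sparse reduced rank regression:

        min_{U, V}  ||Y - X U V^T||_F^2 + penalty(U)   s.t.  V^T V = I_r

    where U (p x r) carries the sparsity and V (q x r) is orthonormal."""
    n, p = X.shape
    q = Y.shape[1]
    rng = np.random.default_rng(seed)
    U = 0.01 * rng.standard_normal((p, rank))
    V = np.linalg.qr(rng.standard_normal((q, rank)))[0]
    # Step size from the spectral norm of X (Lipschitz constant of the
    # smooth part of the U-subproblem, since V is orthonormal).
    tau = 1.0 / (np.linalg.norm(X, 2) ** 2 + 1e-12)
    for _ in range(n_iter):
        # V-step: with U fixed, the subproblem is an orthogonal Procrustes
        # problem, solved in closed form via an SVD of Y^T X U.
        Pl, _, Prt = np.linalg.svd(Y.T @ X @ U, full_matrices=False)
        V = Pl @ Prt
        # U-step: one proximal-gradient step on the penalized least squares
        # (gradient of 0.5 * ||Y - X U V^T||_F^2 with respect to U).
        grad = X.T @ (X @ U @ V.T - Y) @ V
        U = mcp_prox(U - tau * grad, lam, tau)
    return U, V
```

The V-step admits a closed-form solution because, with U fixed and V constrained to be orthonormal, minimizing the loss reduces to maximizing tr(V^T Y^T X U), a standard Procrustes problem.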
