A fast DC-based dictionary learning algorithm with the SCAD penalty

Abstract

In recent years, there has been growing interest in dictionary learning with nonconvex sparsity-inducing penalties. However, how to solve dictionary learning with a nonconvex penalty efficiently is still an open problem. In this paper, we present an efficient DC-based algorithm for dictionary learning with the nonconvex smoothly clipped absolute deviation (SCAD) penalty, which yields strong sparsity and accurate estimation. The optimization problem we consider is the minimization of the representation error regularized by the SCAD penalty. Our approach is based on a decomposition scheme that splits the whole problem into a set of subproblems over single-vector factors. To handle the nonconvexity of the representation error in these subproblems, we use an alternating optimization scheme that updates one factor while the other is held fixed. To tackle the nonconvexity of the SCAD penalty, we apply the Difference of Convex functions (DC) technique to convert each nonconvex subproblem into a sequence of convex problems and employ the DC algorithm (DCA) to solve them; simple and straightforward closed-form solutions can then be derived. As verified by numerical experiments on synthetic and real-world data, the proposed algorithm performs better than state-of-the-art algorithms with various sparsity-inducing constraints. (C) 2020 Elsevier B.V. All rights reserved.
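For readers unfamiliar with the SCAD penalty and its DC decomposition, the sketch below (not from the paper; the parameter names `lam`, `a` and the helper functions are our own illustration) evaluates the piecewise SCAD penalty of Fan and Li and checks the standard split scad(x) = λ|x| − h(x) with h convex, which is what makes each DCA subproblem solvable in closed form by soft-thresholding:

```python
import numpy as np

def scad(x, lam=1.0, a=3.7):
    """Elementwise SCAD penalty (Fan & Li); requires a > 2, with a = 3.7 customary."""
    ax = np.abs(x)
    mid = (2 * a * lam * ax - ax ** 2 - lam ** 2) / (2 * (a - 1))
    return np.where(ax <= lam, lam * ax,
                    np.where(ax <= a * lam, mid, lam ** 2 * (a + 1) / 2))

def scad_dc_parts(x, lam=1.0, a=3.7):
    """DC split scad = g - h with both parts convex: g(x) = lam*|x|, h = g - scad."""
    g = lam * np.abs(x)
    return g, g - scad(x, lam, a)

def dca_prox_step(y, x_k, lam=1.0, a=3.7):
    """One DCA iteration for min_x 0.5*(x - y)^2 + scad(x):
    linearize the concave part -h(x) at x_k; the convex surrogate
    0.5*(x - y)^2 + lam*|x| - h'(x_k)*x is solved by soft-thresholding."""
    ax = np.abs(x_k)
    # gradient of the convex part h (piecewise, matching the SCAD regions)
    h_grad = np.where(ax <= lam, 0.0,
                      np.where(ax <= a * lam,
                               np.sign(x_k) * (ax - lam) / (a - 1),
                               lam * np.sign(x_k)))
    z = y + h_grad
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)  # soft-threshold
```

Iterating `dca_prox_step` from `x_k = 0` converges to a stationary point of the scalar SCAD-penalized proximal problem; plausibly, the closed-form updates the abstract refers to arise from applying this same mechanism coefficient-wise to the sparse codes.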

Bibliographic information

  • Source
    《Neurocomputing》 | 2021, Issue 14 | pp. 89-100 | 12 pages
  • Author affiliations

    Guangdong Univ Technol, Sch Automat, Guangzhou 510006, Peoples R China | Guangdong Key Lab IoT Informat Technol, Guangzhou 510006, Peoples R China;

    Guangdong Univ Technol, Sch Automat, Guangzhou 510006, Peoples R China | Guangdong Key Lab IoT Informat Technol, Guangzhou 510006, Peoples R China;

    Guilin Univ Elect Technol, Sch Artificial Intelligence, Guilin 541004, Peoples R China;

    Guangdong Univ Technol, Fac Sch Automat, Guangzhou 510006, Peoples R China | Minist Educ Key Lab Intelligent Detect & Internet Things Mfg, Guangzhou 510006, Peoples R China;

    Guangdong Univ Technol, Fac Sch Automat, Guangzhou 510006, Peoples R China | Guangdong Hong Kong Macao Joint Lab Smart Discre, Guangzhou 510006, Peoples R China;

  • Indexed in: Science Citation Index (SCI); Engineering Index (EI)
  • Format: PDF
  • Language: eng
  • CLC classification
  • Keywords

    Dictionary learning; Sparse representation; Nonconvex penalty; Difference of Convex functions (DC); Alternating optimization;

