Journal of Computational and Applied Mathematics

Incremental DC optimization algorithm for large-scale clusterwise linear regression


Abstract

The objective function in the nonsmooth optimization model of the clusterwise linear regression (CLR) problem with the squared regression error is represented as a difference of two convex functions. Then, using the difference of convex algorithm (DCA) approach, the CLR problem is replaced by a sequence of smooth unconstrained optimization subproblems. A new algorithm based on the DCA and the incremental approach is designed to solve the CLR problem. We apply a quasi-Newton method to solve the subproblems. The proposed algorithm is evaluated on several synthetic and real-world regression data sets and compared with other algorithms for CLR. The results demonstrate that the DCA-based algorithm is efficient for solving CLR problems with a large number of data points and, in particular, outperforms other algorithms when the number of input variables is small. (C) 2020 Elsevier B.V. All rights reserved.

