International Conference on Algorithms and Architectures for Parallel Processing

Broad Learning System with Proportional-Integral-Differential Gradient Descent

Abstract

The broad learning system (BLS) has attracted much attention in recent years due to its fast training speed and good generalization ability. Most existing BLS-based algorithms use the least-squares method to compute the output weights. As the training data set grows, this approach severely degrades training efficiency, and the resulting solution also becomes unstable. To solve this problem, we design a new gradient descent (GD) method based on the proportional-integral-differential (PID) technique to replace the least-squares operation in existing BLS algorithms; the resulting model is called PID-GD-BLS. Extensive experimental results on four benchmark data sets show that PID-GD converges faster than traditional optimization algorithms such as Adam and AdaMod, and that the generalization performance and stability of PID-GD-BLS are much better than those of BLS and its variants. This study provides a new direction for BLS optimization and a better solution for BLS-based data mining.
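The abstract does not spell out the update rule, so the following is a minimal NumPy sketch of the general idea, assuming a ridge-regularized output-weight objective: the gradient plays the role of the PID error signal, with a proportional term (the current gradient), an integral term (here an exponentially decayed gradient sum, a common stabilizing choice in PID-style optimizers), and a differential term (the change between successive gradients). The feature matrix A, the targets Y, the gains kp/ki/kd, and all dimensions below are hypothetical illustrations, not values from the paper.

import numpy as np

# Hypothetical toy instance: A is the broad feature matrix (mapped feature
# nodes and enhancement nodes concatenated), Y the targets, W the output
# weights that standard BLS obtains by ridge-regularized least squares.
rng = np.random.default_rng(0)
A = rng.standard_normal((500, 80))      # 500 samples, 80 broad features
Y = rng.standard_normal((500, 3))       # 3 output dimensions
lam = 1e-3                              # ridge regularization strength

# Closed-form least-squares baseline that PID-GD is meant to replace.
W_ls = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ Y)

def pid_gd(A, Y, lam, lr=1e-4, kp=1.0, ki=0.1, kd=0.5, beta=0.9,
           n_iters=2000):
    # Minimize ||A W - Y||_F^2 + lam * ||W||_F^2 with a PID-shaped update.
    W = np.zeros((A.shape[1], Y.shape[1]))
    integral = np.zeros_like(W)         # leaky integral of past gradients
    prev_g = np.zeros_like(W)
    for _ in range(n_iters):
        g = 2.0 * (A.T @ (A @ W - Y) + lam * W)  # current gradient (P term)
        integral = beta * integral + g           # I term, decayed for stability
        W -= lr * (kp * g + ki * integral + kd * (g - prev_g))  # P + I + D
        prev_g = g
    return W

W_pid = pid_gd(A, Y, lam)
print(np.linalg.norm(W_pid - W_ls))     # gap shrinks as PID-GD converges

The motivation for the swap, as the abstract describes it: the closed-form solve must form and factor a matrix whose cost grows with both the sample count and the feature width, and it can be numerically unstable when that matrix is ill-conditioned, whereas each PID-GD iteration needs only matrix products, and the differential term damps the oscillations that a plain gradient (P-only) update tends to exhibit.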
