Journal: Applied Soft Computing

Guided Stochastic Gradient Descent Algorithm for inconsistent datasets


Abstract

The Stochastic Gradient Descent (SGD) Algorithm, despite its simplicity, is considered an effective and de facto standard optimization algorithm for machine learning classification models such as neural networks and logistic regression. However, SGD's gradient step depends on the random selection of a data instance; in this paper, this effect is termed data inconsistency. The proposed variation of SGD, the Guided Stochastic Gradient Descent (GSGD) Algorithm, tries to overcome this inconsistency in a given dataset through the greedy selection of consistent data instances for gradient descent. The empirical test results show the efficacy of the method. Moreover, GSGD has also been incorporated into and tested with other popular variations of SGD, such as Adam, Adagrad and Momentum. The guided search with GSGD achieves better convergence and classification accuracy within a limited time budget than canonical SGD and its other variations. Additionally, it maintains the same efficiency when evaluated on medical benchmark datasets with logistic regression for classification. (C) 2018 Elsevier B.V. All rights reserved.
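The core idea of guided instance selection can be illustrated with a minimal sketch on a logistic-regression model. Note that the candidate-pool size and the "loss closest to the current mean loss" consistency criterion below are illustrative assumptions for demonstration, not the paper's exact formulation:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def guided_sgd(X, y, lr=0.1, epochs=5, candidates=8, rng=None):
    """Sketch of guided instance selection for SGD (logistic regression).

    At each step, draw a small pool of candidate instances and greedily
    pick the one whose individual loss deviates least from the pool's
    mean loss, treating it as the most "consistent" instance. The chosen
    instance then drives an ordinary SGD update.
    """
    rng = np.random.default_rng(rng)
    n, d = X.shape
    w = np.zeros(d)
    eps = 1e-12  # guard against log(0)
    for _ in range(epochs * n):
        # Sample a candidate pool instead of a single random instance.
        idx = rng.choice(n, size=min(candidates, n), replace=False)
        p = sigmoid(X[idx] @ w)
        losses = -(y[idx] * np.log(p + eps)
                   + (1 - y[idx]) * np.log(1 - p + eps))
        # Greedy "consistency" pick: loss closest to the pool mean.
        pick = idx[np.argmin(np.abs(losses - losses.mean()))]
        # Canonical SGD update on the selected instance.
        grad = (sigmoid(X[pick] @ w) - y[pick]) * X[pick]
        w -= lr * grad
    return w
```

A plain-SGD baseline is recovered by setting `candidates=1`, which makes the selection step a no-op; comparing the two isolates the effect of the guided pick.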

