International Conference on Fuzzy Theory and Its Applications

Comparison of Gradient Descent Methods in Online Fuzzy Co-clustering



Abstract

Fuzzy co-clustering schemes, including Fuzzy Co-Clustering induced by Multinomial Mixture models (FCCMM), are promising approaches for analyzing object-item co-occurrence information such as document-keyword frequencies and customer-product purchase-history transactions. However, such co-occurrence datasets are generally maintained as very large matrices and cannot be handled by conventional batch algorithms. To deal with this problem, online FCCMM (OFCCMM), which sequentially loads a single object at a time, has been proposed. Conventional OFCCMM uses stochastic gradient descent (SGD) to update parameters. SGD generally has the drawbacks that convergence is slow and that it is susceptible to oscillation and saddle points. Many improvements on SGD have been proposed, such as Momentum SGD, Nesterov's accelerated gradient method, AdaGrad, and Adam. In this study, we introduce various gradient descent methods into OFCCMM and observe their characteristics and performance through numerical experiments.
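The sketch below shows, in NumPy, the generic update rules of the gradient descent variants named in the abstract (SGD, Momentum SGD, Nesterov's accelerated gradient, AdaGrad, and Adam). It is not the paper's actual OFCCMM parameter update; the function names, hyperparameter defaults, and the toy quadratic objective are illustrative assumptions added here for clarity.

```python
import numpy as np

def sgd(theta, grad, state, lr=0.05):
    """Vanilla SGD: one step along the negative stochastic gradient."""
    return theta - lr * grad, state

def momentum(theta, grad, state, lr=0.05, beta=0.9):
    """Momentum SGD: a velocity term damps oscillation across steps."""
    v = beta * state.get("v", np.zeros_like(theta)) + grad
    state["v"] = v
    return theta - lr * v, state

def adagrad(theta, grad, state, lr=0.05, eps=1e-8):
    """AdaGrad: per-coordinate steps shrink with accumulated squared gradients."""
    g2 = state.get("g2", np.zeros_like(theta)) + grad ** 2
    state["g2"] = g2
    return theta - lr * grad / (np.sqrt(g2) + eps), state

def adam(theta, grad, state, lr=0.05, beta1=0.9, beta2=0.999, eps=1e-8):
    """Adam: bias-corrected first/second moment estimates of the gradient."""
    t = state.get("t", 0) + 1
    m = beta1 * state.get("m", np.zeros_like(theta)) + (1 - beta1) * grad
    v = beta2 * state.get("v", np.zeros_like(theta)) + (1 - beta2) * grad ** 2
    state.update(t=t, m=m, v=v)
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    return theta - lr * m_hat / (np.sqrt(v_hat) + eps), state

def nesterov(theta, grad_fn, state, lr=0.05, beta=0.9):
    """Nesterov's accelerated gradient: the gradient is taken at a look-ahead point."""
    v = state.get("v", np.zeros_like(theta))
    g = grad_fn(theta - lr * beta * v)
    v = beta * v + g
    state["v"] = v
    return theta - lr * v, state

if __name__ == "__main__":
    # Toy check: minimize f(theta) = ||theta||^2, whose gradient is 2 * theta.
    grad_fn = lambda th: 2.0 * th
    for step in (sgd, momentum, adagrad, adam):
        theta, state = np.array([3.0, -2.0]), {}
        for _ in range(200):
            theta, state = step(theta, grad_fn(theta), state)
        print(step.__name__, theta)
    theta, state = np.array([3.0, -2.0]), {}
    for _ in range(200):
        theta, state = nesterov(theta, grad_fn, state)
    print("nesterov", theta)
```

In the online setting described in the abstract, each of these step rules would be applied to the model parameters after a single object is loaded; the exact OFCCMM parameterization is defined in the paper and is not reproduced here.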
