IEEE Transactions on Neural Networks and Learning Systems

Convergence Analysis of Single Latent Factor-Dependent, Nonnegative, and Multiplicative Update-Based Nonnegative Latent Factor Models

Abstract

A single latent factor (LF)-dependent, nonnegative, and multiplicative update (SLF-NMU) learning algorithm is highly efficient at building a nonnegative LF (NLF) model defined on a high-dimensional and sparse (HiDS) matrix. However, the convergence characteristics of such NLF models have never been justified in theory. To address this issue, this study conducts a rigorous convergence analysis of an SLF-NMU-based NLF model. The main idea is twofold: 1) proving that its learning objective is nonincreasing under its SLF-NMU-based learning rules, by constructing specific auxiliary functions; and 2) proving that it converges to a stable equilibrium point under these learning rules, by analyzing the Karush-Kuhn-Tucker (KKT) conditions of its learning objective. Experimental results on ten HiDS matrices from real applications provide numerical evidence supporting the correctness of the proof.
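The paper's proofs are not reproduced here, but the flavor of the approach can be illustrated. The sketch below is an assumption-laden illustration, not the authors' exact algorithm: the function name `slf_nmu_sweep`, the dict-of-observed-entries representation, and the `eps` smoothing term are all choices made for this example. It applies Lee–Seung-style multiplicative updates restricted to the observed entries of a sparse matrix, so each factor is rescaled by a ratio of nonnegative sums — which is exactly why nonnegativity is maintained without any projection step, the property the auxiliary-function argument builds on.

```python
import numpy as np

def slf_nmu_sweep(R, X, Y, eps=1e-12):
    """One sweep of a multiplicative update restricted to observed entries.

    R is a dict {(u, i): r} holding only the observed entries of the
    HiDS matrix; X and Y are nonnegative latent-factor arrays. Each
    factor is multiplied by a ratio of nonnegative sums, so it can
    never become negative.
    """
    # Update X with Y held fixed.
    up = np.zeros_like(X)
    down = np.full_like(X, eps)  # eps guards against division by zero
    for (u, i), r in R.items():
        up[u] += Y[i] * r                 # numerator: observed ratings
        down[u] += Y[i] * (X[u] @ Y[i])   # denominator: current estimates
    X *= up / down
    # Update Y with the freshly updated X held fixed.
    up = np.zeros_like(Y)
    down = np.full_like(Y, eps)
    for (u, i), r in R.items():
        up[i] += X[u] * r
        down[i] += X[u] * (X[u] @ Y[i])
    Y *= up / down
    return X, Y

def observed_loss(R, X, Y):
    """Squared error summed over observed entries only."""
    return sum((r - float(X[u] @ Y[i])) ** 2 for (u, i), r in R.items())
```

In practice the loss over observed entries decreases monotonically across sweeps while both factor arrays stay elementwise nonnegative, mirroring (informally) the two properties the paper proves via auxiliary functions and KKT analysis.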
