Neurocomputing

Upper bound of Bayesian generalization error in non-negative matrix factorization



Abstract

Non-negative matrix factorization (NMF) is a new knowledge discovery method used in text mining, signal processing, bioinformatics, and consumer analysis. However, its basic properties as a learning machine have not yet been clarified, because it is not a regular statistical model; as a result, a theoretical optimization method for NMF has not yet been established. In this paper, we study the real log canonical threshold of NMF and give an upper bound of the generalization error in Bayesian learning. The results show that the generalization error of the matrix factorization can be made smaller than that of regular statistical models if Bayesian learning is applied. (C) 2017 Elsevier B.V. All rights reserved.
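For context (a minimal sketch of the standard singular learning theory background, not part of the abstract itself): in Watanabe's theory the expected Bayes generalization error after n observations behaves asymptotically as

\[
  \mathbb{E}[G_n] = \frac{\lambda}{n} + o\!\left(\frac{1}{n}\right),
\]

where $\lambda$ is the real log canonical threshold (RLCT) of the model. For a regular statistical model with $d$ parameters, $\lambda = d/2$, so $\mathbb{E}[G_n] \approx d/(2n)$. NMF is a singular model with $\lambda \le d/2$, so an upper bound on its RLCT, as derived in the paper, directly yields an upper bound on the Bayesian generalization error that can be smaller than the regular-model rate.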
