
A New Sparse Restricted Boltzmann Machine


Abstract

Although the existing sparse restricted Boltzmann machine (SRBM) can drive only some of the hidden units to be activated, its major disadvantage is that the sparseness of the data distribution is usually overlooked, and the reconstruction error becomes very large once the hidden unit variables become sparse. Unlike SRBMs, which merely add a sparse constraint term to the energy function of the original restricted Boltzmann machine (RBM), this paper proposes an energy function constraint SRBM (ESRBM). The proposed ESRBM takes the sparseness of the data distribution into account, so the learned features better reflect the intrinsic features of the data. Simulations show that, compared with SRBM, ESRBM has a smaller reconstruction error and lower computational complexity, and that for supervised classification ESRBM achieves higher accuracy than SRBM, classification RBM, and a Softmax classifier.
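The abstract contrasts the conventional approach, adding a sparse constraint term to the RBM energy function, with the proposed ESRBM, whose exact energy-function constraint is not specified here. For reference only, below is a minimal sketch of the conventional sparse-RBM idea: a Bernoulli-Bernoulli RBM trained with one-step contrastive divergence (CD-1) whose hidden biases are nudged toward a target mean activation. The class name SparseRBM, the form of the penalty, and all hyperparameters are illustrative assumptions, not the paper's method.

```python
import numpy as np


class SparseRBM:
    """Bernoulli-Bernoulli RBM trained with CD-1 plus a sparsity penalty
    that nudges the mean hidden activation toward a small target value.
    Illustrative sketch only; not the ESRBM of the paper."""

    def __init__(self, n_visible, n_hidden, lr=0.05,
                 sparsity_target=0.05, sparsity_weight=0.1, seed=0):
        self.rng = np.random.default_rng(seed)
        self.W = 0.01 * self.rng.standard_normal((n_visible, n_hidden))
        self.b = np.zeros(n_visible)   # visible biases
        self.c = np.zeros(n_hidden)    # hidden biases
        self.lr = lr
        self.rho = sparsity_target     # desired mean hidden activation
        self.lam = sparsity_weight     # strength of the sparsity penalty

    @staticmethod
    def _sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def hidden_probs(self, v):
        return self._sigmoid(v @ self.W + self.c)

    def visible_probs(self, h):
        return self._sigmoid(h @ self.W.T + self.b)

    def cd1_step(self, v0):
        """One CD-1 parameter update on a mini-batch v0 of shape
        (n, n_visible). Returns the mean squared reconstruction error,
        the quantity the abstract uses to compare models."""
        ph0 = self.hidden_probs(v0)
        h0 = (self.rng.random(ph0.shape) < ph0).astype(float)  # sample hiddens
        pv1 = self.visible_probs(h0)                            # reconstruction
        ph1 = self.hidden_probs(pv1)

        n = v0.shape[0]
        self.W += self.lr * (v0.T @ ph0 - pv1.T @ ph1) / n
        self.b += self.lr * (v0 - pv1).mean(axis=0)
        # Sparsity term applied to the hidden biases only (a common
        # simplification): push the batch-mean activation q toward rho.
        q = ph0.mean(axis=0)
        self.c += self.lr * ((ph0 - ph1).mean(axis=0) + self.lam * (self.rho - q))

        return float(np.mean((v0 - pv1) ** 2))


# Toy usage on random binary data; shapes and hyperparameters are arbitrary.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = (rng.random((200, 64)) < 0.2).astype(float)
    rbm = SparseRBM(n_visible=64, n_hidden=32)
    for epoch in range(10):
        err = rbm.cd1_step(X)
    print("final reconstruction error:", err)
```

In this kind of regularizer the data distribution itself is not consulted, which is the shortcoming the abstract attributes to SRBM; the ESRBM is said to fold the sparseness of the data distribution into the energy-function constraint instead.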
