IEEE Transactions on Neural Networks and Learning Systems

Efficient Probabilistic Classification Vector Machine With Incremental Basis Function Selection


Abstract

Probabilistic classification vector machine (PCVM) is a sparse learning approach that aims to address the stability problems of the relevance vector machine on classification problems. Because PCVM is based on the expectation-maximization algorithm, it is sensitive to initialization, can converge to local minima, and, as a Bayesian estimation method, yields only point estimates. Another disadvantage is that PCVM is not efficient on large data sets. To address these problems, this paper proposes an efficient PCVM (EPCVM) that sequentially adds or deletes basis functions according to marginal likelihood maximization, enabling efficient training. Because of the truncated prior used in EPCVM, two approximation techniques, i.e., Laplace approximation and expectation propagation (EP), have been used to implement EPCVM and obtain full Bayesian solutions. We have verified Laplace approximation and EP against a hybrid Monte Carlo approach. The generalization performance and computational effectiveness of EPCVM are extensively evaluated. Theoretical discussions using Rademacher complexity reveal the relationship between the sparsity and the generalization bound of EPCVM.
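The sequential add/delete/re-estimate rule driven by marginal likelihood maximization can be illustrated on a simpler regression surrogate. The sketch below is an assumption-laden illustration of fast sparse Bayesian learning in the style of Tipping and Faul, not the paper's EPCVM code: it uses a plain Gaussian prior and a fixed noise precision `beta`, whereas EPCVM applies the analogous rule to classification with a truncated prior handled via Laplace approximation or EP. The function name and the cyclic candidate sweep are choices made here for illustration.

```python
import numpy as np

def fast_marginal_selection(Phi, t, beta=100.0, n_iter=200):
    """Hypothetical sketch: incrementally add/delete basis functions by
    maximizing the marginal likelihood (regression surrogate, Gaussian
    prior, fixed noise precision beta)."""
    N, M = Phi.shape
    alpha = np.full(M, np.inf)                 # inf => basis function pruned
    norms = np.sum(Phi**2, axis=0)
    # seed with the basis function best aligned with the targets
    q2 = (Phi.T @ t)**2 / norms
    i0 = int(np.argmax(q2))
    alpha[i0] = norms[i0] / max(q2[i0] - 1.0 / beta, 1e-12)

    for it in range(n_iter):
        A = np.flatnonzero(np.isfinite(alpha))  # current active set
        PhiA = Phi[:, A]
        Sigma = np.linalg.inv(np.diag(alpha[A]) + beta * PhiA.T @ PhiA)
        T = PhiA.T @ Phi                        # |A| x M cross-products
        # sparsity (S) and quality (Q) factors for every candidate m
        S = beta * norms - beta**2 * np.einsum('am,ab,bm->m', T, Sigma, T)
        Q = beta * (Phi.T @ t) - beta**2 * T.T @ (Sigma @ (PhiA.T @ t))
        s, q = S.copy(), Q.copy()
        denom = alpha[A] - S[A]                 # deflate factors for members
        denom[np.abs(denom) < 1e-12] = 1e-12
        s[A] = alpha[A] * S[A] / denom
        q[A] = alpha[A] * Q[A] / denom
        m = it % M                              # sweep candidates cyclically
        theta = q[m]**2 - s[m]
        if theta > 0:
            alpha[m] = s[m]**2 / theta          # add or re-estimate alpha_m
        elif np.isfinite(alpha[m]) and len(A) > 1:
            alpha[m] = np.inf                   # delete from the model
    A = np.flatnonzero(np.isfinite(alpha))
    PhiA = Phi[:, A]
    Sigma = np.linalg.inv(np.diag(alpha[A]) + beta * PhiA.T @ PhiA)
    mu = beta * Sigma @ (PhiA.T @ t)            # posterior mean weights
    return A, mu

# Usage on synthetic data: fit a noisy sine with an RBF design matrix.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 100)
Phi = np.exp(-(x[:, None] - x[None, :])**2 / 2.0)
t = np.sin(x) + 0.1 * rng.standard_normal(100)
active, mu = fast_marginal_selection(Phi, t)
```

The key property illustrated is that a candidate is kept only while its quality factor exceeds its sparsity factor (q_m^2 > s_m), so the final model uses a small subset of the available basis functions.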
