Pacific-Asia Conference on Knowledge Discovery and Data Mining

Preconditioning an Artificial Neural Network Using Naive Bayes



Abstract

Logistic Regression (LR) is a workhorse of the statistics community and a state-of-the-art machine learning classifier. It learns a linear model from inputs to outputs by optimizing the Conditional Log-Likelihood (CLL) of the data. Recently, it has been shown that preconditioning LR with a Naive Bayes (NB) model speeds up LR learning many-fold. One can, however, train a linear model by optimizing the mean-square-error (MSE) instead of the CLL, which yields an Artificial Neural Network (ANN) with no hidden layer. In this work, we study the effect of NB preconditioning on such an ANN classifier. Optimizing MSE instead of CLL may produce a lower-bias classifier and hence better performance on big datasets. We show that NB preconditioning can speed up convergence significantly, and that optimizing a linear model with MSE leads to a lower-bias classifier than optimizing with CLL. We also compare performance against the state-of-the-art Random Forest classifier.
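The idea in the abstract can be sketched in a few lines: initialize the weights of a no-hidden-layer linear classifier at the Naive Bayes log-odds solution instead of at zero, then fine-tune by gradient descent on the MSE of the sigmoid output. This is an illustrative sketch on synthetic binary data, not the paper's implementation; the helper names, smoothing constant, and learning-rate settings are assumptions.

```python
import numpy as np

def nb_log_odds(X, y, alpha=1.0):
    """Naive Bayes log-odds weights (Laplace-smoothed) for binary features.
    These serve as the preconditioned initialization of the linear model."""
    X1, X0 = X[y == 1], X[y == 0]
    p1 = (X1.sum(0) + alpha) / (len(X1) + 2 * alpha)   # P(x_j=1 | y=1)
    p0 = (X0.sum(0) + alpha) / (len(X0) + 2 * alpha)   # P(x_j=1 | y=0)
    w = np.log(p1 / p0) - np.log((1 - p1) / (1 - p0))
    b = np.log((len(X1) + alpha) / (len(X0) + alpha)) \
        + np.log((1 - p1) / (1 - p0)).sum()
    return w, b

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_mse(X, y, w, b, lr=0.5, epochs=200):
    """Gradient descent on the mean-square-error of the sigmoid output --
    the 'ANN with no hidden layer' view, as opposed to optimizing CLL."""
    for _ in range(epochs):
        p = sigmoid(X @ w + b)
        grad = (p - y) * p * (1 - p)        # dMSE/dz through the sigmoid
        w -= lr * X.T @ grad / len(y)
        b -= lr * grad.mean()
    return w, b

# Synthetic, linearly separable binary data (illustrative only).
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(400, 5)).astype(float)
y = (X[:, 0] + X[:, 1] >= 1).astype(float)

w0, b0 = nb_log_odds(X, y)                  # NB preconditioning
w, b = train_mse(X, y, w0.copy(), b0)       # MSE fine-tuning
acc = ((sigmoid(X @ w + b) > 0.5) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

Starting from the NB solution rather than zeros means the first MSE gradient steps begin near a sensible decision boundary, which is the source of the convergence speed-up the abstract reports.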
