International Conference on ICT for Smart Society

Gradient descent and normal equations on cost function minimization for online predictive using linear regression with multiple variables



Abstract

Cost function minimization is essential to finding a good model for linear regression. This paper prototypes and examines two well-known algorithms for minimizing the cost function in online prediction, namely gradient descent and the normal equation. The data used in this paper come from Open Data and are split into three parts: training, test, and cross-validation datasets. Empirical results on a number of datasets show that the normal equation performs better than gradient descent (cross correlation 0.0117 higher and relative absolute error 0.5154 lower).
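The two minimizers the abstract compares are standard for multivariate linear regression: gradient descent iterates theta := theta - (alpha/m) * X^T (X theta - y), while the normal equation solves theta = (X^T X)^{-1} X^T y in closed form. The sketch below (not the authors' code) contrasts them on synthetic data; the dataset, learning rate, and iteration count are illustrative assumptions, not the paper's setup.

import numpy as np

rng = np.random.default_rng(0)
m, n = 200, 3                                               # examples, features
X = np.hstack([np.ones((m, 1)), rng.normal(size=(m, n))])   # intercept column + features
true_theta = np.array([2.0, -1.0, 0.5, 3.0])                # assumed ground truth
y = X @ true_theta + 0.1 * rng.normal(size=m)               # noisy targets

def cost(theta):
    # Squared-error cost J(theta) = (1/2m) * ||X theta - y||^2
    r = X @ theta - y
    return (r @ r) / (2 * m)

# Gradient descent: repeat theta := theta - (alpha/m) * X^T (X theta - y)
theta_gd = np.zeros(n + 1)
alpha = 0.1                                                 # assumed learning rate
for _ in range(5000):                                       # assumed iteration budget
    theta_gd -= alpha * (X.T @ (X @ theta_gd - y)) / m

# Normal equation: solve (X^T X) theta = X^T y directly,
# avoiding an explicit matrix inverse for numerical stability
theta_ne = np.linalg.solve(X.T @ X, X.T @ y)

print("gradient descent cost:", cost(theta_gd))
print("normal equation  cost:", cost(theta_ne))

The usual trade-off applies: the normal equation is exact in one step but costs O(n^3) in the number of features, whereas gradient descent scales to large n at the price of tuning the learning rate and iteration count.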
