IEEE International Symposium on Information Theory

Optimality of Least-squares for Classification in Gaussian-Mixture Models



Abstract

We consider the problem of learning the coefficients of a linear classifier through Empirical Risk Minimization with a convex loss function in the high-dimensional setting. In particular, we introduce an approach to characterize the best achievable classification risk among convex losses, when data points follow a standard Gaussian-mixture model. Importantly, we prove that the square loss function achieves the minimum classification risk for this data model. Our numerical illustrations verify the theoretical results and show that they are accurate even for relatively small problem dimensions.
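To make the setting concrete, the sketch below is an illustrative example only, not the paper's exact model or analysis: it draws data from a symmetric two-component Gaussian mixture (labels y = ±1, x = y·mu + standard Gaussian noise), trains a linear classifier by Empirical Risk Minimization with the square loss (ordinary least squares), and estimates its classification risk on held-out samples. The dimension, sample sizes, and signal strength are assumed values chosen for demonstration.

import numpy as np

# Illustrative sketch (assumed setup, not taken from the paper):
# binary Gaussian-mixture data and a linear classifier fit by square-loss ERM.

rng = np.random.default_rng(0)
d, n_train, n_test = 100, 300, 10_000   # assumed problem sizes
mu = rng.standard_normal(d)
mu *= 2.0 / np.linalg.norm(mu)          # mean vector with an assumed signal strength

def sample(n):
    y = rng.choice([-1.0, 1.0], size=n)
    x = y[:, None] * mu + rng.standard_normal((n, d))
    return x, y

X_tr, y_tr = sample(n_train)
X_te, y_te = sample(n_test)

# Square-loss ERM: w_hat = argmin_w ||X w - y||^2, solved by least squares
w_hat, *_ = np.linalg.lstsq(X_tr, y_tr, rcond=None)

# Classification risk estimated on fresh test data
test_error = np.mean(np.sign(X_te @ w_hat) != y_te)
print(f"estimated test classification error: {test_error:.3f}")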
