Journal of Applied Statistics

Robust sparse regression by modeling noise as a mixture of Gaussians



Abstract

Regression analysis has proven to be an effective tool in a wide variety of fields. In many regression models, the noise is assumed to follow a specific distribution. Although this greatly facilitates theoretical analysis, the model-fitting performance may be poor when the assumed noise distribution deviates substantially from the real noise. At the same time, given the complexity of real-world data, the model is also expected to be robust. Without making any assumption about the noise, we propose in this paper a novel sparse regression method called MoG-Lasso, which directly models the noise in linear regression via a mixture of Gaussian distributions (MoG). Meanwhile, the L1 (Lasso) penalty is included as part of the loss function of MoG-Lasso to enhance its ability to identify a sparse model. The parameters of MoG-Lasso are estimated by an efficient algorithm based on EM (expectation maximization) and ADMM (alternating direction method of multipliers). Experiments on simulated and real data contaminated by complex noise show that MoG-Lasso performs better than several other popular methods, including Lasso, LAD-Lasso, and Huber-Lasso, in both 'p < n' and 'p > n' situations.
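The abstract does not give the algorithmic details, so the following is only a minimal sketch of the general idea: residuals are modeled as zero-mean Gaussian components with different variances, and an EM-style loop alternates responsibilities, mixture updates, and a weighted Lasso step for the coefficients. It is not the authors' EM+ADMM procedure; in particular, the weighted Lasso is solved here by rescaling rows and calling scikit-learn's coordinate-descent Lasso instead of ADMM, and all names (mog_lasso, n_components, alpha) are illustrative.

```python
# Hypothetical sketch of a MoG-based robust Lasso (not the paper's exact algorithm).
import numpy as np
from sklearn.linear_model import Lasso

def mog_lasso(X, y, alpha=0.1, n_components=2, n_iter=50, tol=1e-6):
    n, p = X.shape
    beta = np.zeros(p)
    pi = np.full(n_components, 1.0 / n_components)          # mixing weights
    sigma2 = np.var(y) * np.logspace(0, 1, n_components)    # component variances
    for _ in range(n_iter):
        r = y - X @ beta                                     # current residuals
        # E-step: responsibility of each zero-mean Gaussian noise component
        dens = np.exp(-0.5 * r[:, None] ** 2 / sigma2) / np.sqrt(2 * np.pi * sigma2)
        gamma = pi * dens
        gamma /= gamma.sum(axis=1, keepdims=True) + 1e-12
        # M-step: update mixture weights and variances
        nk = gamma.sum(axis=0)
        pi = nk / n
        sigma2 = np.maximum((gamma * r[:, None] ** 2).sum(axis=0) / (nk + 1e-12), 1e-8)
        # M-step for beta: weighted Lasso, solved by rescaling rows by sqrt(weight)
        w = (gamma / sigma2).sum(axis=1)                     # per-sample precision
        sw = np.sqrt(w)
        beta_new = Lasso(alpha=alpha, fit_intercept=False).fit(X * sw[:, None], y * sw).coef_
        if np.max(np.abs(beta_new - beta)) < tol:
            beta = beta_new
            break
        beta = beta_new
    return beta, pi, sigma2
```

In this sketch, samples assigned mainly to the high-variance component receive small weights in the Lasso step, which is what gives the fit its robustness to outliers and heavy-tailed noise.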
