Journal of Chemometrics

Regression by L1 regularization of smart contrasts and sums (ROSCAS) beats PLS and elastic net in latent variable model



Abstract

This paper proposes a regression method, ROSCAS, which regularizes smart contrasts and sums of regression coefficients by an L1 penalty. The contrasts and sums are based on the sample correlation matrix of the predictors and are suggested by a latent variable regression model. The contrasts express the idea that a priori correlated predictors should have similar coefficients. The method has excellent predictive performance in situations where there are groups of predictors, each group representing an independent feature that influences the response. In particular, when the groups differ in size, ROSCAS can outperform LASSO, elastic net, partial least squares (PLS) and ridge regression by a factor of two or three in terms of mean squared error. In other simulation setups and on real data, ROSCAS performs competitively.
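The abstract describes the idea only at a high level. As an illustration, here is a minimal NumPy sketch of the general recipe it suggests: place the L1 penalty on linear combinations (contrasts and sums) of the coefficients rather than on the coefficients themselves. The specific choice of transform matrix `A` (eigenvectors of the sample correlation matrix), the penalty weights, and the names `roscas_like` and `lasso_cd` are assumptions made for illustration, not the paper's actual construction.

```python
import numpy as np

def lasso_cd(X, y, alpha, n_iter=500):
    """Plain lasso via coordinate descent:
    minimize (1/(2n))||y - X b||^2 + alpha ||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual excluding predictor j
            r = y - X @ beta + X[:, j] * beta[j]
            z = X[:, j] @ r
            # Soft-thresholding update
            beta[j] = np.sign(z) * max(abs(z) - alpha * n, 0.0) / col_sq[j]
    return beta

def contrast_sum_matrix(X):
    """One possible (hypothetical) choice of the transform A: rows are
    eigenvectors of the sample correlation matrix, so within a correlated
    group the leading row acts like a 'sum' and the others like 'contrasts'."""
    R = np.corrcoef(X, rowvar=False)
    _, V = np.linalg.eigh(R)
    return V.T

def roscas_like(X, y, A, alpha=0.1):
    """Sketch: L1-penalize theta = A @ beta instead of beta. For invertible
    A this is an ordinary lasso in theta on the transformed design X A^{-1};
    beta is recovered as A^{-1} theta."""
    X_t = X @ np.linalg.inv(A)      # transformed design
    theta = lasso_cd(X_t, y, alpha)  # sparse contrasts/sums
    return np.linalg.solve(A, theta)  # map back to original coefficients
```

Because the lasso then zeroes out individual contrasts, a priori correlated predictors are pushed toward equal coefficients, which matches the intuition stated in the abstract.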

