Journal: Neurocomputing

Smoothing quantile regression for a distributed system



Abstract

Quantile regression has become a popular alternative to least squares regression, providing a comprehensive description of the response distribution and robustness against heavy-tailed error distributions. However, the nonsmooth quantile loss poses new challenges to distributed estimation in both computation and theoretical development. To address this challenge, we use a convolution-type smoothing approach and its Taylor expansion to transform the nondifferentiable quantile loss function into a convex quadratic loss function, which admits a fast and scalable algorithm for optimization over massive, high-dimensional data. The proposed distributed estimators are both computationally and communication efficient; moreover, only gradient information is communicated at each iteration. Theoretically, we show that, after a certain number of iterations, the resulting estimator is statistically as efficient as the global estimator, without any restriction on the number of machines. Both simulations and data analysis are conducted to illustrate the finite-sample performance of the proposed methods. (c) 2021 Elsevier B.V. All rights reserved.
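The core idea in the abstract — replace the nondifferentiable quantile check loss with a convolution-smoothed surrogate so that only gradients need to be communicated between machines — can be sketched in a few lines. The sketch below is an illustration, not the paper's exact algorithm: the Gaussian kernel, bandwidth `h`, step size, number of simulated machines, and data-generating process are all assumptions made for the example. With a Gaussian kernel, the smoothed loss has the closed-form derivative l_h'(u) = tau − Φ(−u/h), which each machine evaluates on its local residuals.

```python
# Hedged sketch: convolution-smoothed quantile regression fitted by gradient
# descent, with per-machine gradients averaged to mimic a distributed system.
# Kernel (Gaussian), bandwidth h, step size, and data are illustrative choices.
import math
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 1 + 2*x + N(0,1) noise, so at tau = 0.5 (the median)
# the true regression coefficients are intercept 1 and slope 2.
n, tau, h = 5000, 0.5, 0.5
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])          # design matrix with intercept
y = 1.0 + 2.0 * x + rng.normal(size=n)

# Standard normal CDF, applied elementwise via math.erf.
Phi = np.vectorize(lambda t: 0.5 * (1.0 + math.erf(t / math.sqrt(2.0))))

def local_gradient(Xm, ym, beta):
    """Gradient of the Gaussian-smoothed quantile loss on one machine.

    The smoothed check loss satisfies l_h'(u) = tau - Phi(-u/h), so the
    local gradient is -Xm^T (tau - Phi(-u/h)) / n_m for residuals u.
    """
    u = ym - Xm @ beta
    return -Xm.T @ (tau - Phi(-u / h)) / len(ym)

# Split the sample across 4 simulated machines; in each round only the
# gradient vectors travel to the coordinator, never the raw data.
machines = list(zip(np.array_split(X, 4), np.array_split(y, 4)))

beta = np.zeros(2)
for _ in range(300):
    grad = np.mean([local_gradient(Xm, ym, beta) for Xm, ym in machines],
                   axis=0)                     # coordinator averages gradients
    beta -= 0.5 * grad                         # plain gradient step

beta_hat = beta                                # should be close to (1, 2)
```

Because the smoothed loss is differentiable (and locally quadratic), plain gradient steps converge quickly, and each communication round costs only one d-dimensional vector per machine — the property the abstract highlights.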

