Journal of nonparametric statistics

On shrinking minimax convergence in nonparametric statistics


Abstract

'...if we are prepared to assume that the unknown density has k derivatives, then ... the optimal mean integrated squared error is of order n^{-2k/(2k+1)} ...' The quotation is from Silverman [(1986), Density Estimation for Statistics and Data Analysis, London: Chapman & Hall], and its assertion rests on a classical minimax lower bound that is a pillar of modern nonparametric statistics. This paper proposes a new minimax methodology that implies a faster-decreasing minimax lower bound attainable by a data-driven estimator; the same estimator is also minimax under the classical approach. The recommendation is to test the performance of estimators via both the new and the classical minimax approaches.
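A minimal sketch of the classical rate quoted above (illustration only, not part of the paper): the function below, with a hypothetical name, evaluates the order n^{-2k/(2k+1)} of the optimal MISE for a density with k derivatives, showing that the rate approaches the parametric order n^{-1} as smoothness k grows.

```python
def classical_minimax_rate(n, k):
    """Order of the optimal mean integrated squared error
    for estimating a density with k derivatives from n observations:
    n^(-2k/(2k+1)), the classical minimax rate cited from Silverman (1986)."""
    return n ** (-2 * k / (2 * k + 1))

# More assumed smoothness => faster decay, approaching the parametric n^(-1):
for k in (1, 2, 4):
    print(f"k={k}: rate at n=10000 is {classical_minimax_rate(10_000, k):.3e}")
```

The exponent 2k/(2k+1) equals 2/3 for k=1 and tends to 1 as k increases, which is why stronger smoothness assumptions buy faster convergence but never beat the parametric order.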
