Venue: JMLR: Workshop and Conference Proceedings

Near-Optimal Sample Complexity Bounds for Maximum Likelihood Estimation of Multivariate Log-concave Densities



Abstract

We study the problem of learning multivariate log-concave densities with respect to a global loss function. We obtain the first upper bound on the sample complexity of the maximum likelihood estimator (MLE) for a log-concave density on $\mathbb{R}^d$, for all $d \geq 4$. Prior to this work, no finite sample upper bound was known for this estimator in more than $3$ dimensions. In more detail, we prove that for any $d \geq 1$ and $\epsilon > 0$, given $\tilde{O}_d((1/\epsilon)^{(d+3)/2})$ samples drawn from an unknown log-concave density $f_0$ on $\mathbb{R}^d$, the MLE outputs a hypothesis $h$ that with high probability is $\epsilon$-close to $f_0$ in squared Hellinger loss. A sample complexity lower bound of $\Omega_d((1/\epsilon)^{(d+1)/2})$ was previously known for any learning algorithm that achieves this guarantee. We thus establish that the sample complexity of the log-concave MLE is near-optimal, up to an $\tilde{O}(1/\epsilon)$ factor.
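The error metric in the abstract is the squared Hellinger loss. As a minimal illustration (not part of the paper), the sketch below numerically approximates the squared Hellinger distance between two one-dimensional densities under the common convention $H^2(f,g) = 1 - \int \sqrt{f(x)\,g(x)}\,dx$, and checks it against the closed form for two equal-variance Gaussians, $H^2 = 1 - e^{-(\mu_1-\mu_2)^2/(8\sigma^2)}$. The helper names `squared_hellinger` and `gauss` are hypothetical, chosen for this example only.

```python
import numpy as np

def squared_hellinger(f, g, xs):
    """Approximate H^2(f, g) = 1 - integral of sqrt(f * g) on the grid xs
    via the trapezoid rule (assumes xs covers essentially all of the mass)."""
    y = np.sqrt(f(xs) * g(xs))
    bc = float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(xs)))  # Bhattacharyya coefficient
    return 1.0 - bc

def gauss(mu, sigma):
    """Density of N(mu, sigma^2); Gaussians are log-concave."""
    return lambda x: np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

xs = np.linspace(-10.0, 10.0, 100001)
h2 = squared_hellinger(gauss(0.0, 1.0), gauss(1.0, 1.0), xs)

# Closed form for equal-variance Gaussians: 1 - exp(-(mu1 - mu2)^2 / (8 sigma^2))
closed = 1.0 - np.exp(-1.0 / 8.0)
```

The guarantee in the abstract says that with $\tilde{O}_d((1/\epsilon)^{(d+3)/2})$ samples, the MLE's output $h$ satisfies $H^2(h, f_0) \leq \epsilon$ with high probability; the snippet only illustrates how that loss is defined, not the estimator itself.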
