Journal: Machine Learning

High-dimensional Bayesian optimization using low-dimensional feature spaces

Abstract

Bayesian optimization (BO) is a powerful approach for seeking the global optimum of expensive black-box functions and has proven successful for fine-tuning hyper-parameters of machine learning models. However, BO is practically limited to optimizing 10-20 parameters. To scale BO to high dimensions, we usually make structural assumptions about the decomposition of the objective and/or exploit the intrinsic lower dimensionality of the problem, e.g. by using linear projections. We could achieve a higher compression rate with nonlinear projections, but learning these nonlinear embeddings typically requires large amounts of data, which conflicts with BO's premise of a relatively small evaluation budget. To address this challenge, we propose to learn a low-dimensional feature space jointly with (a) the response surface and (b) a reconstruction mapping. Our approach allows the acquisition function of BO to be optimized in the lower-dimensional subspace, which significantly simplifies the optimization problem. We reconstruct the original parameter space from the lower-dimensional subspace in order to evaluate the black-box function. For meaningful exploration, we solve a constrained optimization problem.
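
The abstract describes the method only at a high level. The following is a minimal, hypothetical sketch of BO in a low-dimensional feature space, not the authors' implementation: the paper learns a nonlinear embedding jointly with the response surface and a reconstruction mapping, whereas this sketch substitutes a fixed random linear projection as the encoder, its pseudo-inverse as the reconstruction, and simple clipping in place of the paper's constrained optimization. All names and dimensions (objective, D, d, n_iter) are illustrative assumptions.

```python
# Sketch: Bayesian optimization in a low-dimensional feature space.
# NOT the paper's method: encoder/decoder are a fixed random linear map
# and its pseudo-inverse; the paper learns a nonlinear embedding jointly
# with the GP response surface and handles exploration via a constrained
# optimization problem (here approximated by clipping to the box).
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)
D, d, n_init, n_iter = 100, 5, 10, 30           # ambient dim, feature dim, budget

def objective(x):                               # toy stand-in for the expensive black box
    return np.sum((x[:10] - 0.2) ** 2)          # only 10 of the 100 dims matter

A = rng.standard_normal((d, D)) / np.sqrt(D)    # "encoder": z = A x (fixed here)
A_pinv = np.linalg.pinv(A)                      # "reconstruction": x ~= A_pinv z

def reconstruct(z):
    # Map a low-dimensional candidate back to the original box [-1, 1]^D.
    return np.clip(A_pinv @ z, -1.0, 1.0)

# Initial design in the d-dimensional feature space, evaluated in the original space.
Z = rng.uniform(-1, 1, size=(n_init, d))
y = np.array([objective(reconstruct(z)) for z in Z])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

def expected_improvement(z, y_best):
    # Acquisition function evaluated in the low-dimensional subspace (minimization).
    mu, sigma = gp.predict(z.reshape(1, -1), return_std=True)
    mu, sigma = mu[0], max(sigma[0], 1e-9)
    gamma = (y_best - mu) / sigma
    return sigma * (gamma * norm.cdf(gamma) + norm.pdf(gamma))

for _ in range(n_iter):
    gp.fit(Z, y)                                # response surface over the feature space
    y_best = y.min()
    # Optimize the acquisition in the low-dimensional subspace (random search for brevity).
    cand = rng.uniform(-1, 1, size=(2000, d))
    z_next = cand[np.argmax([expected_improvement(z, y_best) for z in cand])]
    y_next = objective(reconstruct(z_next))     # reconstruct, then evaluate the black box
    Z = np.vstack([Z, z_next])
    y = np.append(y, y_next)

print("best value found:", y.min())
```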