International Journal of Quantum Chemistry

Nonlinear gradient denoising: Finding accurate extrema from inaccurate functional derivatives

Abstract

A method for nonlinear optimization with machine learning (ML) models, called nonlinear gradient denoising (NLGD), is developed and applied with ML approximations to the kinetic energy density functional in orbital-free density functional theory. Because the gradients of ML models are systematically inaccurate, in particular when the data are very high-dimensional, the optimization must be constrained to the data manifold. We use nonlinear kernel principal component analysis (PCA) to locally reconstruct the manifold, enabling a projected gradient descent along it. A thorough analysis of the method is given via a simple model designed to clarify the concepts presented. Additionally, NLGD is compared with the local PCA method used in previous work. Our method is shown to be superior when the data manifold is highly nonlinear and high-dimensional. Further applications of the method in both density functional theory and ML are discussed. (c) 2015 Wiley Periodicals, Inc.
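The projected gradient descent that the abstract describes can be illustrated with a minimal sketch. This is not the paper's algorithm: it substitutes a local linear PCA tangent-space projection for the nonlinear kernel PCA reconstruction used by NLGD (closer in spirit to the local PCA baseline mentioned above), and the function names (local_tangent_projector, denoised_gradient_descent, noisy_grad) and the toy circle data are hypothetical illustrations only.

```python
import numpy as np

def local_tangent_projector(X_train, x, k=10, q=2):
    """Estimate a projector onto the local tangent space of the data
    manifold at x, via PCA on the k nearest training samples.
    (Hypothetical helper: a linear stand-in for the kernel PCA
    reconstruction described in the abstract.)"""
    dists = np.linalg.norm(X_train - x, axis=1)
    nbrs = X_train[np.argsort(dists)[:k]]        # k nearest neighbors of x
    centered = nbrs - nbrs.mean(axis=0)
    # Top-q right singular vectors span the estimated tangent space.
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    T = Vt[:q].T                                 # (dim, q) tangent basis
    return T @ T.T                               # (dim, dim) projector

def denoised_gradient_descent(noisy_grad, X_train, x0,
                              step=0.05, iters=300, k=10, q=2):
    """Projected gradient descent: each (inaccurate) model gradient is
    projected onto the local tangent space before the update, so the
    iterate moves only along the training-data manifold."""
    x = x0.astype(float).copy()
    for _ in range(iters):
        P = local_tangent_projector(X_train, x, k=k, q=q)
        x = x - step * (P @ noisy_grad(x))
    return x

if __name__ == "__main__":
    # Toy check: training data on the unit circle (a 1D manifold in 2D).
    # The objective ||x - (1, 0)||^2 has its constrained minimum at (1, 0),
    # but the supplied gradient carries a large spurious component (5 * x)
    # pointing off the manifold, mimicking a systematically wrong ML gradient.
    rng = np.random.default_rng(0)
    theta = rng.uniform(0.0, 2.0 * np.pi, size=500)
    X_train = np.stack([np.cos(theta), np.sin(theta)], axis=1)
    noisy_grad = lambda x: 2.0 * (x - np.array([1.0, 0.0])) + 5.0 * x
    x_min = denoised_gradient_descent(noisy_grad, X_train,
                                      x0=np.array([0.0, 1.0]), k=20, q=1)
    print(x_min)   # lands close to (1, 0); an unprojected descent would not
```

In the setting of the abstract, the training set would consist of electron densities (in some basis) and the noisy gradient would be the functional derivative of the ML kinetic energy model; projecting onto the locally reconstructed manifold suppresses the systematically inaccurate gradient components that point off the space of physical densities.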
