Journal: Structural and Multidisciplinary Optimization

The application of gradient-only optimization methods for problems discretized using non-constant methods



Abstract

We study the minimization of objective functions containing non-physical jump discontinuities. These discontinuities arise when (partial) differential equations are discretized using non-constant methods and the resulting numerical solutions are used in computing the objective function. Although the functions may become discontinuous, gradient information can be computed at every point: each point has an associated discretization for which (semi-)analytical sensitivities can be calculated. Rather than constructing global approximations from function value information alone to overcome the discontinuities, we propose to use only the gradient information. We elaborate on the modifications of classical gradient-based optimization algorithms required for gradient-only approaches, and we then present gradient-only optimization strategies using both BFGS and a new spherical quadratic approximation for sequential approximate optimization (SAO). We then use the BFGS and SAO algorithms to solve three problems of practical interest, both unconstrained and constrained.
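To make the gradient-only idea concrete, below is a minimal sketch in Python (NumPy) of a BFGS loop whose line search relies purely on the sign of the directional derivative, so step lengths are never chosen by comparing function values across a jump discontinuity. The function names, tolerances, and safeguards are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def gradient_only_line_search(grad, x, d, alpha0=1.0, growth=2.0,
                              tol=1e-10, max_bisect=60):
    # Directional derivative along the search direction d at step length alpha.
    def slope(alpha):
        return float(np.dot(grad(x + alpha * d), d))

    lo, hi = 0.0, alpha0
    # Grow the bracket until the directional derivative stops being negative.
    while slope(hi) < 0.0 and hi < 1e12:
        lo, hi = hi, growth * hi
    # Bisect on the *sign* of the directional derivative only; no function
    # values are compared, so jump discontinuities cannot mislead the search.
    for _ in range(max_bisect):
        mid = 0.5 * (lo + hi)
        if slope(mid) < 0.0:
            lo = mid
        else:
            hi = mid
        if (hi - lo) < tol * max(1.0, hi):
            break
    return 0.5 * (lo + hi)

def gradient_only_bfgs(grad, x0, max_iter=100, gtol=1e-6):
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                      # inverse Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < gtol:
            break
        d = -H @ g                     # quasi-Newton descent direction
        alpha = gradient_only_line_search(grad, x, d)
        s = alpha * d
        x_next = x + s
        g_next = grad(x_next)
        y = g_next - g
        sy = float(s @ y)
        if sy > 1e-12:                 # curvature safeguard before BFGS update
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_next, g_next
    return x
```

For a smooth objective the search above behaves like an exact line search; for a piecewise-smooth objective with step discontinuities it converges to a point where the directional derivative changes sign, which is the kind of solution a gradient-only formulation targets.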
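Similarly, a sequential approximate optimization step under a spherical quadratic approximation can be driven by gradients alone if the curvature of each subproblem is fitted from successive gradient differences. The sketch below shows one plausible unconstrained variant; the secant-style curvature fit and the parameter names are assumptions for exposition, not the paper's specific construction.

```python
import numpy as np

def sao_spherical_quadratic(grad, x0, c0=1.0, max_iter=200,
                            gtol=1e-6, c_min=1e-8):
    # Each subproblem approximates the objective by
    #   f_k(x) ~ f(x_k) + g_k.(x - x_k) + 0.5 * c_k * ||x - x_k||^2,
    # with the curvature c_k fitted from successive gradients only,
    # so the objective function itself is never evaluated.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    c = c0
    for _ in range(max_iter):
        if np.linalg.norm(g) < gtol:
            break
        x_next = x - g / max(c, c_min)        # minimizer of the spherical subproblem
        g_next = grad(x_next)
        s, y = x_next - x, g_next - g
        denom = float(s @ s)
        if denom > 0.0:
            c = max(float(y @ s) / denom, c_min)   # gradient-only curvature fit
        x, g = x_next, g_next
    return x
```

Each iteration minimizes the separable subproblem in closed form, x_{k+1} = x_k - g_k / c_k, so the method is cheap per step and, like the BFGS variant above, relies on gradient information alone.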
