
The q-Gradient Vector for Unconstrained Continuous Optimization Problems


Abstract

At the beginning of the twentieth century, Frank Hilton Jackson generalized the concept of the derivative in the q-calculus context and created the q-derivative, widely known as Jackson's derivative. In the q-derivative, the independent variable is multiplied by a parameter q, and in the limit q→1 the q-derivative reduces to the classical derivative. In this work we use the first-order partial q-derivatives of a function of n variables to define the q-gradient vector and take its negative as a new search direction for optimization methods. We thus present a q-version of the classical steepest descent method, called the q-steepest descent method, which reduces to the classical version whenever the parameter q equals 1. We applied the classical steepest descent method and the q-steepest descent method to a unimodal and a multimodal test function. The results show the strong performance of the q-steepest descent method; for the multimodal function it was able to escape from many local minima and reach the global minimum.
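For reference, Jackson's q-derivative and the resulting q-gradient vector sketched in the abstract can be written out as follows; the per-coordinate parameters q_i and the symbol \nabla_q follow standard q-calculus notation and are assumed here rather than quoted from the paper:

D_q f(x) = \frac{f(qx) - f(x)}{(q - 1)\, x}, \qquad x \neq 0,\; q \neq 1, \qquad \lim_{q \to 1} D_q f(x) = \frac{\mathrm{d}f}{\mathrm{d}x}(x),

and, for f : \mathbb{R}^n \to \mathbb{R},

\left[\nabla_q f(x)\right]_i = \frac{f(x_1, \ldots, q_i x_i, \ldots, x_n) - f(x)}{(q_i - 1)\, x_i}, \qquad i = 1, \ldots, n,

so that \nabla_q f(x) \to \nabla f(x) as every q_i \to 1, and -\nabla_q f(x) is the search direction used by the q-steepest descent method.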
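A minimal numerical sketch of the q-steepest descent update is given below, assuming a fixed parameter q and a constant step size alpha; the names q_gradient and q_steepest_descent, the forward-difference fallback, and the sphere test function are illustrative choices, not the authors' implementation (the paper may, for instance, vary q between iterations).

import numpy as np

def q_gradient(f, x, q):
    # i-th component: Jackson's q-derivative of f with respect to x_i,
    # (f(x_1,...,q_i*x_i,...,x_n) - f(x)) / ((q_i - 1) * x_i).
    # Falls back to a classical forward difference when the q-step vanishes
    # (q_i close to 1 or x_i = 0), matching the limit q -> 1.
    x = np.asarray(x, dtype=float)
    q = np.asarray(q, dtype=float)
    fx = f(x)
    grad = np.empty_like(x)
    for i in range(x.size):
        denom = (q[i] - 1.0) * x[i]
        if abs(denom) < 1e-12:
            xh = x.copy()
            xh[i] += 1e-8
            grad[i] = (f(xh) - fx) / 1e-8
        else:
            xq = x.copy()
            xq[i] = q[i] * x[i]
            grad[i] = (f(xq) - fx) / denom
    return grad

def q_steepest_descent(f, x0, q=0.9, alpha=0.1, n_iter=200):
    # Move along the negative q-gradient; with q = 1 the update
    # coincides with classical steepest descent.
    x = np.asarray(x0, dtype=float)
    qvec = np.full_like(x, q)
    for _ in range(n_iter):
        x = x - alpha * q_gradient(f, x, qvec)
    return x

# Illustrative run on the (unimodal) sphere function f(x) = sum(x_i^2).
sphere = lambda v: float(np.sum(v ** 2))
print(q_steepest_descent(sphere, [3.0, -2.0]))  # approaches the minimum at the origin

With q close to 1 the q-gradient stays close to the classical gradient, while values of q farther from 1 perturb the search direction; this perturbation is the mechanism the abstract credits for escaping local minima on the multimodal test function.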
