Brazilian Conference on Intelligent Systems

Evaluating Methods for Constant Optimization of Symbolic Regression Benchmark Problems


Abstract

Constant optimization in symbolic regression is an important task that has been addressed by several researchers. It has been demonstrated that continuous optimization techniques are adequate for finding good values for the constants by minimizing the prediction error. In this paper, we evaluate several continuous optimization methods that can be used to perform constant optimization in symbolic regression. We selected 14 well-known benchmark problems and tested how well diverse optimization methods find the expected constant values, assuming that the correct formula has already been found. The results show that Levenberg-Marquardt (LM) presented the highest success rate among the evaluated methods, followed by Powell's method and the Nelder-Mead simplex. However, two benchmark problems were not solved, and for two other problems LM was largely outperformed by the Nelder-Mead simplex in terms of success rate. We conclude that even when a symbolic regression technique finds the correct formula, constant optimization may still fail; since this can also happen during the search for a formula, it may guide the method towards the wrong solution. Furthermore, the efficiency of LM in finding high-quality solutions with only a few function evaluations could serve as inspiration for the development of better symbolic regression methods.
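The setup the abstract describes, fixing the structure of a candidate formula and tuning only its constants by minimizing the prediction error, can be sketched roughly as below. This is a minimal illustration, not the paper's implementation: the benchmark formula, data, and starting point are hypothetical, and SciPy's `least_squares` with `method="lm"` stands in for the Levenberg-Marquardt variant evaluated in the paper.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical candidate formula with a fixed structure and unknown constants c0, c1:
# y = c0 * sin(c1 * x). Only the constants are optimized; the structure is assumed
# to have already been found by the symbolic regression search.
def model(c, x):
    return c[0] * np.sin(c[1] * x)

def residuals(c, x, y):
    # Prediction error that the continuous optimizer minimizes (in least-squares sense)
    return model(c, x) - y

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 50)
true_c = np.array([2.5, 1.3])      # illustrative "expected" constant values
y = model(true_c, x)               # noiseless target data for this sketch

c0 = rng.normal(size=2)            # random starting point for the constants
fit = least_squares(residuals, c0, args=(x, y), method="lm")  # Levenberg-Marquardt
print(fit.x)                       # recovered constants
print(fit.nfev)                    # number of function evaluations used
```

The derivative-free alternatives mentioned in the abstract can be tried in the same setting, for example by passing the summed squared residuals to `scipy.optimize.minimize` with `method="Nelder-Mead"` or `method="Powell"` and comparing success rates and function-evaluation counts.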
