Evolutionary algorithms based on machine learning accelerate mathematical function optimization but not neural net evolution

Congress on Evolutionary Computation (CEC 2004)


Abstract

For a decade, the second author has been dreaming of and working towards building artificial brains that consist of tens of thousands of evolved neural net circuit modules, assembled according to the designs of human brain architects (BAs). The bottleneck with this approach is the slow evolution time of the modules (using software techniques on PCs). However, with Michalski's machine-learning-based evolutionary algorithms, such as LEM (learnable evolution model), the usual evolution time can be reduced by a factor of hundreds for certain categories of applications, e.g. mathematical function optimization (Michalski, 2000). The authors hoped that this breakthrough would allow neural net modules to be evolved far more quickly. Unfortunately, it appears that the LEM approach does not work well for the evolution of dynamic neural nets. This may be due to a combinatorial explosion of attribute-variable pairs arising during the machine-learning mode, which poses a problem when evolving dynamic signals.
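For context, LEM alternates a machine-learning mode, which induces a description of why the current best individuals outperform the worst, with an instantiation mode that generates new candidates satisfying that description. The sketch below is a deliberately simplified, hypothetical illustration of this loop on a continuous function-optimization toy problem (the sphere function): it replaces Michalski's AQ rule learning with a crude per-attribute interval induction, and all names and parameter values are illustrative rather than taken from the paper.

import random

def sphere(x):
    # Toy fitness: minimize the sphere function sum(xi^2).
    return sum(v * v for v in x)

def induce_intervals(high_group):
    # Machine-learning mode (simplified): describe the high-performing
    # group by a per-attribute interval, a crude stand-in for AQ rules.
    dims = len(high_group[0])
    return [(min(ind[d] for ind in high_group),
             max(ind[d] for ind in high_group)) for d in range(dims)]

def instantiate(intervals):
    # Instantiation mode: sample a new individual inside the learned region.
    return [random.uniform(lo, hi) for lo, hi in intervals]

def lem_style_search(dims=5, pop_size=30, generations=40, elite_frac=0.3, seed=0):
    random.seed(seed)
    pop = [[random.uniform(-5.0, 5.0) for _ in range(dims)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=sphere)                       # lower fitness is better
        n_elite = max(2, int(elite_frac * pop_size))
        high_group = pop[:n_elite]                 # H-group: best individuals
        intervals = induce_intervals(high_group)   # learn a description of the H-group
        # Keep the elite and refill the population from the learned region.
        pop = high_group + [instantiate(intervals)
                            for _ in range(pop_size - n_elite)]
    return min(pop, key=sphere)

if __name__ == "__main__":
    best = lem_style_search()
    print("best fitness:", sphere(best))

For a static function the attributes are simply the coordinates of each candidate, so the learned description stays compact; describing the time-varying signals and connection parameters of a dynamic neural net in such an attribute language is where the combinatorial explosion noted above arises.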
