Brazilian Symposium on Neural Networks

Global optimization methods for designing and training neural networks



Abstract

This paper presents results from two approaches to neural network optimization: one uses simulated annealing to optimize both architectures and weights, combined with backpropagation for fine-tuning, while the other uses tabu search for the same purpose. For an odor recognition task in an artificial nose, both approaches generate networks with good generalization performance (mean classification error of 1.68% for simulated annealing and 0.64% for tabu search) and low complexity (mean number of connections of 11.15 out of 36 for simulated annealing and 11.62 out of 36 for tabu search).
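The abstract describes jointly optimizing a network's architecture (which of the 36 connections exist) and its weights with simulated annealing. The sketch below illustrates that idea in minimal form, not the authors' actual implementation: a state holds a binary connection mask plus real-valued weights, moves either flip one connection or jitter one weight, and a toy cost function stands in for the validation error of the odor-recognition network while also penalizing complexity (active connections), mirroring the paper's twin goals of low error and few connections. All function names and parameters here are illustrative assumptions.

```python
import math
import random

def simulated_annealing(n_conn=36, steps=2000, t0=1.0, cooling=0.995, seed=0):
    """Minimal SA sketch over (connection mask, weights).

    Worse states are accepted with probability exp(-delta / T),
    and T decays geometrically. The cost function is a toy
    stand-in, NOT the paper's network-evaluation procedure.
    """
    rng = random.Random(seed)
    mask = [rng.random() < 0.5 for _ in range(n_conn)]
    weights = [rng.uniform(-1.0, 1.0) for _ in range(n_conn)]

    def cost(mask, weights):
        # Toy objective: an error surrogate over active connections
        # plus a penalty on the fraction of connections that are used.
        err = sum((w - 0.5) ** 2 for w, m in zip(weights, mask) if m)
        complexity = sum(mask) / len(mask)
        return err + 0.1 * complexity

    cur = best = cost(mask, weights)
    t = t0
    for _ in range(steps):
        i = rng.randrange(n_conn)
        # Candidate move: flip one connection or perturb one weight.
        if rng.random() < 0.5:
            mask[i] = not mask[i]
            undo = ("mask", i, None)
        else:
            old = weights[i]
            weights[i] += rng.gauss(0.0, 0.1)
            undo = ("weight", i, old)
        new = cost(mask, weights)
        if new < cur or rng.random() < math.exp(-(new - cur) / t):
            cur = new                      # accept (possibly worse) state
            best = min(best, cur)
        else:                              # reject: revert the move
            kind, j, old = undo
            if kind == "mask":
                mask[j] = not mask[j]
            else:
                weights[j] = old
        t *= cooling                       # geometric cooling schedule
    return best, sum(mask)
```

In the paper's pipeline, the accepted SA solution would then be fine-tuned with backpropagation on the surviving connections; here the returned mask size plays the role of the reported connection count.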
