IEEE Congress on Evolutionary Computation

Particle Swarm optimisation for Evolving Deep Neural Networks for Image Classification by Evolving and Stacking Transferable Blocks



Abstract

Deep Convolutional Neural Networks (CNNs) have been widely used in image classification tasks, but designing CNN architectures by hand is very complex, so Neural Architecture Search (NAS), which automatically searches for optimal CNN architectures, has attracted increasing research interest. However, the computational cost of NAS is often too high for real-world applications. In this paper, an efficient particle swarm optimisation method named EPSOCNN is proposed to evolve CNN architectures, inspired by the idea of transfer learning. EPSOCNN reduces the computational cost by restricting the search space to a single block and by using a small subset of the training set to evaluate candidate CNNs during the evolutionary process. Meanwhile, EPSOCNN maintains very competitive classification accuracy by stacking the evolved block multiple times when training on the whole dataset. The proposed EPSOCNN algorithm is evaluated on the CIFAR-10 dataset and compared with 13 peer competitors, including deep CNNs designed by hand, learned by reinforcement learning methods, and evolved by evolutionary computation approaches. It shows very promising results with regard to classification accuracy, the number of parameters, and computational cost. In addition, the transferable block evolved on CIFAR-10 is transferred to and evaluated on two other datasets, CIFAR-100 and SVHN. It shows promising results on both datasets, which demonstrates the transferability of the evolved block. All experiments were performed multiple times, and Student's t-test is used to compare the proposed method with the peer competitors from a statistical point of view.
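To illustrate the block-level search strategy the abstract describes, a minimal sketch in Python/NumPy follows. The particle encoding, hyperparameter bounds, PSO coefficients, and the evaluate_block stub are illustrative assumptions rather than details from the paper; in EPSOCNN the fitness of each candidate block would come from briefly training it on a small subset of CIFAR-10, and the best block found would then be stacked several times and retrained on the full dataset.

# Minimal sketch of a block-level PSO search in the spirit of EPSOCNN.
# Hyperparameter names, bounds, and the evaluate_block stub are assumptions
# made for illustration; they are not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Each particle encodes one block: [num_conv_layers, num_filters, kernel_size]
LOWER = np.array([1.0, 8.0, 1.0])
UPPER = np.array([6.0, 64.0, 5.0])
DIM = 3

def evaluate_block(position):
    """Placeholder fitness. In EPSOCNN this would build the encoded block,
    train it briefly on a small subset of CIFAR-10, and return validation
    accuracy. Here a dummy score is returned so the sketch runs stand-alone."""
    layers, filters, kernel = np.round(position).astype(int)
    return -((layers - 4) ** 2 + 0.01 * (filters - 32) ** 2 + (kernel - 3) ** 2)

def pso_search(num_particles=10, iterations=20, w=0.7, c1=1.5, c2=1.5):
    pos = rng.uniform(LOWER, UPPER, size=(num_particles, DIM))
    vel = np.zeros_like(pos)
    pbest_pos = pos.copy()
    pbest_val = np.array([evaluate_block(p) for p in pos])
    gbest_idx = int(np.argmax(pbest_val))
    gbest_pos, gbest_val = pbest_pos[gbest_idx].copy(), pbest_val[gbest_idx]

    for _ in range(iterations):
        r1 = rng.random((num_particles, DIM))
        r2 = rng.random((num_particles, DIM))
        # Standard PSO velocity and position update, clipped to the bounds.
        vel = w * vel + c1 * r1 * (pbest_pos - pos) + c2 * r2 * (gbest_pos - pos)
        pos = np.clip(pos + vel, LOWER, UPPER)
        vals = np.array([evaluate_block(p) for p in pos])
        improved = vals > pbest_val
        pbest_pos[improved], pbest_val[improved] = pos[improved], vals[improved]
        if pbest_val.max() > gbest_val:
            gbest_idx = int(np.argmax(pbest_val))
            gbest_pos, gbest_val = pbest_pos[gbest_idx].copy(), pbest_val[gbest_idx]
    return np.round(gbest_pos).astype(int), gbest_val

if __name__ == "__main__":
    best_block, best_fitness = pso_search()
    # After the search, the best block would be stacked multiple times
    # (with down-sampling in between) and retrained on the whole dataset.
    print("best block hyperparameters:", best_block, "fitness:", best_fitness)

Because each fitness evaluation only trains a candidate block on a small data subset, the dominant cost is the number of evaluations (particles times iterations), which is what keeps this kind of block-level search affordable compared with searching over full architectures.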
