IEEE Transactions on Circuits and Systems II: Express Briefs

Neural-Network Based Self-Initializing Algorithm for Multi-Parameter Optimization of High-Speed ADCs



Abstract

This brief proposes a new automatic model parameter selection approach for determining the optimal configuration of high-speed analog-to-digital converters (ADCs) using a combination of particle swarm optimization (PSO) and stochastic gradient descent (SGD). The proposed hybrid method first uses the PSO algorithm to search for the optimal neural-network configuration, with particles moving in a finite, coarsely quantized search space. Starting from the PSO estimate, the SGD algorithm then refines the solution toward the global optimum. The global search ability of PSO and the local search ability of SGD are thus combined to reach a solution close to the global optimum with reduced latency. Several experiments were conducted to optimize the non-linearities in Nyquist flash and pipeline ADC datasets, showing that neural networks trained by the PSO-SGD algorithm outperform random-search-based performance optimization. A comparative resource analysis against state-of-the-art methods is also presented, highlighting improved latency and performance at similar area and implementation complexity.
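
The two-stage search described in the abstract can be illustrated with a short sketch: a coarse PSO stage explores a quantized parameter space and hands its best estimate to a gradient-descent stage for local refinement. This is a minimal illustration rather than the brief's implementation; the toy cost function, the grid step used for coarse quantization, the PSO inertia and acceleration weights, the learning rate, and the central-difference gradient (standing in for SGD on a trained network's analytic gradients) are all assumptions.

import numpy as np

def pso_sgd_optimize(objective, bounds, n_particles=20, pso_iters=30,
                     grid=0.5, gd_iters=200, lr=0.01, seed=0):
    # Minimize `objective` over the box `bounds` (shape: dim x 2) with a coarse
    # PSO global search followed by gradient-descent refinement of the best
    # PSO estimate. All hyperparameters here are illustrative assumptions.
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    dim = len(lo)

    def quantize(p):
        # Coarse quantization of particle positions onto a finite grid.
        return np.clip(np.round(p / grid) * grid, lo, hi)

    # Stage 1: PSO over the coarsely quantized search space.
    pos = quantize(rng.uniform(lo, hi, size=(n_particles, dim)))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([objective(p) for p in pos])
    g_idx = np.argmin(pbest_val)
    gbest, gbest_val = pbest[g_idx].copy(), pbest_val[g_idx]
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration weights (assumed)
    for _ in range(pso_iters):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = quantize(pos + vel)
        vals = np.array([objective(p) for p in pos])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = pos[better], vals[better]
        if vals.min() < gbest_val:
            gbest, gbest_val = pos[vals.argmin()].copy(), vals.min()

    # Stage 2: gradient descent from the PSO estimate. A central-difference
    # gradient stands in for SGD on the gradients of a trained network's cost.
    x, eps = gbest.copy(), 1e-4
    for _ in range(gd_iters):
        grad = np.array([(objective(x + eps * e) - objective(x - eps * e)) / (2 * eps)
                         for e in np.eye(dim)])
        x = np.clip(x - lr * grad, lo, hi)
    return x, objective(x)

# Toy multimodal cost standing in for a measured ADC non-linearity metric (hypothetical).
if __name__ == "__main__":
    cost = lambda p: float(np.sum(p ** 2) + 2.0 * np.sum(np.cos(3.0 * p)))
    bounds = np.array([[-5.0, 5.0]] * 4)
    params, value = pso_sgd_optimize(cost, bounds)
    print("optimized parameters:", params, "cost:", value)

In this sketch, the coarse grid keeps the PSO stage in a small finite search space, while the unquantized refinement stage recovers the precision the grid gives up.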
