IEEE Transactions on Signal Processing

Transfer Learning in Adaptive Filters: The Nearest Instance Centroid-Estimation Kernel Least-Mean-Square Algorithm



Abstract

We propose a novel nearest-neighbors approach to organize and curb the growth of the radial basis function network in kernel adaptive filtering (KAF). The nearest-instance-centroid-estimation (NICE) kernel least-mean-square (KLMS) algorithm provides an appropriate time-space tradeoff with good performance. Its centers in the input/feature space are organized into quasi-orthogonal regions, greatly simplifying filter evaluation. Instead of using all centers to evaluate/update the function approximation at every new point, a linear search among the iteratively updated centroids determines the partial function to be used, naturally forming locally supported partial functionals. Under this framework, the partial functionals that compose the adaptive filter are quickly stored/retrieved based on the input, each corresponding to a specialized "spatial-band" subfilter. Filter evaluation becomes the update of one of the subfilters, creating a content addressable filter bank (CAFB). This CAFB is incrementally updated for new signal applications under mild constraints, always reusing the previously learned partial filter sums; this opens the door to transfer learning and significant efficiency gains in new data scenarios, avoiding the training from scratch that has been the norm since the invention of adaptive filtering. Using the energy conservation relation, we show a sufficient condition for mean-square convergence of the NICE-KLMS algorithm and establish upper and lower bounds on the steady-state excess mean-square error (EMSE). Simulations on chaotic time-series prediction demonstrate accuracy comparable to existing methods, but with much faster computation involving fewer input samples. Simulations on transfer learning using both synthetic and real-world data demonstrate that the NICE CAFB can leverage previously learned knowledge on a related task or domain.
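The core mechanism described above — a linear search among region centroids that selects which locally supported subfilter to evaluate and update, rather than touching every center — can be sketched in a few lines. This is a simplified illustration under assumed details, not the paper's exact algorithm: the distance threshold for opening a new region, the running-mean centroid update, and the parameter names (`eta`, `sigma`, `dist_thresh`) are choices made for the sketch.

```python
import numpy as np

def gaussian_kernel(x, c, sigma):
    # Gaussian (RBF) kernel between input x and center c
    return np.exp(-np.sum((x - c) ** 2) / (2 * sigma ** 2))

class NiceKlmsSketch:
    """Illustrative NICE-KLMS-style filter: centers are grouped into
    regions around centroids; each input is evaluated/updated using
    only the nearest region's centers (a "spatial-band" subfilter)."""

    def __init__(self, eta=0.5, sigma=0.5, dist_thresh=1.0):
        self.eta = eta                    # KLMS step size (assumed value)
        self.sigma = sigma                # kernel bandwidth (assumed value)
        self.dist_thresh = dist_thresh    # threshold for opening a new region
        self.centroids = []               # one centroid per region
        self.regions = []                 # per region: (centers, coefficients)

    def _nearest_region(self, x):
        # Linear search among centroids only, not among all centers
        dists = [np.linalg.norm(x - c) for c in self.centroids]
        i = int(np.argmin(dists))
        return i, dists[i]

    def _predict_region(self, x, i):
        # Evaluate only the selected subfilter's kernel expansion
        centers, coeffs = self.regions[i]
        return sum(a * gaussian_kernel(x, c, self.sigma)
                   for a, c in zip(coeffs, centers))

    def update(self, x, d):
        """One online step: returns the a-priori prediction error."""
        x = np.asarray(x, dtype=float)
        i = None
        if self.centroids:
            i, dist = self._nearest_region(x)
            if dist > self.dist_thresh:
                i = None  # too far from every centroid: open a new region
        if i is None:
            # New region: the input becomes its centroid and first center
            self.centroids.append(x.copy())
            self.regions.append(([x.copy()], [self.eta * d]))
            return d
        err = d - self._predict_region(x, i)
        centers, coeffs = self.regions[i]
        centers.append(x.copy())              # standard KLMS growth step,
        coeffs.append(self.eta * err)         # confined to one region
        # Iteratively update the centroid as the mean of its centers
        self.centroids[i] = np.mean(centers, axis=0)
        return err
```

Because evaluation touches only one region, cost per sample scales with the number of centroids plus the size of the selected subfilter instead of the total center count, which is the time-space tradeoff the abstract refers to.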
