Signal Processing, IEEE Transactions on

Active Learning and Basis Selection for Kernel-Based Linear Models: A Bayesian Perspective



Abstract

We develop an active learning algorithm for kernel-based linear regression and classification. The proposed greedy algorithm employs a minimum-entropy criterion derived using a Bayesian interpretation of ridge regression. We assume access to a matrix $\mathbf{\Phi}\in\mathbb{R}^{N\times N}$, for which the $(i,j)$th element is defined by the kernel function $K(\gamma_i,\gamma_j)\in\mathbb{R}$, with the observed data $\gamma_i\in\mathbb{R}^d$. We seek a model $\mathcal{M}:\gamma_i\rightarrow y_i$, where $y_i$ is a real-valued response or integer-valued label to which we do not have access a priori. To achieve this goal, a submatrix $\mathbf{\Phi}_{I_l,I_b}\in\mathbb{R}^{n\times m}$ is sought that corresponds to the intersection of $n$ rows and $m$ columns of $\mathbf{\Phi}$, indexed by the sets $I_l$ and $I_b$, respectively. Typically $m\ll N$ and $n\ll N$. We have two objectives: (i) determine the $m$ columns of $\mathbf{\Phi}$, indexed by the set $I_b$, that are the most informative for building a linear model $\mathcal{M}:[1\;\mathbf{\Phi}_{i,I_b}]^T\rightarrow y_i$, without any knowledge of $\{y_i\}_{i=1}^N$; and (ii) using active learning, sequentially determine which subset of $n$ elements of $\{y_i\}_{i=1}^N$ should be acquired. Both stopping values, $\vert I_b\vert = m$ and $\vert I_l\vert = n$, …
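To make the selection criterion concrete, the following is a minimal Python sketch, assuming an RBF kernel and the standard Gaussian (Bayesian ridge) posterior over the weights. The helper names (`rbf_kernel`, `greedy_active_selection`) and the parameters `noise_var` and `prior_prec` are illustrative assumptions rather than the paper's implementation; the sketch only shows how a minimum-entropy greedy rule reduces, via the matrix determinant lemma, to repeatedly acquiring the label whose kernel row has the largest predictive variance under the current posterior.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gram matrix with entries K(x_i, y_j) = exp(-gamma * ||x_i - y_j||^2)."""
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * d2)

def greedy_active_selection(Phi_b, n_queries, noise_var=0.1, prior_prec=1.0):
    """Greedily pick rows of Phi_b (samples whose labels to acquire) so as to
    minimize the differential entropy of the Gaussian posterior on the weights.

    Phi_b : (N, m) kernel matrix restricted to a chosen basis set I_b.
    Returns the ordered list of selected row indices (the set I_l).
    """
    N, m = Phi_b.shape
    # Posterior precision of the ridge weights before any labels are observed.
    precision = prior_prec * np.eye(m)
    selected, remaining = [], list(range(N))
    for _ in range(n_queries):
        cov = np.linalg.inv(precision)
        # Acquiring row i multiplies det(precision) by (1 + phi_i^T cov phi_i / noise_var),
        # so the entropy-minimizing choice maximizes the predictive variance phi_i^T cov phi_i.
        scores = [Phi_b[i] @ cov @ Phi_b[i] for i in remaining]
        best = remaining[int(np.argmax(scores))]
        selected.append(best)
        remaining.remove(best)
        # Rank-one update of the precision with the newly acquired row.
        precision += np.outer(Phi_b[best], Phi_b[best]) / noise_var
    return selected

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 2))
    Phi = rbf_kernel(X, X, gamma=0.5)
    I_b = list(range(20))                 # placeholder basis columns (hypothetical I_b)
    I_l = greedy_active_selection(Phi[:, I_b], n_queries=10)
    print("rows to label first:", I_l)
```

Note that the selection scores depend only on the kernel matrix and not on the labels, which is consistent with the abstract's claim that informative columns and rows can be chosen before any responses are observed.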


