The use of Radial Basis Functions in supervised learning is well motivated by approximation theory. Computational issues have led to approximations of this scheme, losing much of the mathematical foundation in the process. We show here that basis pursuit denoising is a principled alternative to classical RBF, one that leads to sparse expansions. This alternative is local in the sense that complexity is tuned locally. A further step in this direction is made by adapting the locality parameter of each basis function. The algorithm proposed to solve this problem is simple, and the resulting solution, although extremely flexible, is governed by a single hyperparameter.
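As a minimal sketch of the sparse-expansion idea: fit a function by solving a basis pursuit denoising problem over a dictionary of Gaussian RBFs centered at the training points, so that only a few basis functions receive nonzero weight. The toy data, the fixed width `sigma`, the regularization weight `lam`, and the choice of ISTA as the l1 solver are all illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def gaussian_dictionary(X, centers, sigma):
    # Phi[i, j] = exp(-||x_i - c_j||^2 / (2 sigma^2)): one Gaussian RBF per center.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def bpdn_ista(Phi, y, lam, n_iter=2000):
    # Basis pursuit denoising: min_w 0.5 ||Phi w - y||^2 + lam ||w||_1,
    # solved here with plain ISTA (gradient step + soft-thresholding).
    L = np.linalg.norm(Phi, 2) ** 2          # Lipschitz constant of the gradient
    w = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        z = w - Phi.T @ (Phi @ w - y) / L    # gradient step on the quadratic term
        w = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # shrink toward zero
    return w

# Toy 1-D regression: noisy sine, one RBF candidate per training point.
rng = np.random.default_rng(0)
X = np.linspace(-3.0, 3.0, 60)[:, None]
y = np.sin(2.0 * X[:, 0]) + 0.1 * rng.standard_normal(60)
Phi = gaussian_dictionary(X, X, sigma=0.5)
w = bpdn_ista(Phi, y, lam=0.05)
active = int((np.abs(w) > 1e-8).sum())
print(f"{active} of {w.size} basis functions active")
```

The l1 penalty zeroes out most coefficients, so the learned expansion uses only a small subset of the 60 candidate centers, in contrast to classical RBF interpolation, which assigns a weight to every training point.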