Abstract

Error estimates for kernel interpolation in Reproducing Kernel Hilbert Spaces usually assume rather restrictive conditions on the shape of the domain, especially in the case of infinitely smooth kernels such as the popular Gaussian kernel. In this paper we prove that convergence results (in the number of interpolation points) for kernel interpolation can be obtained on arbitrary domains $\varOmega \subset \mathbb{R}^{d}$, thus allowing for non-Lipschitz domains including, e.g., domains with cusps and irregular boundaries. In particular, we show that, when passing to a smaller domain $\tilde{\varOmega} \subset \varOmega \subset \mathbb{R}^{d}$, the convergence rate does not deteriorate, i.e., the convergence rates are stable under restriction to a subset. We obtain this result by leveraging an analysis of greedy kernel algorithms. The impact of this result is illustrated with examples of kernels of finite as well as infinite smoothness. A comparison with approximation in Sobolev spaces is drawn, where the shape of the domain $\varOmega$ does affect the approximation properties. Numerical experiments illustrate and confirm the analysis.
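The following Python sketch is not part of the article; it only illustrates the setting the abstract describes: Gaussian kernel interpolation on a non-Lipschitz (cusp-shaped) domain, with interpolation points selected by a P-greedy rule as one example of a greedy kernel algorithm. The domain, shape parameter, test function and point counts are illustrative assumptions, not the authors' experimental setup.

```python
# Minimal sketch (assumptions: domain, eps, f, point counts are illustrative).
import numpy as np

eps = 2.0  # assumed Gaussian shape parameter

def gauss_kernel(X, Y):
    # K(x, y) = exp(-eps^2 * ||x - y||^2)
    d2 = np.sum((X[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    return np.exp(-eps ** 2 * d2)

def f(X):
    # smooth test function on R^2 (illustrative choice)
    return np.sin(3 * X[:, 0]) * np.cos(2 * X[:, 1])

# Candidate points: sample a box and keep only points in the cusp-shaped domain
# Omega = {(x, y) : 0 <= x <= 1, |y| <= x^2}, which is non-Lipschitz at the origin.
rng = np.random.default_rng(0)
C = rng.uniform([0.0, -1.0], [1.0, 1.0], size=(5000, 2))
C = C[np.abs(C[:, 1]) <= C[:, 0] ** 2]

def p_greedy(C, n):
    # P-greedy selection: repeatedly add the candidate maximizing the power function.
    idx = [0]
    for _ in range(n - 1):
        X = C[idx]
        K = gauss_kernel(X, X) + 1e-12 * np.eye(len(idx))  # tiny jitter for conditioning
        k = gauss_kernel(C, X)
        # squared power function: K(x, x) - k(x)^T K^{-1} k(x), with K(x, x) = 1
        p2 = 1.0 - np.einsum('ij,ij->i', k, np.linalg.solve(K, k.T).T)
        idx.append(int(np.argmax(p2)))
    return C[idx]

for n in (10, 20, 40, 80):
    X = p_greedy(C, n)
    K = gauss_kernel(X, X) + 1e-12 * np.eye(n)
    alpha = np.linalg.solve(K, f(X))                 # interpolation coefficients
    err = np.max(np.abs(gauss_kernel(C, X) @ alpha - f(C)))
    print(f"n = {n:3d}   max error on sampled Omega ≈ {err:.2e}")
```

Printing the maximum error over the sampled domain for increasing n gives a rough empirical view of the convergence in the number of interpolation points that the abstract refers to; it is not a reproduction of the paper's numerical experiments.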
