Seminar | Mathematics and Computer Science

Kernel Methods are Competitive for Operator Learning

LANS Seminar

Abstract: We introduce a kernel-based framework for learning operators between Banach spaces. We show that even with simple kernels, our approach is competitive in terms of cost-accuracy trade-off and either matches or beats the performance of neural-network methods on a majority of PDE-based benchmarks. Additionally, our framework offers several advantages inherited from kernel methods: simplicity, interpretability, convergence guarantees, a priori error estimates, and Bayesian uncertainty quantification (UQ). It is, therefore, a natural benchmark for operator learning problems.
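The abstract does not spell out the framework's details, but the general idea of kernel methods for operator learning can be illustrated with a minimal sketch: represent each input and output function by its values on a grid, then fit kernel ridge regression from discretized inputs to discretized outputs. Everything below — the `fit_operator` helper, the linear (discretized L2 inner product) kernel, and the toy antiderivative benchmark — is a hypothetical example for illustration, not the speaker's implementation.

```python
import numpy as np

def linear_kernel(U, V, dx):
    # Discretized L2 inner products <u, v> ≈ Σ_i u(x_i) v(x_i) dx,
    # one entry per pair of functions (rows of U and V).
    return (U @ V.T) * dx

def fit_operator(U_train, V_train, dx, reg=1e-8):
    # Kernel ridge regression from discretized input functions (rows of
    # U_train) to discretized output functions (rows of V_train).
    # Illustrative sketch only, not the method from the talk.
    K = linear_kernel(U_train, U_train, dx)
    alpha = np.linalg.solve(K + reg * np.eye(len(U_train)), V_train)
    return lambda U_new: linear_kernel(U_new, U_train, dx) @ alpha

# Toy benchmark (assumed for illustration): learn the antiderivative
# operator u ↦ ∫_0^x u(s) ds from input/output pairs on a grid.
x = np.linspace(0.0, 1.0, 50)
dx = x[1] - x[0]
rng = np.random.default_rng(0)
C = rng.normal(size=(100, 3))  # random coefficients for each sample function
U = C[:, [0]] + C[:, [1]] * np.sin(2 * np.pi * x) + C[:, [2]] * np.cos(2 * np.pi * x)
V = np.cumsum(U, axis=1) * dx  # discrete antiderivative of each input

predict = fit_operator(U[:80], V[:80], dx)       # train on 80 pairs
rel_err = np.linalg.norm(predict(U[80:]) - V[80:]) / np.linalg.norm(V[80:])
```

Because the toy operator is linear and a linear kernel is used, the held-out relative error is essentially at the regularization level; richer kernels (e.g. Gaussian) would be needed for nonlinear operators, and a Gaussian-process view of the same regression yields the Bayesian UQ mentioned in the abstract.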

Bio: Matthieu Darcy is a second-year Ph.D. student in the Computing and Mathematical Sciences Department at Caltech. His research focuses on scientific machine learning, particularly on the application of kernel methods to partial differential equations, stochastic dynamical systems, and operator learning. Before joining Caltech, he obtained an MSc from Imperial College and a Master’s from ENS Paris-Saclay.