Moderator: Shuo Wang
Speakers: Joseph Mathews and Kevin Li
Presenter: Joseph Mathews
Title: Finite Sample Complexity of Sequential Monte Carlo Estimators on Multimodal Target Distributions
Abstract: In this talk, I will discuss finite sample complexity bounds for sequential Monte Carlo algorithms that require only local mixing times of the associated Markov kernels. These bounds are particularly useful when the target distribution is multimodal and global mixing of the Markov kernels is slow; in such cases, the bounds establish the benefits of sequential Monte Carlo over the corresponding Markov chain Monte Carlo estimator.
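To illustrate the setting the abstract describes, here is a minimal sketch of a tempered sequential Monte Carlo sampler on a bimodal target. This is a generic illustration, not the specific estimators or bounds from the talk: the target (an equal mixture of two well-separated Gaussians), the annealing schedule, and all tuning parameters are assumptions chosen for demonstration. The point is that a random-walk kernel mixes quickly *within* each mode but crosses between modes very slowly, while the SMC reweight/resample/move cycle populates both modes.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Unnormalized bimodal target: equal mixture of N(-3, 0.5^2) and N(3, 0.5^2).
    # A random-walk Metropolis chain mixes fast locally but rarely jumps modes.
    return np.logaddexp(-0.5 * ((x + 3.0) / 0.5) ** 2,
                        -0.5 * ((x - 3.0) / 0.5) ** 2)

def smc_sampler(n_particles=2000, n_temps=20, n_mcmc=5, step=0.5):
    """Tempered SMC: anneal from a wide N(0, 5^2) reference to the target."""
    betas = np.linspace(0.0, 1.0, n_temps + 1)
    log_ref = lambda x: -0.5 * (x / 5.0) ** 2
    x = rng.normal(0.0, 5.0, size=n_particles)  # draw from the reference
    for b_prev, b in zip(betas[:-1], betas[1:]):
        # Reweight: incremental importance weight for the geometric bridge
        # pi_b(x) ∝ ref(x)^(1-b) * target(x)^b.
        logw = (b - b_prev) * (log_target(x) - log_ref(x))
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # Resample (multinomial) to equalize weights.
        x = x[rng.choice(n_particles, size=n_particles, p=w)]
        # Move: a few random-walk Metropolis steps targeting pi_b.
        # Only LOCAL mixing of this kernel is needed within each mode.
        lp = lambda y: (1.0 - b) * log_ref(y) + b * log_target(y)
        for _ in range(n_mcmc):
            prop = x + step * rng.normal(size=n_particles)
            accept = np.log(rng.uniform(size=n_particles)) < lp(prop) - lp(x)
            x = np.where(accept, prop, x)
    return x

particles = smc_sampler()
# With a symmetric mixture, roughly half the particles land in each mode.
frac_left = float(np.mean(particles < 0))
```

Because the reference distribution is wide enough to cover both modes, the annealing path never forces particles through the low-probability region between them; a single Metropolis chain started in one mode, by contrast, would need exponentially many steps to discover the other.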
Presenter: Kevin Li
Title: Trigonometric Quadrature Fourier Features for Scalable Gaussian Process Regression
Abstract: Fourier feature approximations have been successfully applied in the literature for scalable Gaussian Process (GP) regression. In particular, Quadrature Fourier Features (QFF) derived from Gaussian quadrature rules have gained popularity in recent years due to their improved approximation accuracy and better-calibrated uncertainty estimates compared to Random Fourier Feature (RFF) methods. However, a key limitation of QFF is that its performance can suffer from well-known pathologies related to highly oscillatory quadrature, resulting in mediocre approximation with a limited number of features. We address this critical issue via a new Trigonometric Quadrature Fourier Feature (TQFF) method, which uses a novel non-Gaussian quadrature rule specifically tailored for the desired Fourier transform. We derive an exact quadrature rule for TQFF, along with kernel approximation error bounds for the resulting feature map. We then demonstrate the improved performance of our method over RFF and Gaussian QFF in a suite of numerical experiments and applications, and show that TQFF enjoys accurate GP approximations over a broad range of length-scales using fewer features.
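The TQFF quadrature rule itself is specific to the paper, but the Fourier-feature framework it improves on is standard. As background, here is a sketch of the classical Random Fourier Features baseline (Rahimi & Recht) for the RBF kernel: frequencies are sampled from the kernel's spectral density, and paired cosine/sine features yield an unbiased Monte Carlo estimate of the kernel. All sizes and the test data below are illustrative assumptions; quadrature-based methods like QFF and TQFF replace the random frequencies with deterministic nodes and weights.

```python
import numpy as np

rng = np.random.default_rng(1)

def rbf_kernel(X, Y, lengthscale=1.0):
    # Exact squared-exponential kernel k(x, y) = exp(-||x - y||^2 / (2 l^2)).
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def rff_features(X, n_features=2000, lengthscale=1.0):
    """Random Fourier Features: Monte Carlo approximation of the RBF kernel."""
    d = X.shape[1]
    # By Bochner's theorem, the RBF spectral density is Gaussian with
    # standard deviation 1/lengthscale; sample frequencies from it.
    W = rng.normal(0.0, 1.0 / lengthscale, size=(d, n_features))
    proj = X @ W
    # Paired cos/sin features: phi(x)^T phi(y) averages cos(w^T (x - y)),
    # an unbiased estimate of k(x, y).
    return np.hstack([np.cos(proj), np.sin(proj)]) / np.sqrt(n_features)

X = rng.normal(size=(50, 2))
K_exact = rbf_kernel(X, X)
Phi = rff_features(X)
K_approx = Phi @ Phi.T            # low-rank surrogate for K_exact
max_err = float(np.abs(K_exact - K_approx).max())
```

In GP regression, working with the feature matrix `Phi` reduces the O(n^3) cost of inverting the exact kernel matrix to O(n m^2) for m features; the Monte Carlo error above decays only as O(1/sqrt(m)), which is the slack that quadrature-based feature constructions aim to close.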