Anna Korba, UCL/ENSAE, 2 April 2021

A Non-Asymptotic Analysis of Stein Variational Gradient Descent
Tuesday, 30 March 2021
by Alain Celisse

We study the Stein Variational Gradient Descent (SVGD) algorithm, which optimises a set of particles to approximate a target probability distribution π∝exp(−V) on ℝ^d. In the population limit, SVGD performs gradient descent in the space of probability distributions on the Kullback-Leibler (KL) divergence with respect to π, where the gradient is smoothed through a kernel integral operator. In this paper, we provide a novel finite-time analysis of the SVGD algorithm.
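As a rough illustration (not the authors' code), the particle update behind this description can be sketched in NumPy. The sketch assumes an RBF kernel of bandwidth h; the function names, the bandwidth, and the standard Gaussian target in the usage example are illustrative choices, not taken from the paper.

```python
import numpy as np

def svgd_step(X, grad_log_pi, step=0.1, h=1.0):
    """One SVGD update with an RBF kernel (illustrative sketch).

    phi(x_i) = (1/n) sum_j [ k(x_j, x_i) grad log pi(x_j) + grad_{x_j} k(x_j, x_i) ]
    """
    n = X.shape[0]
    diff = X[:, None, :] - X[None, :, :]                     # diff[i, j] = x_i - x_j
    K = np.exp(-np.sum(diff ** 2, axis=-1) / (2 * h ** 2))   # kernel matrix k(x_i, x_j)
    drift = K @ grad_log_pi(X)                               # kernel-weighted scores
    # grad_{x_j} k(x_j, x_i) = k(x_j, x_i) (x_i - x_j) / h^2 : repulsive term
    repulsion = np.sum(K[:, :, None] * diff, axis=1) / h ** 2
    return X + step * (drift + repulsion) / n

# Usage (hypothetical target): pi = N(0, I) in 2d, so grad log pi(x) = -x.
rng = np.random.default_rng(0)
X = rng.normal(loc=3.0, scale=0.5, size=(50, 2))  # particles start far from the mode
for _ in range(300):
    X = svgd_step(X, lambda X: -X)
# After the iterations, the particle cloud has drifted towards the target mean 0
# while the repulsive term keeps the particles spread out.
```

The first term transports particles towards high-density regions of π; the second, depending only on kernel gradients, prevents them from collapsing onto the mode.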
We provide a descent lemma establishing that the algorithm decreases the objective at each iteration, as well as rates of convergence in terms of the Kernel Stein Discrepancy (KSD).
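For intuition, the Kernel Stein Discrepancy admits a closed form for an RBF kernel; a minimal V-statistic estimator of the squared KSD can be sketched as follows. This is a generic textbook estimator, not code from the paper, and the bandwidth and example target are illustrative assumptions.

```python
import numpy as np

def ksd_squared(X, grad_log_pi, h=1.0):
    """V-statistic estimate of the squared KSD (illustrative sketch).

    Stein kernel of the RBF kernel k(x, y) = exp(-||x - y||^2 / (2 h^2)):
      k_pi(x, y) = s(x).s(y) k + s(x).grad_y k + s(y).grad_x k + tr(grad_x grad_y k),
    where s = grad log pi is the score of the target.
    """
    n, d = X.shape
    diff = X[:, None, :] - X[None, :, :]            # diff[i, j] = x_i - x_j
    sq = np.sum(diff ** 2, axis=-1)
    K = np.exp(-sq / (2 * h ** 2))
    S = grad_log_pi(X)                              # scores s(x_i), shape (n, d)
    t1 = K * (S @ S.T)                                    # s(x_i).s(x_j) k
    t2 = np.einsum('id,ijd->ij', S, diff) * K / h ** 2    # s(x_i).grad_y k
    t3 = -np.einsum('jd,ijd->ij', S, diff) * K / h ** 2   # s(x_j).grad_x k
    t4 = K * (d / h ** 2 - sq / h ** 4)                   # tr(grad_x grad_y k)
    return float((t1 + t2 + t3 + t4).mean())

# Usage (hypothetical target): samples from pi = N(0, I) should score lower
# than the same samples shifted away from the target.
rng = np.random.default_rng(1)
X_good = rng.normal(size=(200, 2))
X_bad = X_good + 3.0
score = lambda X: -X                              # grad log pi for N(0, I)
low, high = ksd_squared(X_good, score), ksd_squared(X_bad, score)
```

The KSD vanishes (under suitable conditions on the kernel) only when the particle distribution matches π, which is what makes it a natural convergence metric here.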
We also provide a convergence result for the finite-particle system (corresponding to the practical implementation of SVGD) towards its population version.