Thomas Guilmeau : Solving divergence-minimization problems for variational inference and adaptive importance sampling


Speaker
Thomas Guilmeau
Speaker's institution
INRIA, UGA
Date and time of the talk
12-02-2026, 15:00
Location
Seminar room
Abstract

Many computational methods in Bayesian statistics aim to construct parametric probability distributions with properties of interest; variational inference and adaptive importance sampling are two examples. The construction of such distributions can often be formulated as the minimization of a statistical divergence, such as the Kullback-Leibler divergence, over a family of approximating distributions. Depending on the choice of statistical divergence and of approximating family, the resulting divergence-minimization problem may promote solutions with different features and may be more or less difficult to solve. For instance, some problems are convex and promote mass-covering approximations but may induce hard-to-approximate gradients, while others are non-convex, promote mode-seeking approximating distributions, and have easy-to-approximate gradients.
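As a concrete instance of this trade-off (a standard textbook example, not necessarily the exact setting of the talk), consider the two orientations of the Kullback-Leibler divergence over a parametric family {q_theta}:

```latex
% Forward (inclusive) KL: mass-covering; convex in the natural parameters
% of an exponential family, but its gradient requires expectations under
% the intractable target pi.
\min_{\theta \in \Theta} \; \mathrm{KL}(\pi \,\|\, q_\theta)
  = \int \pi(x) \, \log \frac{\pi(x)}{q_\theta(x)} \, \mathrm{d}x

% Reverse (exclusive) KL: mode-seeking and generally non-convex in theta,
% but its gradient can be estimated from samples drawn from q_theta itself.
\min_{\theta \in \Theta} \; \mathrm{KL}(q_\theta \,\|\, \pi)
  = \int q_\theta(x) \, \log \frac{q_\theta(x)}{\pi(x)} \, \mathrm{d}x
```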

In this talk, I will discuss considerations related to the choice of statistical divergence and approximating distributions, and present recent results and algorithms for solving particular classes of divergence-minimization problems. These results leverage tools from stochastic optimization and convex analysis that capture the geometric features of divergence-minimization problems.
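As an illustrative sketch only (assuming a one-dimensional Gaussian mixture target and a Gaussian approximating family, neither of which is taken from the talk), the following Python code minimizes the reverse Kullback-Leibler divergence KL(q_theta || pi) by stochastic gradient descent with reparameterized samples:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical unnormalized target: two-component Gaussian mixture.
MEANS, STDS, WEIGHTS = np.array([-2.0, 3.0]), np.array([0.6, 1.0]), np.array([0.3, 0.7])

def score_target(x):
    # d/dx log pi(x); only the score of the unnormalized target is needed,
    # since the normalizing constant cancels in the reverse-KL gradient.
    comps = WEIGHTS * np.exp(-0.5 * ((x[:, None] - MEANS) / STDS) ** 2) / STDS
    d_comps = comps * (-(x[:, None] - MEANS) / STDS ** 2)
    return d_comps.sum(axis=1) / comps.sum(axis=1)

# Variational family: q_theta = N(m, exp(s)^2), with theta = (m, s).
m, s = 0.0, 0.0
lr, n_samples = 0.05, 64

for step in range(2000):
    eps = rng.standard_normal(n_samples)
    x = m + np.exp(s) * eps                  # reparameterized samples x ~ q_theta
    sc = score_target(x)
    grad_m = -sc.mean()                      # Monte Carlo d/dm KL(q_theta || pi)
    grad_s = -1.0 - (sc * np.exp(s) * eps).mean()  # d/ds; entropy term contributes -1
    m -= lr * grad_m
    s -= lr * grad_s

print(f"mode-seeking fit: mean={m:.2f}, std={np.exp(s):.2f}")
```

The fitted Gaussian typically collapses onto a single mode of the mixture, illustrating the mode-seeking behavior mentioned in the abstract; the gradients, in contrast, are easy to approximate because they only involve samples from q_theta.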
