64th ISI World Statistics Congress - Ottawa, Canada

MixFlows: Principled Variational Bayesian Inference via Approximately Measure-Preserving Maps

Abstract

This talk will introduce mixed variational flows (MixFlows), a new variational family for Bayesian inference that consists of a mixture
of pushforwards of an initial reference distribution under repeated applications of a map. Like most variational families, MixFlows enable efficient i.i.d. sampling,
density evaluation, and unbiased ELBO estimation. But unlike other families, MixFlows come with MCMC-like convergence guarantees; crucially, these guarantees
hold without the need to solve any nonconvex optimization problem. In particular, we show that when the flow map is ergodic and measure-preserving, MixFlow
distributions converge to the target distribution. We also provide bounds on the accumulation of error in practical implementations where the flow map is approximated.
Finally, we provide an implementation of MixFlows based on uncorrected discretized Hamiltonian dynamics combined with deterministic momentum refreshment. Simulated
and real data experiments show that MixFlows can provide more reliable posterior approximations than several black-box normalizing flows, as well as samples of comparable quality
to those obtained from state-of-the-art MCMC methods.
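The core construction described above can be sketched concretely. A MixFlow approximation is the uniform mixture of pushforwards q_N = (1/N) Σ_{n=0}^{N-1} T^n_# q_0: sampling draws n uniformly and applies the map T to a reference draw n times, while density evaluation averages the reference density over inverse iterates (with unit Jacobian when T is volume-preserving). The sketch below is illustrative, not the talk's implementation: it substitutes a simple pair of alternating shear maps (each with unit Jacobian, hence volume-preserving) for the uncorrected Hamiltonian dynamics with deterministic momentum refreshment, and the shear parameters `A`, `B` are arbitrary choices.

```python
import numpy as np

# Hypothetical volume-preserving flow map: two alternating shears.
# Each shear has unit Jacobian, so the composition preserves Lebesgue
# measure -- an illustrative stand-in for the discretized Hamiltonian
# map described in the talk. A and B are arbitrary illustration values.
A, B = 0.9, 0.7

def T(z):
    # Forward map: shear x by a function of y, then y by a function of x.
    x, y = z
    x = x + A * np.sin(y)
    y = y + B * np.sin(x)
    return np.array([x, y])

def T_inv(z):
    # Exact inverse: undo the shears in reverse order.
    x, y = z
    y = y - B * np.sin(x)
    x = x - A * np.sin(y)
    return np.array([x, y])

def log_q0(z):
    # Log-density of the 2D standard normal reference distribution.
    return -0.5 * np.sum(z**2) - np.log(2 * np.pi)

def mixflow_sample(N, rng):
    # Sample from q_N: draw n ~ Uniform{0,...,N-1}, push a reference
    # draw forward through n applications of T.
    n = rng.integers(N)
    z = rng.standard_normal(2)
    for _ in range(n):
        z = T(z)
    return z

def mixflow_log_density(z, N):
    # log q_N(z) = log (1/N) sum_{n=0}^{N-1} q0(T^{-n} z).
    # The Jacobian correction is zero because T is volume-preserving.
    logs = []
    w = np.array(z, dtype=float)
    for _ in range(N):
        logs.append(log_q0(w))
        w = T_inv(w)
    return np.logaddexp.reduce(logs) - np.log(N)

rng = np.random.default_rng(0)
z = mixflow_sample(N=10, rng=rng)
print("sample:", z, "log-density:", mixflow_log_density(z, N=10))
```

Because both sampling and density evaluation are exact here, the ELBO can be estimated without bias by averaging log-target minus `mixflow_log_density` over i.i.d. draws, which is the property the abstract highlights.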