Generating reversiblizations of non-reversible Markov chains via information geometry, generalized mean and balancing function
64th ISI World Statistics Congress - Ottawa, Canada
Format: IPS Abstract
Keywords: bayesian-inference, markov-chain, markov-process
Session: IPS 170 - Advanced Bayesian Computation
Monday 17 July 10 a.m. - noon (Canada/Eastern)
Given a target distribution $\pi$ and a non-reversible Markov infinitesimal generator $L$ with respect to $\pi$ on a finite state space, in this paper we develop three systematic and interrelated approaches to generating possibly new reversiblizations from $L$. The first approach hinges on an information-geometric perspective of reversiblizations, in which we view reversiblizations as information projections onto the space of $\pi$-reversible generators under a suitable information divergence such as an $f$-divergence. Different choices of $f$ allow us to recover almost all known reversiblizations while at the same time unravelling and generating new ones. Along the way, we obtain interesting information-geometric results, such as a bisection property, a Pythagorean identity, a parallelogram law and a Markov chain counterpart of the arithmetic-geometric-harmonic mean inequality governing these reversiblizations. This also motivates us to introduce the notion of information centroids of a sequence of Markov chains and to give conditions for their existence and uniqueness. Building upon the first approach, in the second approach we view reversiblizations as generalized means and construct new reversiblizations via different natural notions of generalized means, such as the Cauchy mean or the dual mean. In the third approach, we combine the recently introduced framework of locally-balanced Markov processes with the notion of the convex $*$-conjugate from the study of $f$-divergences, which offers a rich source of balancing functions, to generate new reversiblizations. This is based on joint work with Geoff Wolfer (RIKEN AIP).
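As a hedged numerical sketch (not the paper's actual constructions), the snippet below illustrates two classical reversiblizations that an information-projection viewpoint is known to recover: the additive reversiblization $(L + L^*)/2$ and the Metropolis-type reversiblization $\min(L, L^*)$, where $L^*$ is the $\pi$-adjoint of $L$. The generator, the target $\pi$, and all numbers are illustrative choices, not taken from the work itself.

```python
import numpy as np

pi = np.array([0.5, 0.3, 0.2])  # illustrative target distribution

# Build a pi-stationary but non-reversible generator L = R + Gamma:
# R is pi-reversible (built from a symmetric flow matrix S), and Gamma
# is a divergence-free cyclic perturbation that breaks reversibility
# while preserving stationarity of pi.
S = 0.1 * (np.ones((3, 3)) - np.eye(3))        # symmetric off-diagonal flows
R = S / pi[:, None]                            # R(x, y) = S(x, y) / pi(x)
np.fill_diagonal(R, -R.sum(axis=1))            # rows of a generator sum to 0

F = 0.05 * np.array([[0., 1., -1.],            # cyclic flow, zero row and
                     [-1., 0., 1.],            # column sums
                     [1., -1., 0.]])
L = R + F / pi[:, None]

assert np.allclose(L.sum(axis=1), 0.0)         # valid generator
assert np.allclose(pi @ L, 0.0)                # pi is stationary for L

def is_reversible(G):
    """Detailed balance w.r.t. pi: diag(pi) @ G must be symmetric."""
    DG = pi[:, None] * G
    return np.allclose(DG, DG.T)

assert not is_reversible(L)                    # L itself is non-reversible

# pi-adjoint (time reversal): L*(x, y) = pi(y) L(y, x) / pi(x)
L_star = (L.T * pi[None, :]) / pi[:, None]

# Additive reversiblization: arithmetic mean of L and its adjoint.
L_add = 0.5 * (L + L_star)
assert is_reversible(L_add)

# Metropolis-type reversiblization: entrywise minimum off the diagonal,
# with the diagonal reset so rows again sum to zero.
L_met = np.minimum(L, L_star)
np.fill_diagonal(L_met, 0.0)
np.fill_diagonal(L_met, -L_met.sum(axis=1))
assert is_reversible(L_met)

print("both reversiblizations satisfy detailed balance w.r.t. pi")
```

Both constructions are special cases of combining $L$ and $L^*$ through a mean, which is the pattern the second approach generalizes via Cauchy and dual means.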