Variational Bayesian methods nevertheless suffer from slow convergence
when the variables
in the factored approximation
are actually strongly coupled in the original
model. The same problem arises in the popular Gibbs
sampling algorithm: the sampling process converges
slowly when the variables are strongly correlated.
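As a standard illustration of this effect (a textbook example, not specific to our setting, with unit variances assumed for simplicity), consider a fully factorized Gaussian approximation to a bivariate Gaussian posterior with correlation $\rho$; the coordinate-ascent updates of the factor means are
\[
m_1 \leftarrow \mu_1 + \rho\,(m_2 - \mu_2), \qquad
m_2 \leftarrow \mu_2 + \rho\,(m_1 - \mu_1),
\]
so each full sweep contracts the error in the means only by a factor of $\rho^2$, and convergence stalls as $|\rho| \to 1$.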
The slow convergence can be alleviated by
data augmentation (van Dyk and Meng, 2001), where
the idea is to identify an optimal reparameterization
(within a family of possible reparameterizations) so as
to remove the coupling. Similarly, in a deterministic context,
Liu et al. (1998) proposed over-parameterizing
the model to speed up EM convergence; our work
here is inspired by their approach.
We propose the Parameter-eXpanded Variational
Bayesian (PX-VB) method. The original model is
modified by auxiliary parameters that are optimized
in conjunction with the variational approximation.
The optimization of the auxiliary parameters corresponds
to a parameterized joint optimization of the
variational components; the role of the new updates
is precisely to remove the otherwise strong functional
couplings between the components, thereby facilitating
fast convergence.
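As an illustrative sketch of this idea (a toy bivariate-Gaussian instance, not the general algorithm developed in this paper), the snippet below augments the mean-field updates with a single auxiliary translation parameter c that shifts both factor means jointly; optimizing c each sweep is one concrete example of a parameterized joint optimization of the variational components. The names vb_sweep, px_step, rho, and mu are defined only for this toy example.

```python
import numpy as np

rho, mu = 0.95, np.array([1.0, -2.0])
Lam = np.linalg.inv(np.array([[1.0, rho], [rho, 1.0]]))  # precision of the target

def vb_sweep(m):
    # One coordinate-ascent sweep of plain mean-field VB on the factor means.
    m = m.copy()
    m[0] = mu[0] + rho * (m[1] - mu[1])
    m[1] = mu[1] + rho * (m[0] - mu[0])
    return m

def px_step(m):
    # Auxiliary update: shift both means jointly by the scalar c that maximizes
    # the bound; for a Gaussian target this minimizes the quadratic form
    # (m + c*d - mu)^T Lam (m + c*d - mu) over c, with joint direction d = (1, 1).
    d = np.ones(2)
    c = -(d @ Lam @ (m - mu)) / (d @ Lam @ d)
    return m + c * d

m_vb = np.zeros(2)
m_px = np.zeros(2)
for _ in range(20):
    m_vb = vb_sweep(m_vb)            # plain VB
    m_px = px_step(vb_sweep(m_px))   # PX-VB-style: VB sweep, then auxiliary update
print("VB error after 20 sweeps   :", np.linalg.norm(m_vb - mu))
print("PX-VB error after 20 sweeps:", np.linalg.norm(m_px - mu))
```

With the strong coupling (rho = 0.95), the plain coordinate updates still carry a visible error after 20 sweeps, while the augmented updates have essentially converged.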
Define a mapping M between the old and new model parameters
induced by the auxiliary parameters. If the largest eigenvalue of
this mapping is smaller than 1, PX-VB converges faster than VB;
the smaller the largest eigenvalue, the faster PX-VB converges.
This property can be verified separately in each case.
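Continuing the toy Gaussian illustration above (an assumption-laden sketch, not this paper's derivation), the per-sweep updates are affine in the factor means, so the convergence rate of each scheme can be read off directly from the largest eigenvalue of the corresponding linear error mapping.

```python
import numpy as np

rho = 0.95
# Error mapping of one plain VB sweep on the factor means (toy example above):
# e1 <- rho * e2, then e2 <- rho * e1, i.e. a linear map with matrix J_vb.
J_vb = np.array([[0.0, rho], [0.0, rho**2]])

# Effect of the auxiliary joint shift along d = (1, 1): remove that direction
# in the metric of the target precision Lam.
d = np.ones(2)
Lam = np.linalg.inv(np.array([[1.0, rho], [rho, 1.0]]))
P = np.eye(2) - np.outer(d, d @ Lam) / (d @ Lam @ d)
J_px = P @ J_vb   # mapping of one augmented (PX-style) sweep

print("largest |eigenvalue|, VB   :", max(abs(np.linalg.eigvals(J_vb))))  # rho^2       ~ 0.90
print("largest |eigenvalue|, PX-VB:", max(abs(np.linalg.eigvals(J_px))))  # rho(1-rho)/2 ~ 0.02
```

The augmented sweep has a much smaller largest eigenvalue than the plain VB sweep, matching the much faster convergence observed in the earlier snippet.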