
Research
Abstracts 2007
Parameter Expanded Variational Bayesian Methods
Yuan (Alan) Qi & Tommi S. Jaakkola

Summary: Bayesian inference has become increasingly important in statistical machine learning, but exact Bayesian calculations are often infeasible in practice. A number of approximate Bayesian methods have been proposed to make such calculations practical, among them the variational Bayesian (VB) approach. The VB approach, while useful, can suffer from slow convergence to the approximate solution. To address this problem, we propose Parameter-eXpanded Variational Bayesian (PX-VB) methods to speed up VB. The new algorithm is inspired by parameter-expanded expectation maximization (PX-EM) and parameter-expanded data augmentation (PX-DA). Like PX-EM and PX-DA, PX-VB expands the model with auxiliary variables to reduce the coupling between variables in the original model. We analyze the convergence rates of VB and PX-VB and demonstrate the superior convergence rate of PX-VB in variational probit regression and automatic relevance determination.

Results: [Figure] Comparison between VB and PX-VB for probit regression on synthetic (a) and kidney-biopsy (b) data sets. PX-VB converges significantly faster than VB. The Y axis shows the difference between two consecutive estimates of the posterior mean of the parameter w.

Reference: Parameter Expanded Variational Bayesian Methods, Y. Qi and T. S. Jaakkola, in Advances in Neural Information Processing Systems 19, MIT Press, Cambridge, MA, 2007.
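To make the role of the auxiliary variables concrete, below is a minimal mean-field VB sketch for Bayesian probit regression on toy synthetic data (the data, prior variance v, and all variable names are illustrative assumptions, not the paper's setup or datasets). The alternation between the Gaussian factor q(w) and the truncated-normal factors q(z_i) is exactly the coupling that can make plain VB converge slowly; PX-VB introduces an auxiliary (e.g. rescaling) parameter to weaken that coupling, a step not sketched here.

```python
import math
import numpy as np

rng = np.random.default_rng(0)

# Toy synthetic probit data (illustrative assumption, not the paper's datasets).
n, d = 200, 3
X = rng.normal(size=(n, d))
w_true = np.array([0.5, -0.5, 0.3])
y = np.where(X @ w_true + rng.normal(size=n) > 0, 1.0, -1.0)

# Standard normal pdf and cdf.
phi = lambda t: np.exp(-0.5 * t ** 2) / math.sqrt(2.0 * math.pi)
Phi = np.vectorize(lambda t: 0.5 * (1.0 + math.erf(t / math.sqrt(2.0))))

# Expanded model: auxiliary z_i ~ N(x_i' w, 1) with sign(z_i) = y_i,
# and prior w ~ N(0, v I).  Mean-field VB: q(w) Gaussian, each q(z_i) a
# truncated normal; iterate their coupled updates until they stop moving.
v = 10.0
S = np.linalg.inv(X.T @ X + np.eye(d) / v)  # covariance of q(w) (fixed)
m = np.zeros(d)                             # mean of q(w)
for it in range(500):
    a = X @ m
    # Mean of each truncated normal q(z_i) given the current q(w).
    Ez = a + y * phi(a) / np.clip(Phi(y * a), 1e-12, None)
    m_new = S @ (X.T @ Ez)                  # mean-field update for q(w)
    diff = np.max(np.abs(m_new - m))        # the quantity plotted in the figure
    m = m_new
    if diff < 1e-8:
        break
```

The per-iteration change `diff` is the same diagnostic plotted on the Y axis of the figure above; under PX-VB it shrinks toward zero markedly faster than under these plain updates.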

