Speech Recognition using Biologically-Inspired Neural Networks
Thomas Bohnstingl, Ayush Garg, et al.
ICASSP 2022
Existing rigorous convergence guarantees for the Hamiltonian Monte Carlo (HMC) algorithm use Gaussian auxiliary momentum variables, which are, crucially, symmetrically distributed. We present a novel convergence analysis for HMC that utilizes new dynamical and probabilistic arguments. Convergence is rigorously established under significantly weaker conditions, which, among other relaxations, allow for general auxiliary distributions. In our framework, we show that plain HMC with asymmetric momentum distributions breaks a key self-adjointness requirement. We propose a modified version of HMC, which we call Alternating Direction HMC (AD-HMC), that overcomes this difficulty. Sufficient conditions are established under which AD-HMC exhibits geometric convergence in Wasserstein distance. The geometric convergence analysis is extended to the case where the Hamiltonian motion is approximated by the leapfrog symplectic integrator, which requires an additional Metropolis–Hastings rejection step. Numerical experiments suggest that AD-HMC, which generalizes a popular dynamic auxiliary scheme, can outperform HMC with Gaussian auxiliaries.
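For orientation, the baseline the abstract refers to — standard HMC with Gaussian auxiliary momentum, a leapfrog integrator, and a Metropolis–Hastings correction for the discretization error — can be sketched as follows. This is a minimal illustrative sketch on a toy Gaussian target, not the paper's AD-HMC method; the alternating-direction modification and the general (possibly asymmetric) auxiliary distributions analyzed in the paper are not shown, and all names (`leapfrog`, `hmc_step`, `U`, `grad_U`) are choices of this sketch.

```python
import numpy as np

def leapfrog(q, p, grad_U, eps, L):
    """Leapfrog symplectic integrator for Hamiltonian dynamics."""
    q, p = q.copy(), p.copy()
    p -= 0.5 * eps * grad_U(q)          # half step in momentum
    for _ in range(L - 1):
        q += eps * p                    # full step in position
        p -= eps * grad_U(q)            # full step in momentum
    q += eps * p
    p -= 0.5 * eps * grad_U(q)          # final half step in momentum
    return q, p

def hmc_step(q, U, grad_U, rng, eps=0.1, L=20):
    # Gaussian auxiliary momentum: the symmetric case covered by
    # the classical convergence guarantees the abstract mentions.
    p = rng.standard_normal(q.shape)
    q_new, p_new = leapfrog(q, p, grad_U, eps, L)
    # Metropolis–Hastings rejection step correcting the leapfrog
    # discretization error (compare total Hamiltonians).
    H_old = U(q) + 0.5 * p @ p
    H_new = U(q_new) + 0.5 * p_new @ p_new
    if np.log(rng.uniform()) < H_old - H_new:
        return q_new                    # accept proposal
    return q                            # reject, keep current state

# Toy example: sample a standard 2-D Gaussian, U(q) = ||q||^2 / 2.
U = lambda q: 0.5 * (q @ q)
grad_U = lambda q: q
rng = np.random.default_rng(0)
q = np.zeros(2)
chain = []
for _ in range(2000):
    q = hmc_step(q, U, grad_U, rng)
    chain.append(q)
samples = np.array(chain)
```

Because the Gaussian momentum density is symmetric, the leapfrog proposal is reversible and the chain is self-adjoint in the usual sense; replacing `rng.standard_normal` with an asymmetric draw is exactly the situation where, per the abstract, plain HMC breaks down and the alternating-direction construction is needed.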