Published online by Cambridge University Press: 19 August 2025
Let $\pi$ be a probability distribution on
$\mathbb{R}^d$ and $f$ a test function, and consider the problem of variance reduction in estimating
$\mathbb{E}_\pi(f)$. We first construct a sequence of estimators for
$\mathbb{E}_\pi (f)$, say
$({1}/{k})\sum_{i=0}^{k-1} g_n(X_i)$, where the
$X_i$ are samples from
$\pi$ generated by the Metropolized Hamiltonian Monte Carlo algorithm and
$g_n$ is the approximate solution of the Poisson equation obtained via the weak approximation scheme recently proposed by Mijatović and Vogrinc (2018). We then prove, under some regularity assumptions, that the estimation error variance
$\sigma_\pi^2(g_n)$ can be made arbitrarily small as the approximation order parameter
$n\rightarrow\infty$. To illustrate, we verify that the assumptions are satisfied by two concrete models: a Bayesian linear inverse problem and a two-component mixture of Gaussian distributions.
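The ergodic average above can be sketched as follows. This is a minimal illustration of the Metropolized Hamiltonian Monte Carlo estimator $({1}/{k})\sum_{i=0}^{k-1} g(X_i)$, assuming a standard Gaussian target for concreteness and using the test function $f$ itself in place of the variance-reduced $g_n$ (constructing $g_n$ requires the weak approximation scheme of the paper); all function names and parameter choices here are illustrative, not the authors' implementation:

```python
import numpy as np

def leapfrog(q, p, grad_U, eps, L):
    # Leapfrog integrator for Hamiltonian dynamics with potential U.
    p = p - 0.5 * eps * grad_U(q)
    for _ in range(L - 1):
        q = q + eps * p
        p = p - eps * grad_U(q)
    q = q + eps * p
    p = p - 0.5 * eps * grad_U(q)
    return q, p

def hmc_average(g, U, grad_U, q0, k=5000, eps=0.2, L=10, seed=0):
    # Metropolized HMC chain X_0, ..., X_{k-1} targeting pi ∝ exp(-U),
    # returning the ergodic average (1/k) * sum_i g(X_i).
    rng = np.random.default_rng(seed)
    q = np.asarray(q0, dtype=float)
    total = 0.0
    for _ in range(k):
        p = rng.standard_normal(q.shape)          # resample momentum
        q_new, p_new = leapfrog(q, p, grad_U, eps, L)
        # Metropolis accept/reject with total energy H(q, p) = U(q) + |p|^2/2.
        dH = (U(q_new) + 0.5 * p_new @ p_new) - (U(q) + 0.5 * p @ p)
        if np.log(rng.uniform()) < -dH:
            q = q_new
        total += g(q)
    return total / k

# Standard Gaussian target in R^2: U(q) = |q|^2 / 2, so E_pi[|q|^2] = 2.
U = lambda q: 0.5 * q @ q
grad_U = lambda q: q
est = hmc_average(lambda q: q @ q, U, grad_U, q0=np.zeros(2))
```

Replacing `lambda q: q @ q` by an approximate Poisson-equation solution $g_n$ with $\mathbb{E}_\pi(g_n) = \mathbb{E}_\pi(f)$ leaves the estimator unbiased in the limit while shrinking its asymptotic variance.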