The Bayesian central limit theorem (BCLT) for finite-dimensional models, also known as the Bernstein–von Mises theorem, is the primary motivation for the widely used Laplace approximation. However, the BCLT is currently expressed only in terms of total variation (TV) distance and lacks non-asymptotic bounds on the rate of convergence that are readily computable in applications. Likewise, the Laplace approximation is not equipped with non-asymptotic quality guarantees for the large classes of posteriors for which it is asymptotically valid. To understand its quality and applicability to real problems, we need finite-sample bounds that can be computed for a given model and dataset. And to understand the quality of posterior mean and variance estimates, we need bounds on divergences beyond the TV distance. Our work provides the first closed-form, finite-sample bounds on the quality of the Laplace approximation that require neither log-concavity of the posterior nor an exponential family likelihood. We bound not only the TV distance but also (A) the Wasserstein-1 distance, which controls the error of a posterior mean estimate, and (B) an integral probability metric, which controls the error of a posterior variance estimate. We compute exact constants in our bounds for a variety of standard models, including logistic regression, and numerically demonstrate the utility of the bounds. We also provide a framework for the analysis of more complex models.
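For context, the Laplace approximation discussed above replaces the posterior with a Gaussian centered at the posterior mode, with covariance given by the inverse Hessian of the negative log posterior at that mode. A minimal sketch for the logistic regression case mentioned in the abstract, using synthetic data and a hypothetical N(0, 10I) prior (both chosen here for illustration, not taken from the paper):

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic logistic-regression data (illustrative only).
rng = np.random.default_rng(0)
n, d = 200, 2
X = rng.normal(size=(n, d))
theta_true = np.array([1.0, -0.5])
y = rng.binomial(1, 1 / (1 + np.exp(-X @ theta_true)))

def neg_log_post(theta):
    # Negative log posterior up to a constant: logistic negative
    # log-likelihood plus a Gaussian N(0, 10 I) prior term.
    z = X @ theta
    return np.sum(np.logaddexp(0, z) - y * z) + theta @ theta / 20

# Step 1: find the posterior mode (MAP estimate).
res = minimize(neg_log_post, np.zeros(d), method="BFGS")
theta_map = res.x

# Step 2: Hessian of the negative log posterior at the mode
# (available in closed form for logistic regression).
p = 1 / (1 + np.exp(-X @ theta_map))
H = (X * (p * (1 - p))[:, None]).T @ X + np.eye(d) / 10

# Step 3: Laplace approximation: posterior ~ N(theta_map, H^{-1}).
cov = np.linalg.inv(H)
print("mode:", theta_map)
print("covariance:", cov)
```

The paper's bounds then quantify, for a given model and dataset, how far this Gaussian is from the true posterior in TV, Wasserstein-1, and an integral probability metric.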
