We consider the approximation of unknown or intractable integrals via quadrature when evaluating the integrand is costly. This is a central problem in machine learning, arising in model averaging, (hyper-)parameter marginalization, and the computation of posterior predictive distributions.
Recently, Batch Bayesian Quadrature (BBQ) has combined the probabilistic integration techniques of Bayesian Quadrature with the parallelization techniques of Batch Bayesian Optimization, yielding improved performance over Monte Carlo techniques, especially as parallelization increases. While batch selection in BBQ mitigates the cost of individual point selection, every point within every batch is nevertheless chosen serially, impeding the full potential of batch selection. We resolve this shortcoming.
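To make the setting concrete, the sketch below implements vanilla (non-batch) Bayesian Quadrature in one dimension: a Gaussian-process surrogate with an RBF kernel is conditioned on a few integrand evaluations, and the posterior mean of the integral follows in closed form. The kernel choice, the lengthscale `ell`, and the `bq_estimate` helper are illustrative assumptions, not part of the method introduced in this work.

```python
import numpy as np
from scipy.special import erf

def bq_estimate(xs, ys, a, b, ell=0.5, jitter=1e-8):
    """Vanilla 1-D Bayesian Quadrature with an RBF kernel.

    Returns the GP posterior mean of integral_a^b f(x) dx given
    noiseless observations ys = f(xs)."""
    xs = np.asarray(xs, dtype=float)
    # Gram matrix of the squared-exponential kernel, stabilized with jitter.
    K = np.exp(-0.5 * ((xs[:, None] - xs[None, :]) / ell) ** 2)
    K += jitter * np.eye(len(xs))
    # Kernel mean embedding z_i = int_a^b k(x, x_i) dx (closed form for RBF).
    z = ell * np.sqrt(np.pi / 2) * (erf((b - xs) / (np.sqrt(2) * ell))
                                    - erf((a - xs) / (np.sqrt(2) * ell)))
    # BQ posterior mean of the integral: z^T K^{-1} y.
    return z @ np.linalg.solve(K, np.asarray(ys, dtype=float))

# Example: integrate sin on [0, pi]; the exact value is 2.
xs = np.linspace(0.0, np.pi, 8)
print(bq_estimate(xs, np.sin(xs), 0.0, np.pi))
```

Note that standard BBQ would extend this by choosing batches of new evaluation points, which is where the serial-selection bottleneck discussed above arises.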
We develop a novel BBQ method that updates points within a batch without the costs typically incurred by non-serial point selection. To implement this, we devise a dynamic domain decomposition. Together, these efficiently reduce uncertainty, lower error estimates of the integrand, and yield more numerically robust integral estimates. Furthermore, we settle an open question concerning stopping criteria, which we establish and support with numerical experiments.
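Since the decomposition itself is not detailed here, the following is only a generic, hypothetical illustration of how a domain decomposition can decouple within-batch selection: the domain is partitioned into subregions, and each subregion independently contributes its point of maximal GP posterior variance, so the batch is formed in parallel rather than serially. The `rbf`, `posterior_variance`, and `select_batch` names and all parameters are assumptions for illustration, not the algorithm proposed in this work.

```python
import numpy as np

def rbf(x1, x2, ell):
    # Squared-exponential kernel on 1-D inputs.
    return np.exp(-0.5 * ((x1[:, None] - x2[None, :]) / ell) ** 2)

def posterior_variance(cand, xs_obs, ell, jitter=1e-8):
    # GP posterior variance at candidate points, given noiseless observations.
    if len(xs_obs) == 0:
        return np.ones_like(cand)  # prior variance everywhere
    K = rbf(xs_obs, xs_obs, ell) + jitter * np.eye(len(xs_obs))
    k_star = rbf(cand, xs_obs, ell)
    v = np.linalg.solve(K, k_star.T)            # K^{-1} k(X, x*)
    return 1.0 - np.sum(k_star * v.T, axis=1)   # k(x*,x*) - k* K^{-1} k*^T

def select_batch(xs_obs, a, b, n_sub, ell=0.5, n_cand=200):
    # One point per subregion, each chosen independently of the others,
    # so the whole batch can be selected in parallel.
    edges = np.linspace(a, b, n_sub + 1)
    picks = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        cand = np.linspace(lo, hi, n_cand)
        picks.append(cand[np.argmax(posterior_variance(cand, xs_obs, ell))])
    return np.array(picks)

# Example: select a batch of 4 points on [0, pi] given two prior observations.
print(select_batch(np.array([0.5, 2.0]), 0.0, np.pi, n_sub=4))
```

Because each subregion's selection depends only on previously evaluated points, no point in the batch waits on another, in contrast to the serial within-batch selection described above.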
We present our findings within the context of the history of quadrature, show how our methods significantly improve upon the existing literature, and outline possibilities for future research.