Validating and Expanding Approximate Bayesian Computation Methods (17w5025)
Christian Robert (Université Paris Dauphine)
Luke Bornn (Simon Fraser University)
Gael Martin (Monash University)
Jukka Corander (University of Helsinki)
Dennis Prangle (University of Newcastle)
Richard Wilkinson (University of Sheffield)
While there have been workshops (as well as sessions at larger conferences such as MCMSki) on ABC methods, those meetings have concentrated on the growing range of applications of the methodology. We propose to gather experts on the theoretical properties of ABC methods, which are as yet little understood, and on the methodological roadblocks currently in the way of broad-scale adoption of ABC (e.g., high-dimensional inputs, big data, computational constraints). We feel that such a workshop will benefit both the fields where ABC potentially applies and research in computational statistical methods, since ABC is still perceived by many as a fringe topic, a surrogate approach to last only until a ``better'' solution is discovered. In addition, we think it will increase awareness of ABC as a mainstream statistical approach across the North American statistical community.
While the ABC method naturally belongs to the class of Monte Carlo methods, being based on simulations, it also relates to the broader field of approximate models, from which too little has so far been drawn. Speeds of convergence, choice of calibration factors, and assessments of reliability are all still under exploration, and we aim for the workshop to bring a more coherent, unified and appealing picture of the validity of these methods (which, we recall, have already seen significant use across a broad range of applications).
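To fix ideas, the basic rejection ABC scheme underlying these questions can be sketched as follows; the Gaussian toy model, the uniform prior, and the tolerance value `eps` are purely illustrative assumptions, not prescriptions:

```python
import random
from statistics import mean

random.seed(1)

# "Observed" data: 100 draws from N(true_mu, 1); true_mu is treated as unknown.
true_mu = 2.0
y_obs = [random.gauss(true_mu, 1.0) for _ in range(100)]
s_obs = mean(y_obs)  # summary statistic: the sample mean

def abc_rejection(s_obs, n_sims=20000, eps=0.05):
    """Rejection ABC for the mean of a Gaussian with known unit variance."""
    accepted = []
    for _ in range(n_sims):
        mu = random.uniform(-10, 10)                     # draw parameter from the prior
        z = [random.gauss(mu, 1.0) for _ in range(100)]  # simulate pseudo-data
        if abs(mean(z) - s_obs) < eps:                   # accept if summaries are close
            accepted.append(mu)
    return accepted

post = abc_rejection(s_obs)
print(len(post), mean(post))
```

The tolerance `eps` is exactly the kind of calibration factor mentioned above: smaller values bring the ABC output closer to the true posterior, at the cost of fewer accepted simulations, and the rate at which this trade-off resolves is one of the convergence questions the workshop targets.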
We also expect the group of experts gathered there to discuss the stumbling block of big data in ABC settings, since the ability to operate ABC degrades both with the size of the data (since they need to be simulated) and with the dimension of the parameter driving the statistical model (since the distance between estimators automatically increases). When the size of the data or the complexity of the statistical likelihood prohibits algorithms that operate on the whole model at once, even with ABC solutions, techniques are required that split and merge partial calculations, exploiting only parts of the models in the most efficient manner; instances proposed in the recent past include pre-fetching, collaborative and median mixing, and the bag of little bootstraps. Introducing a second level of approximation via the use of insufficient statistics and other information-deprived tools may be the pathway to handling big data and large dimensions; however, such solutions must come with theoretical guarantees and implementation guidelines.
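As a concrete illustration of this second level of approximation, the sketch below compresses each simulated dataset into a two-dimensional, generally insufficient summary before computing distances; the Gaussian toy model, the uniform priors, and the tolerance are again assumptions made purely for exposition:

```python
import random
from statistics import mean, stdev

random.seed(2)

def summaries(data):
    # Compress a dataset into a low-dimensional (and generally insufficient) summary.
    return (mean(data), stdev(data))

def distance(s1, s2):
    # Plain Euclidean distance between summary vectors.
    return sum((a - b) ** 2 for a, b in zip(s1, s2)) ** 0.5

# Observed data: 100 draws from N(1.5, 2^2); both parameters treated as unknown.
y_obs = [random.gauss(1.5, 2.0) for _ in range(100)]
s_obs = summaries(y_obs)

accepted = []
for _ in range(20000):
    mu = random.uniform(-5, 5)        # prior on the location
    sigma = random.uniform(0.1, 5.0)  # prior on the scale
    z = [random.gauss(mu, sigma) for _ in range(100)]
    if distance(summaries(z), s_obs) < 0.5:  # compare summaries, not raw datasets
        accepted.append((mu, sigma))

mu_hat = mean(m for m, _ in accepted)
sigma_hat = mean(s for _, s in accepted)
print(len(accepted), mu_hat, sigma_hat)
```

Because the accept/reject comparison involves only two numbers per simulated dataset rather than the raw data, its cost no longer grows with the sample size; the price is the information discarded by the insufficient summary, which is precisely the trade-off for which theoretical guarantees are needed.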
Statement of objectives
The aim of the workshop is to gather an audience of statisticians and machine-learning researchers who are using and developing computational methods, together with applied mathematicians and computer scientists studying approximation techniques, in order to form a clearer picture of the challenges and research directions for the convergence of ABC methods.
The themes that will be central to the workshop are:
• extended theoretical analysis of ABC algorithms (e.g., big data asymptotics, optimal convergence speeds, counter examples and benchmarks);
• novel algorithms to reduce the number of simulations required by ABC (Rao-Blackwellisation, surrogate models, semi-parametric representations, summary simulations);
• novel approaches to calculating distances in ABC (non-Euclidean distances, coding distance, distance models, estimation of the ABC error);
• specifics of ABC model choice, from consistency to novel inference methods;
• methods to tackle the ABC curse of dimensionality (partial simulations, graph decompositions, marginalisation, variational approximations);
• parallel ABC methods (stopping rules, convergence assessment, data subsampling);
• subsampling ABC representations (merging partial explorations, assessing the information loss);
• the special case of intractable constants in both statistics and physics, where parts of the likelihood are the results of intractable integrals, with an ever-increasing array of solutions tackling special cases that could potentially be encompassed within a more global perspective.
A proactive effort will be made to achieve a significant presence of younger and under-represented researchers at this workshop. For one thing, it is paramount and enriching that young researchers in our areas are exposed from the start to emerging domains of interest, to intimate research workshops, and to debates about state-of-the-art research. We believe that such low-stress interactions between generations of researchers are among the most efficient ways to foster the future research of all participants. Furthermore, the recent developments in ABC theory and methodology have mostly been driven by young researchers, most of whom are included in the list below. Given that the youngest researchers of 2017 are not all yet in a visible position, we believe that the institutions of the senior researchers involved in this workshop will be very open to our request to involve some of their students and postdocs at the time.