Schedule for: 18w5023 - Computational Statistics and Molecular Simulation: A Practical Cross-Fertilization

Beginning on Sunday, November 11 and ending on Friday, November 16, 2018

All times in Oaxaca, Mexico time, CST (UTC-6).

Sunday, November 11
14:00 - 23:59 Check-in begins (Front desk at your assigned hotel)
19:30 - 22:00 Dinner (Restaurant Hotel Hacienda Los Laureles)
20:30 - 21:30 Informal gathering (Hotel Hacienda Los Laureles)
Monday, November 12
07:30 - 08:45 Breakfast (Restaurant at your assigned hotel)
08:45 - 09:00 Introduction and Welcome (Conference Room San Felipe)
09:00 - 09:30 Jesús María Sanz-Serna: Numerical integration within the Hamiltonian (Hybrid) Monte Carlo method
The Hamiltonian or Hybrid Monte Carlo (HMC) method is a valuable sampling algorithm used in both molecular dynamics and statistics. Its efficiency very much depends on the numerical integration of the dynamics employed to define the proposal, with Verlet/leapfrog being the algorithm of choice. In the talk I will discuss how different properties of the integrator impact the efficiency of HMC and how to construct novel integrators that significantly improve on Verlet. I will also comment on adaptive integration strategies recently incorporated into some well-known molecular dynamics packages.
(Conference Room San Felipe)
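For readers unfamiliar with the method, a minimal HMC sketch (not taken from the talk) shows where the integrator enters: the leapfrog/Verlet scheme approximates the Hamiltonian flow, and a Metropolis test on the discretization's energy error keeps the target exactly invariant. The Gaussian toy target, step size, and all names below are illustrative choices.

```python
import numpy as np

def leapfrog(q, p, grad_U, eps, n_steps):
    """Velocity Verlet / leapfrog integration of Hamiltonian dynamics (unit mass)."""
    q, p = q.copy(), p.copy()
    p -= 0.5 * eps * grad_U(q)            # initial half step in momentum
    for _ in range(n_steps - 1):
        q += eps * p                      # full step in position
        p -= eps * grad_U(q)              # full step in momentum
    q += eps * p
    p -= 0.5 * eps * grad_U(q)            # final half step in momentum
    return q, p

def hmc_step(q, U, grad_U, eps, n_steps, rng):
    """One HMC transition: draw a momentum, integrate, accept/reject on the energy error."""
    p = rng.standard_normal(q.shape)
    q_new, p_new = leapfrog(q, p, grad_U, eps, n_steps)
    log_accept = -((U(q_new) + 0.5 * p_new @ p_new) - (U(q) + 0.5 * p @ p))
    return q_new if np.log(rng.random()) < log_accept else q

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    U = lambda q: 0.5 * q @ q             # standard Gaussian target
    grad_U = lambda q: q
    q, samples = np.zeros(5), []
    for _ in range(5000):
        q = hmc_step(q, U, grad_U, eps=0.2, n_steps=10, rng=rng)
        samples.append(q.copy())
    print("sample variances (should be near 1):", np.var(samples, axis=0))
```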
09:30 - 10:00 Gael Martin: Loss-based Bayesian Prediction
Bayesian predictive distributions quantify uncertainty about unknown (or out-of-sample) values of a random process conditional only on observed data; uncertainty regarding model-specific parameters being integrated out via the usual probability calculus. Uncertainty about the assumed model itself can, in turn, be accommodated via model-averaging, with the implicit assumption being that the true data generating process (DGP) is contained in the set over which the averaging occurs. We move away from this so-called $\mathcal{M}$-closed world, in which the true DGP is assumed to be either known with certainty, or known to lie in a finite set of models. Operating with an $\mathcal{M}$-open view of the world, we instead construct Bayesian predictives, not based upon a given model, or set of models, but by adopting a user-supplied concept of predictive performance loss. To develop such machinery in the Bayesian paradigm, we rely on the principles underlying approximate Bayesian computation. Specifically, construction of predictive distributions is carried out using simulation, summary statistics that minimize predictive loss over a pre-specified training period, and a tolerance level that captures our risk aversion to predictive loss. Different methods of defining predictive loss are explored, including one in which performance is benchmarked against a parsimonious 'auxiliary predictive'. Comparison with the 'exact' Bayesian predictive based on a mis-specified DGP is undertaken in simulation scenarios. Using this new approach, we also propose a robust diagnostic procedure for detecting model mis-specification. Empirical illustrations using both time series and cross-sectional data are provided.
(Conference Room San Felipe)
10:00 - 10:30 Florian Maire: Metastability in molecular dynamics and Bayesian inference methods
Metastability is a phenomenon that concerns both molecular dynamics (and in particular protein dynamics) and Bayesian inference, especially when the posterior distribution of interest is multimodal. Restricting our focus to Markov processes and Markov chains, we explore this duality and study how Bayesian inference methods can benefit from practices developed in protein dynamics analysis, and vice versa. Questions of interest include the characterization and consequences of metastability in both perspectives. Particular attention will be given to distinguishing between reversible and non-reversible processes/chains.
(Conference Room San Felipe)
10:30 - 11:00 Coffee Break (Conference Room San Felipe)
11:00 - 13:00 Arthur Voter: (Hands-on + discussion) Molecular dynamics in materials science
I will present an introduction to the molecular dynamics method as it is used in materials science simulations. Through a few simple examples, I will give a sense of how one sets up and carries out such simulations, and some of the issues that arise. One focus will be on the time-scale problem; accurate integration of the classical equations of motion limits simulations to times that are often much shorter than the time scales of interest in the physical problem. In this regard, metastability both contributes to this problem and offers opportunities for solving it.
(Conference Room San Felipe)
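As a companion to the hands-on session, here is a minimal and entirely illustrative molecular dynamics loop (not material from the talk): velocity Verlet for a single harmonic degree of freedom, with an energy-drift check that makes the time-step restriction mentioned in the abstract visible. A materials simulation would of course involve many particles and a realistic interatomic potential; all parameters below are made up.

```python
import numpy as np

def velocity_verlet(x, v, force, dt, n_steps, mass=1.0):
    """Integrate Newton's equations of motion with the velocity Verlet scheme."""
    traj, f = [], force(x)
    for _ in range(n_steps):
        x = x + dt * v + 0.5 * dt**2 * f / mass
        f_new = force(x)
        v = v + 0.5 * dt * (f + f_new) / mass
        f = f_new
        traj.append((x, v))
    return np.array(traj)

if __name__ == "__main__":
    # Toy 1D harmonic "bond": U(x) = 0.5 k x^2, exact period 2*pi*sqrt(m/k).
    k = 1.0
    force = lambda x: -k * x
    traj = velocity_verlet(x=1.0, v=0.0, force=force, dt=0.05, n_steps=2000)
    x, v = traj[:, 0], traj[:, 1]
    energy = 0.5 * v**2 + 0.5 * k * x**2
    # The energy error stays small only if dt resolves the fastest oscillation,
    # which is the time-scale bottleneck discussed in the abstract.
    print("max relative energy error:", np.max(np.abs(energy - energy[0])) / energy[0])
```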
13:20 - 13:30 Group Photo (Hotel Hacienda Los Laureles)
13:30 - 15:00 Lunch (Restaurant Hotel Hacienda Los Laureles)
16:00 - 16:30 Coffee Break (Conference Room San Felipe)
16:30 - 17:00 Angela Bitto-Nemling: Time Varying Parameter Mixture Model
(Joint work with Sylvia Fruehwirth-Schnatter) We introduce the TVP (Time Varying Parameter) Mixture Model. Based on previous work (Bitto and Fruehwirth-Schnatter, 2017), the focus of this paper is the estimation of a time-varying parameter model with shrinkage priors. The key idea is the use of spike-and-slab priors for the process variances. We assume that both spike and slab have a hierarchical representation as a normal-gamma prior (Griffin and Brown, 2010). In this way we extend previous work based on spike-and-slab priors (Fruehwirth-Schnatter and Wagner, 2010) and Bayesian Lasso type priors (Belmonte et al. 2014). We present necessary modifications of our efficient MCMC estimation scheme, exploiting ideas such as ancillarity-sufficiency interweaving (Yu and Meng, 2011). We illustrate our approach with a simulation study.
(Conference Room San Felipe)
17:00 - 17:30 Yang Chen: Determine the Number of States in Hidden Markov Models via Marginal Likelihood
Hidden Markov models (HMM) have been widely adopted by scientists from various fields to model stochastic systems: the underlying process is a discrete Markov chain and the observations are noisy realizations of the underlying process. Determining the number of hidden states for an HMM is a model selection problem, which has yet to be satisfactorily solved, especially for the popular Gaussian HMM with heterogeneous covariance. In this paper, we propose a consistent method for determining the number of hidden states of an HMM based on the marginal likelihood, which is obtained by integrating out both the parameters and hidden states. Moreover, we show that the model selection problem of HMM includes the order selection problem of finite mixture models as a special case. We give a rigorous proof of the consistency of the proposed marginal likelihood method and provide simulation studies to compare the proposed method with the currently most widely adopted method, the Bayesian information criterion (BIC), demonstrating the effectiveness of the proposed marginal likelihood method.
(Conference Room San Felipe)
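As background (not part of the talk's method), the hidden states of an HMM are integrated out by the forward algorithm; the marginal likelihood discussed in the abstract additionally integrates out the parameters. A small numpy sketch of the scaled forward recursion for a Gaussian HMM, with illustrative parameter values:

```python
import numpy as np
from scipy.stats import norm

def hmm_loglik(y, pi0, A, means, sds):
    """Log-likelihood p(y | theta) of a Gaussian HMM via the scaled forward algorithm.
    pi0: initial distribution, A: transition matrix, means/sds: emission parameters."""
    loglik = 0.0
    alpha = pi0 * norm.pdf(y[0], means, sds)
    c = alpha.sum(); alpha /= c; loglik += np.log(c)
    for t in range(1, len(y)):
        alpha = (alpha @ A) * norm.pdf(y[t], means, sds)
        c = alpha.sum(); alpha /= c; loglik += np.log(c)
    return loglik

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Simulate a 2-state chain; candidate numbers of states could then be compared
    # by penalised log-likelihoods (e.g. BIC) or by the marginal likelihood of the talk.
    A = np.array([[0.95, 0.05], [0.10, 0.90]])
    means, sds = np.array([-1.0, 2.0]), np.array([0.5, 0.8])
    z, y = 0, []
    for _ in range(500):
        y.append(rng.normal(means[z], sds[z]))
        z = rng.choice(2, p=A[z])
    print("log p(y | true theta) =", hmm_loglik(np.array(y), np.array([0.5, 0.5]), A, means, sds))
```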
17:30 - 18:00 Tony Lelievre: Hybrid Monte Carlo methods for sampling probability measures on submanifolds
Probability measures supported on submanifolds can be sampled by adding an extra momentum variable to the state of the system, and discretizing the associated Hamiltonian dynamics with some stochastic perturbation in the extra variable. In order to avoid biases in the invariant probability measures sampled by discretizations of these stochastically perturbed Hamiltonian dynamics, a Metropolis rejection procedure can be considered. The so-obtained scheme belongs to the class of generalized Hybrid Monte Carlo (GHMC) algorithms. We show how to generalize to GHMC a procedure suggested by Goodman, Holmes-Cerfon and Zappa for Metropolis random walks on submanifolds, where a reverse projection check is performed to enforce the reversibility of the algorithm for large timesteps and hence avoid biases in the invariant measure. We also provide a full mathematical analysis of such procedures, as well as numerical experiments demonstrating the importance of the reverse projection check on simple toy examples. This is joint work with M. Rousset and G. Stoltz.
(Conference Room San Felipe)
18:00 - 18:30 Josh Fass: Towards Bayesian parameterization of implicit solvent models
Implicit solvent models can be much faster to simulate than explicit solvent, but they are less accurate. They also have more free parameters than explicit water models (such as TIP4P). Implicit solvent models typically represent solvent-solute interactions using continuous regions of high and low dielectric constant, separated by a parameterized surface. This surface is parameterized by terms such as the effective Born radii of the atoms in the solute. These effective radii are in turn assigned by matching simple patterns to the solute's chemical graph (e.g. assigning different effective radii to carbon atoms depending on whether the carbon atom is bound to oxygen vs. hydrogen). We would like to infer these effective radii automatically, and propagate any parameter uncertainty into subsequent predictions. To this end, we parameterize a Generalized Born implicit solvent model using FreeSolv, a database of experimentally measured hydration free energies. We use Reversible Jump Markov Chain Monte Carlo to sample jointly over the continuous parameters (such as effective radii), as well as the discrete "atom-type" definitions used by a GB model, with the aim of automatically selecting an appropriate level of model complexity.
(Conference Room San Felipe)
19:00 - 21:00 Dinner (Restaurant Hotel Hacienda Los Laureles)
Tuesday, November 13
07:30 - 09:00 Breakfast (Restaurant at your assigned hotel)
09:00 - 09:30 Carsten Hartmann: Duality of estimation and control and its application to rare event simulation
Many complex systems studied by scientists and engineers are characterised by processes that take place on vastly different time scales. Often the interesting system behaviour, such as phase transitions or regime changes, happens on the longest time scales, and the precise statistical estimation of these slow processes or associated rare events is among the most challenging computational problems in science and engineering. The talk will be devoted to the question of computing the optimal change of measure for certain classes of rare event simulation problems that appear in statistical mechanics, e.g. in molecular dynamics. The method is based on a representation of the rare event sampling problem as an equivalent (or: dual) stochastic optimal control problem, whose value function characterizes the optimal (i.e. minimum variance) change of measure. The specific duality behind the problem is then used to devise numerical algorithms for computing the optimal change of measure. I will describe in some detail two approaches built on a semi-parametric representation of the value function: a cross-entropy based stochastic approximation algorithm for the optimal change of measure, and a novel approach based on the formulation of the optimality conditions in the form of a forward-backward stochastic differential equation. I will discuss the general approach, with a particular focus on the choice of the ansatz functions and the solution of high-dimensional problems, and illustrate the numerical method with simple toy examples.
(Conference Room San Felipe)
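The cross-entropy idea mentioned in the abstract can be illustrated on a textbook toy problem (not taken from the talk): estimating a Gaussian tail probability by iteratively tilting the sampling distribution within a parametric family. The family, elite fraction, and all constants below are illustrative choices.

```python
import numpy as np
from scipy.stats import norm

def cross_entropy_rare_event(a=4.0, n=10_000, rho=0.1, iters=15, seed=0):
    """Estimate p = P(X > a) for X ~ N(0,1) by tilting the sampler to N(theta, 1)
    and updating theta with the cross-entropy method."""
    rng = np.random.default_rng(seed)
    theta = 0.0
    for _ in range(iters):
        x = rng.normal(theta, 1.0, n)
        gamma = min(a, np.quantile(x, 1 - rho))          # elite-level threshold
        w = norm.pdf(x) / norm.pdf(x, theta, 1.0)        # likelihood ratio back to N(0,1)
        elite = x > gamma
        theta = np.sum(w[elite] * x[elite]) / np.sum(w[elite])  # tilted-mean update
    # Final importance-sampling estimate under the learned change of measure
    x = rng.normal(theta, 1.0, n)
    w = norm.pdf(x) / norm.pdf(x, theta, 1.0)
    return np.mean(w * (x > a)), theta

if __name__ == "__main__":
    p_hat, theta = cross_entropy_rare_event()
    print("CE estimate:", p_hat, " exact:", norm.sf(4.0), " learned tilt:", theta)
```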
09:30 - 10:00 Murray Pollock: Confusion: Developing an information-theoretic secure approach for multiple parties to pool and unify statistical data, distributions and inferences
“Monte Carlo Fusion” (Dai, Pollock, Roberts, 2018, JAP) is a new theory providing a framework for the unification of distributed statistical analyses and inferences into a single coherent inference. This problem arises in many settings (for instance, expert elicitation, multi-view learning, distributed ‘big data’ problems etc.). Monte Carlo Fusion is the first general statistical approach which avoids any form of approximation error in obtaining the unified inference, and so has broad applicability across a number of statistical applications. A direction of particular interest for broad societal impact is in Statistical Cryptography. Considering the setting in which multiple (potentially untrusted) parties wish to securely share distributional information (for instance in insurance, banking and social media settings), Fusion methodology offers the possibility of conducting distributional sharing in such a manner that the information required to be exchanged between the parties can be secretly shared. As a consequence, a gold-standard information-theoretic security of the raw data can be achieved. So-called “Confusion”, a confidential fusion approach to statistical secret sharing, has the property that another party with unbounded compute power could not determine the secret information of any other party. Joint work with Louis Aslett, Hongsheng Dai, Gareth Roberts.
(Conference Room San Felipe)
10:00 - 10:30 Alain Durmus: Analysis of Langevin Monte-Carlo via convex optimization
We provide new insights on the Unadjusted Langevin Algorithm. We show that this method can be formulated as a first-order optimization algorithm of an objective functional defined on the Wasserstein space of order 2. Using this interpretation and techniques borrowed from convex optimization, we give a non-asymptotic analysis of this method for sampling from a log-concave smooth target distribution. Our proofs are then easily extended to Stochastic Gradient Langevin Dynamics, which is a popular extension of the Unadjusted Langevin Algorithm. Finally, this interpretation leads to a new methodology to sample from a non-smooth target distribution, for which a similar study is carried out.
(Conference Room San Felipe)
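For reference, the Unadjusted Langevin Algorithm analysed in the talk is the Euler-Maruyama discretization of the overdamped Langevin diffusion, used without a Metropolis correction. A minimal sketch on a Gaussian toy target (step size and target are illustrative):

```python
import numpy as np

def ula(grad_U, x0, step, n_iter, rng):
    """Unadjusted Langevin Algorithm: Euler-Maruyama discretization of
    dX_t = -grad U(X_t) dt + sqrt(2) dW_t, with no Metropolis correction."""
    x, chain = np.array(x0, dtype=float), []
    for _ in range(n_iter):
        x = x - step * grad_U(x) + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
        chain.append(x.copy())
    return np.array(chain)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    grad_U = lambda x: x                       # U(x) = |x|^2 / 2, i.e. a standard Gaussian target
    chain = ula(grad_U, x0=np.zeros(2), step=0.05, n_iter=20_000, rng=rng)
    # The chain targets a slightly biased stationary law; the bias shrinks with the step size.
    print("empirical variances (near 1, with a small discretization bias):", np.var(chain[1000:], axis=0))
```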
10:30 - 11:00 Coffee Break (Conference Room San Felipe)
11:00 - 13:00 Gareth O. Roberts: (Hands-on + discussion) Scaling limits for modern MCMC algorithms
The presentation will review results on infinite dimensional limits for some modern MCMC algorithms, with a particular focus on Piecewise Deterministic Markov Processes (PDMPs). The talk will also discuss the methodological consequences of these results for MCMC implementation. For certain stylised sequences of target densities and particular MCMC algorithms, limit results can be obtained as the dimension of the target diverges. For traditional (Metropolis-Hastings type) MCMC algorithms, such limits are typically (but not always) diffusions. For non-reversible alternatives such as PDMPs, similar results can be obtained, although often not of diffusion form and not even Markov. The two simplest PDMP strategies, Zig-Zag and the Bouncy Particle Sampler (BPS), can be readily analysed, with some surprising conclusions; not least that the BPS has some undesirable asymptotic reducibility properties as dimension diverges. Most of the results in this area assume stationarity, but work on the transient phase will also be at least briefly described.
(Conference Room San Felipe)
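As a concrete illustration of the PDMPs discussed in the talk (a toy sketch, not the talk's analysis), the one-dimensional Zig-Zag process for a standard Gaussian target moves with velocity ±1 and flips the velocity at rate max(0, v x); the switching times can be sampled exactly by inverting the integrated rate.

```python
import numpy as np

def zigzag_gaussian(T=10_000.0, seed=0):
    """1D Zig-Zag process targeting N(0,1): velocity flips at rate max(0, v*x),
    with event times sampled exactly by inverting the integrated rate."""
    rng = np.random.default_rng(seed)
    x, v, t = 0.0, 1.0, 0.0
    events = [(t, x, v)]
    while t < T:
        a = v * x
        E = rng.exponential()
        tau = -a + np.sqrt(max(a, 0.0) ** 2 + 2.0 * E)   # solves int_0^tau max(0, a+s) ds = E
        x += v * tau
        t += tau
        v = -v                                           # flip the velocity at the event
        events.append((t, x, v))
    return events

def time_average_second_moment(events):
    """Integrate x(t)^2 along the piecewise-linear path (should be close to 1)."""
    total, acc = 0.0, 0.0
    for (t0, x0, v0), (t1, _, _) in zip(events[:-1], events[1:]):
        dt = t1 - t0
        acc += x0**2 * dt + x0 * v0 * dt**2 + (v0**2) * dt**3 / 3.0   # int_0^dt (x0 + v0 s)^2 ds
        total += dt
    return acc / total

if __name__ == "__main__":
    print("E[x^2] estimate:", time_average_second_moment(zigzag_gaussian()))
```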
13:30 - 15:00 Lunch (Restaurant Hotel Hacienda Los Laureles)
16:00 - 16:30 Coffee Break (Conference Room San Felipe)
16:30 - 17:00 Jeremy Heng: Gibbs flow transport for Bayesian inference
In this work, we consider the construction of transport maps between two distributions using flows. In the Bayesian formalism, this ordinary differential equation approach is natural when one introduces a curve of distributions that connects the prior to posterior by tempering the likelihood. We present a novel approximation of the resulting partial differential equation which yields an ordinary differential equation whose drift depends on the full conditional distributions of the posterior. We discuss properties of the Gibbs flow and efficient implementation in practical settings when employing the flow as proposals within sequential Monte Carlo samplers. Gains over state-of-the-art methods at a fixed computational complexity will be illustrated on a variety of applications.
(Conference Room San Felipe)
17:00 - 17:30 Jonathan Weare: Fast randomized iterative numerical linear algebra for quantum chemistry and other applications
I will discuss a family of recently developed stochastic techniques for linear algebra problems involving massive matrices. These methods can be used to, for example, solve linear systems, estimate eigenvalues/vectors, and apply a matrix exponential to a vector, even in cases where the desired solution vector is too large to store. The first incarnations of this idea appear for dominant eigenproblems arising in statistical physics and in quantum chemistry and were inspired by the real-space diffusion Monte Carlo algorithm, which has been used to compute chemical ground states for small systems since the 1970s. I will discuss our own general framework for fast randomized iterative linear algebra, as well as share a (very partial) explanation for the effectiveness of these methods. I will also report on the progress of an ongoing collaboration aimed at developing fast randomized iterative schemes specifically for applications in quantum chemistry. This talk is based on joint work with Lek-Heng Lim, Timothy Berkelbach, and Sam Greene.
(Conference Room San Felipe)
17:30 - 18:00 Matt Moores: Sequential Monte Carlo for Bayesian Analysis of Spectroscopy
The spectral signature of a molecule can be predicted using a quantum-mechanical model, such as time-dependent density functional theory (TD-DFT). However, there are no uncertainty estimates associated with these predictions, and matching with peaks in observed spectra is often performed by eye. This talk introduces a model-based approach for baseline estimation and peak fitting, using TD-DFT predictions as an informative prior. The peaks are modelled as a mixture of Lorentzian, Gaussian, or pseudo-Voigt broadening functions, while the baseline is represented as a penalised cubic spline. We fit this model using a sequential Monte Carlo (SMC) algorithm, which is robust to local maxima and enables the posterior distribution to be incrementally updated as more data becomes available. We apply our method to multivariate calibration of Raman-active dye molecules, enabling us to estimate the limit of detection (LOD) of each peak. This is joint work with Mark Girolami & Jake Carson (U. Warwick), and Karen Faulds, Duncan Graham & Kirsten Gracie (U. Strathclyde).
(Conference Room San Felipe)
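For readers unfamiliar with the broadening functions named in the abstract, here is a small sketch of the peak model (a simplified form in which the Lorentzian and Gaussian components share one width parameter; all numbers are illustrative, and the penalised-spline baseline is replaced by a linear stand-in):

```python
import numpy as np

def gaussian_peak(x, loc, scale):
    """Gaussian broadening function (unit area)."""
    return np.exp(-0.5 * ((x - loc) / scale) ** 2) / (scale * np.sqrt(2 * np.pi))

def lorentzian_peak(x, loc, scale):
    """Lorentzian (Cauchy) broadening function (unit area)."""
    return scale / (np.pi * ((x - loc) ** 2 + scale ** 2))

def pseudo_voigt(x, loc, scale, eta):
    """Pseudo-Voigt: a convex mixture of Lorentzian and Gaussian shapes, eta in [0, 1]."""
    return eta * lorentzian_peak(x, loc, scale) + (1 - eta) * gaussian_peak(x, loc, scale)

def spectrum(x, peaks, baseline):
    """Additive model: baseline plus a sum of broadened peaks (amplitude, loc, scale, eta)."""
    y = baseline(x)
    for amp, loc, scale, eta in peaks:
        y = y + amp * pseudo_voigt(x, loc, scale, eta)
    return y

if __name__ == "__main__":
    x = np.linspace(0, 2000, 2001)                 # e.g. a Raman shift grid (cm^-1)
    baseline = lambda x: 1e-4 * x                  # stand-in for the penalised cubic spline
    y = spectrum(x, peaks=[(50.0, 1000.0, 15.0, 0.3), (20.0, 1350.0, 10.0, 0.7)], baseline=baseline)
    print("signal at 1000 cm^-1:", y[1000])
```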
18:00 - 18:30 Alessandra Iacobucci: Thermal transport in one dimensional chains (Conference Room San Felipe)
19:00 - 21:00 Dinner (Restaurant Hotel Hacienda Los Laureles)
Wednesday, November 14
07:30 - 09:00 Breakfast (Restaurant at your assigned hotel)
09:00 - 09:30 Jianfeng Lu: Path integral molecular dynamics
In this talk, we will discuss some recent works on sampling of the thermal average of quantum systems based on ring polymer representations. We will discuss (1) path integral molecular dynamics with surface hopping for multi-level quantum systems; (2) sampling schemes motivated by the continuum limit of the ring polymer representations. (Joint work with Zhennan Zhou.)
(Conference Room San Felipe)
09:30 - 10:00 Marcelo Pereyra: High-dimensional Bayesian inference and convex geometry: theory, methods, and algorithms
This presentation summarises some new developments in theory, methods, and algorithms for performing Bayesian inference in high-dimensional models that are log-concave, with application to mathematical and computational imaging in convex settings. These include new efficient stochastic simulation and optimisation Bayesian computation methods that tightly combine proximal convex optimisation with Markov chain Monte Carlo techniques; strategies for estimating unknown model parameters and performing model selection; methods for calculating Bayesian confidence intervals for images and performing uncertainty quantification analyses; and new theory regarding the role of convexity in maximum-a-posteriori and minimum-mean-square-error estimation. The new theory, methods, and algorithms are illustrated with a range of mathematical imaging experiments.
(Conference Room San Felipe)
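One standard way of combining proximal convex optimisation with Langevin MCMC, in the spirit of the methods summarised above, is the Moreau-Yosida regularised unadjusted Langevin algorithm (MYULA), in which the non-smooth term is replaced by its Moreau envelope. Below is a one-dimensional sketch with a Gaussian likelihood and a Laplace prior, whose proximal operator is soft-thresholding; the talk covers far more than this, and all parameters here are illustrative.

```python
import numpy as np

def soft_threshold(x, tau):
    """Proximal operator of tau * |x| (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def myula(grad_f, prox_g, x0, step, lam, n_iter, rng):
    """Moreau-Yosida regularised ULA: the non-smooth term g is replaced by its
    Moreau envelope, whose gradient is (x - prox_{lam g}(x)) / lam."""
    x, chain = np.array(x0, float), []
    for _ in range(n_iter):
        drift = grad_f(x) + (x - prox_g(x, lam)) / lam
        x = x - step * drift + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
        chain.append(x.copy())
    return np.array(chain)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    y, sigma, mu = 2.0, 1.0, 1.5                        # data, noise level, prior weight (illustrative)
    grad_f = lambda x: (x - y) / sigma**2               # smooth Gaussian likelihood term
    prox_g = lambda x, lam: soft_threshold(x, lam * mu) # prox of the Laplace prior mu*|x|
    chain = myula(grad_f, prox_g, x0=np.zeros(1), step=0.01, lam=0.01, n_iter=50_000, rng=rng)
    print("posterior mean estimate:", chain[5000:].mean())
```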
10:00 - 10:30 Kerrie Mengersen: Transferability (Conference Room San Felipe)
10:30 - 11:00 Coffee Break (Conference Room San Felipe)
11:00 - 12:30 Jonathan Mattingly: (Hands-on + discussion) Quantifying Gerrymandering by sampling the geopolitical landscape using MCMC (Conference Room San Felipe)
12:30 - 13:30 Lunch (Restaurant Hotel Hacienda Los Laureles)
13:30 - 19:00 Free Afternoon (Oaxaca)
19:00 - 21:00 Dinner (Restaurant Hotel Hacienda Los Laureles)
Thursday, November 15
07:30 - 09:00 Breakfast (Restaurant at your assigned hotel)
09:00 - 09:30 Christophe Andrieu: On the hypocoercivity of some PDMP-Monte Carlo algorithms
Monte Carlo methods based on Piecewise Deterministic Markov Processes (PDMPs) have recently received some attention. In this talk we discuss (exponential) convergence to equilibrium for a broad sub-class of PDMP-MC methods, covering Randomized Hamiltonian Monte Carlo, the Zig-Zag process and the Bouncy Particle Sampler as particular cases, establishing hypocoercivity under fairly weak conditions, with explicit bounds on the spectral gap in terms of the parameters of the dynamics. This allows us, for example, to discuss the dependence of this gap on the dimension of the problem for some classes of target distributions.
(Conference Room San Felipe)
09:30 - 10:00 Samuel Power: A Constructive Approach to PDMPs
Piecewise-Deterministic Markov Processes (PDMPs) have attracted attention in recent years as a non-reversible alternative to traditional reversible MCMC methods. By using a combination of deterministic dynamics and jump processes, these methods are often able to suppress random-walk behaviour and reach equilibrium rapidly. Although the PDMP framework accommodates a wide range of underlying dynamics in principle, existing approaches have tended to use quite simple dynamics, such as straight lines and elliptical orbits. In this work, I present a procedure which allows one to use a general dynamical system in the PDMP framework to sample from a given measure. Correctness of the procedure is established in a general setting, and specific, constructive recommendations are made for how to implement the resulting algorithms in practice.
(Conference Room San Felipe)
10:00 - 10:30 Luc Rey-Bellet: Thermodynamic formalism, functional inequalities and model-form UQ for stochastic processes.
How do you ascertain the uncertainty associated with your favorite (ergodic) Markov process? Suppose for example you are interested in computing steady-state expectations of some observable for the process. If we think of this process as an idealized (or approximate) version of the true, but unknown, process, can we obtain performance guarantees on the steady-state expectation for the true system? To do this we will investigate what good measures of "distance" between stochastic processes are, and we will use (old and) new information and concentration inequalities to obtain such performance guarantees. We will illustrate our results with molecular dynamics models.
(Conference Room San Felipe)
10:30 - 11:00 Coffee Break (Conference Room San Felipe)
11:00 - 13:00 Miranda Holmes-Cerfon: (Hands-on + discussion) Sampling with constraints (Conference Room San Felipe)
13:30 - 15:00 Lunch (Restaurant Hotel Hacienda Los Laureles)
16:00 - 16:30 Coffee Break (Conference Room San Felipe)
16:30 - 17:00 Gersende Fort: Beyond Well-Tempered Metadynamics algorithms for sampling multimodal target densities
In many situations, sampling methods are considered in order to compute expectations with respect to a distribution $\pi \, d\lambda$ on $X \subset \mathbb{R}^D$, when $\pi$ is highly multimodal. Free-energy based adaptive importance sampling techniques have been developed in the physics and chemistry literature to efficiently sample from such a target distribution. These methods are cast in the class of adaptive Markov chain Monte Carlo (MCMC) samplers: at each iteration, a sample approximating a biased distribution is drawn and the biasing strategy is learnt on the fly. As usual with importance sampling, expectations with respect to $\pi$ are obtained from a weighted mean of the samples returned by the sampler. Examples of such approaches are Wang-Landau algorithms, the Self-Healing Umbrella Sampling, adaptive biasing force methods, the metadynamics algorithm, and the well-tempered metadynamics algorithm. Nevertheless, the main drawback of most of these methods is that two antagonistic phenomena are in competition: on the one hand, a mechanism to overcome the multimodality issue, which is designed to force the sampler to visit a given set of strata of the space equally; on the other hand, the algorithm spends the same time in strata with high and low weight under $\pi \, d\lambda$, which makes the Monte Carlo approximation of expectations under $\pi \, d\lambda$ quite inefficient. We present a new algorithm, which generalizes all the examples mentioned above: this novel algorithm is designed to reduce the two antagonistic effects. We will show that the estimation of the local bias can be seen as a Stochastic Approximation algorithm with a random step-size sequence, and the sampler as an adaptive MCMC method. We will analyze its asymptotic behavior and discuss numerically the role of some design parameters. Joint work with B. Jourdain, T. Lelièvre and G. Stoltz (from ENPC, France).
(Conference Room San Felipe)
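As a point of reference for the class of algorithms the talk generalizes, here is a toy Wang-Landau-type iteration over position strata of a discretized double-well (not the new algorithm of the talk): the bias per stratum is increased at every visit, and the learning rate is halved whenever the visit histogram is roughly flat. Energies, bin counts, and tolerances are illustrative.

```python
import numpy as np

def wang_landau(n_bins=20, log_f0=1.0, flat_tol=0.2, n_sweeps=200_000, seed=0):
    """Toy Wang-Landau over strata (bins of a 1D double-well): learn a bias theta per
    bin so that the biased chain visits all bins equally; halve the learning rate
    whenever the visit histogram is sufficiently flat."""
    rng = np.random.default_rng(seed)
    grid = np.linspace(-2.0, 2.0, n_bins)
    U = 4.0 * (grid**2 - 1.0) ** 2               # double-well energies per bin
    theta = np.zeros(n_bins)                     # running bias estimate (~ free energy per bin)
    hist = np.zeros(n_bins)
    i, log_f = n_bins // 2, log_f0
    for _ in range(n_sweeps):
        j = (i + rng.choice([-1, 1])) % n_bins   # nearest-neighbour proposal (periodic for simplicity)
        log_acc = -(U[j] - U[i]) - (theta[j] - theta[i])
        if np.log(rng.random()) < log_acc:
            i = j
        theta[i] += log_f                        # penalise the currently occupied bin
        hist[i] += 1
        if hist.min() > (1 - flat_tol) * hist.mean():   # flat-histogram check
            log_f /= 2.0
            hist[:] = 0.0
    return theta - theta.min(), log_f

if __name__ == "__main__":
    theta, log_f = wang_landau()
    # Up to an additive constant, theta should track the bin energies U (the free energy),
    # so both wells and the barrier end up visited equally by the biased chain.
    print("final learning rate:", log_f)
    print("learned bias (shifted):", np.round(theta, 2))
```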
17:00 - 17:30 David P. Sanders: Monte Carlo sampling of rare events in diffusive dynamical systems
(Joint work with Diego Tapias and Eduardo G. Altmann)
(Conference Room San Felipe)
17:30 - 18:00 Anton Martinsson: Simulated Tempering Method in the Infinite Switch Limit with Adaptive Weight Learning
We discuss sampling methods based on variable temperature (simulated tempering). We show, using large deviation theory (and following the technique of [Plattner et al., JCP, 2011]), that the most efficient approach in simulated tempering is to vary the temperature infinitely rapidly over a continuous range. In this limit, we can replace the equations of motion for the temperature by averaged equations, with a rescaling of the force in the equations of motion. We give a theoretical argument for the choice of the temperature weights as the reciprocal partition function, thereby relating simulated tempering to Wang-Landau sampling. Finally, we describe a self-consistent algorithm for simultaneously sampling the canonical ensemble and learning the weights during simulation. This talk describes joint work with Jianfeng Lu, Benedict Leimkuhler and Eric Vanden-Eijnden.
(Conference Room San Felipe)
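For context, here is a plain finite-ladder simulated tempering sketch (not the infinite-switch limit of the talk): a Metropolis move in position at the current temperature alternates with a Metropolis move in the temperature index, using supplied weights. The talk's point is precisely that these weights should be learned adaptively and the switching taken to the infinite-rate limit; the double-well target and ladder below are illustrative.

```python
import numpy as np

def simulated_tempering(U, betas, log_w, n_iter=200_000, step=0.5, seed=0):
    """Finite-ladder simulated tempering: alternate a random-walk Metropolis move in x
    at the current inverse temperature with a Metropolis move in the temperature index.
    log_w should approximate log(1/Z_beta); here it is assumed to be supplied."""
    rng = np.random.default_rng(seed)
    x, k, samples = 0.0, 0, []
    for _ in range(n_iter):
        # (1) position update at inverse temperature betas[k]
        y = x + step * rng.standard_normal()
        if np.log(rng.random()) < -betas[k] * (U(y) - U(x)):
            x = y
        # (2) temperature-index update (neighbour proposal, clipped at the ends)
        j = min(max(k + rng.choice([-1, 1]), 0), len(betas) - 1)
        log_acc = -(betas[j] - betas[k]) * U(x) + (log_w[j] - log_w[k])
        if np.log(rng.random()) < log_acc:
            k = j
        if k == 0:                                  # keep only samples at the target temperature
            samples.append(x)
    return np.array(samples)

if __name__ == "__main__":
    U = lambda x: (x**2 - 1.0) ** 2 / 0.05          # bimodal double-well target
    betas = np.array([1.0, 0.5, 0.25, 0.1, 0.05])   # betas[0] is the physical temperature
    log_w = np.zeros_like(betas)                    # crude uniform weights; the talk learns these adaptively
    s = simulated_tempering(U, betas, log_w)
    print("fraction of samples in each well:", np.mean(s > 0), np.mean(s < 0))
```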
19:00 - 21:00 Dinner (Restaurant Hotel Hacienda Los Laureles)
Friday, November 16
07:30 - 09:00 Breakfast (Restaurant at your assigned hotel)
09:00 - 09:30 Andrea Agazzi: Large deviations theory for chemical reaction networks
The microscopic dynamics of well-stirred networks of chemical reactions are modeled as jump Markov processes. At large volume, one may expect in this framework to have a straightforward application of large deviation theory. This is not at all true, for the jump rates are typically neither globally Lipschitz, nor bounded away from zero, with both blowup and absorption as quite possible scenarios. In joint work with Amir Dembo and Jean-Pierre Eckmann, we utilize Lyapunov stability theory to bypass this challenge and to characterize a large class of network topologies that satisfy the full Wentzell-Freidlin theory of asymptotic rates of exit from domains of attraction. The extension of these results to wider classes of networks raises fundamental questions on the effect of adding noise to arbitrarily complex dynamical landscapes.
(Conference Room San Felipe)
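The jump Markov processes in the abstract are typically simulated with the Gillespie stochastic simulation algorithm; here is a minimal sketch for an illustrative birth-death network (unrelated to the specific networks analysed in the talk):

```python
import numpy as np

def gillespie(x0, stoich, rates, propensity, t_max, seed=0):
    """Gillespie stochastic simulation of a well-stirred reaction network.
    stoich[r] is the state change of reaction r; propensity(x, rates)[r] its jump rate."""
    rng = np.random.default_rng(seed)
    t, x = 0.0, np.array(x0, dtype=int)
    path = [(t, x.copy())]
    while t < t_max:
        a = propensity(x, rates)
        a0 = a.sum()
        if a0 == 0:                             # absorption: no reaction can fire
            break
        t += rng.exponential(1.0 / a0)          # waiting time to the next jump
        r = rng.choice(len(a), p=a / a0)        # which reaction fires
        x += stoich[r]
        path.append((t, x.copy()))
    return path

if __name__ == "__main__":
    # Toy birth-death network: 0 -> A at rate k1, A -> 0 at rate k2 * A.
    stoich = np.array([[+1], [-1]])
    propensity = lambda x, k: np.array([k[0], k[1] * x[0]], dtype=float)
    path = gillespie(x0=[0], stoich=stoich, rates=(50.0, 1.0), propensity=propensity, t_max=20.0)
    print("final copy number (mean should be near k1/k2 = 50):", path[-1][1][0])
```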
09:30 - 10:00 Igor Barahona: Computational statistical methods applied on conducting scientific literature reviews
In the last two decades the number of published articles and the amount of digital information have grown exponentially. This growth increases the complexity of performing accurate and objective literature reviews. In this talk, a novel methodology based on computational statistics is introduced for investigating datasets composed of abstracts of published articles. We address questions such as: How is vocabulary commonly used in science? What are the most relevant topics for the studied journals? Which articles are the most influential? What words do authors prefer? Two case studies, which include easy-to-read visual representations, are provided to illustrate how this methodology can answer the previous questions.
(Conference Room San Felipe)
10:00 - 10:30 Eric Vanden-Eijnden: Importance sampling with nonequilibrium trajectories
Sampling with a dynamics that breaks detailed balance poses a challenge because the steady-state probability is not typically known. In some cases, most notably in uses of the Jarzynski estimator in statistical physics, astrophysics, and machine learning, it is possible to estimate an equilibrium average using nonequilibrium dynamics. Here, we derive a generic importance sampling technique that leverages the statistical power of configurations that have been transported by nonequilibrium trajectories. Our approach can be viewed as a continuous generalization of the Jarzynski equality that can be used to compute averages with respect to arbitrary target distributions. We illustrate the properties of estimators relying on this sampling technique in the context of density of states calculations, showing that it scales favorably with dimensionality. We also demonstrate the robustness and efficiency of the approach with an application to a Bayesian model comparison problem of the type encountered in astrophysics and machine learning. This is joint work with Grant Rotskoff (https://arxiv.org/abs/1809.11132).
(Conference Room San Felipe)
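The classical Jarzynski identity that the talk generalizes can be illustrated with a dragged harmonic trap under overdamped Langevin dynamics (a toy sketch, not the talk's estimator); the exact free-energy difference is zero for this protocol, which gives the estimator a built-in check. All step sizes and trajectory counts are illustrative.

```python
import numpy as np

def jarzynski_free_energy(n_traj=5000, n_steps=200, dt=0.01, beta=1.0, seed=0):
    """Jarzynski estimator for a dragged harmonic trap U(x, lam) = (x - lam)^2 / 2,
    with lam moved from 0 to 1. The exact free-energy difference is 0, so the
    estimate -log <exp(-beta W)> / beta should be near 0 even though W > 0 on average."""
    rng = np.random.default_rng(seed)
    lam_schedule = np.linspace(0.0, 1.0, n_steps + 1)
    x = rng.standard_normal(n_traj) / np.sqrt(beta)   # equilibrium draw at lam = 0
    W = np.zeros(n_traj)
    for lam_old, lam_new in zip(lam_schedule[:-1], lam_schedule[1:]):
        # work increment: change in energy from moving the protocol at fixed x
        W += 0.5 * (x - lam_new) ** 2 - 0.5 * (x - lam_old) ** 2
        # overdamped Langevin relaxation step at the new protocol value
        x += -dt * (x - lam_new) + np.sqrt(2.0 * dt / beta) * rng.standard_normal(n_traj)
    dF_jarzynski = -np.log(np.mean(np.exp(-beta * W))) / beta
    return np.mean(W), dF_jarzynski

if __name__ == "__main__":
    mean_W, dF = jarzynski_free_energy()
    print("mean dissipated work:", mean_W, " Jarzynski estimate of dF (exact value 0):", dF)
```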
10:30 - 11:00 Coffee Break (Conference Room San Felipe)
11:00 - 12:00 Discussion and perspectives (Conference Room San Felipe)
12:00 - 14:00 Lunch (Restaurant Hotel Hacienda Los Laureles)