Etalumis: Bringing Probabilistic Programming to Scientific Simulators at Scale

Atılım Güneş Baydin, Lei Shao, Wahid Bhimji, Lukas Heinrich, Lawrence F. Meadows, Jialin Liu, Andreas Munk, Saeid Naderiparizi, Bradley Gram-Hansen, Gilles Louppe, Mingfei Ma, Xiaohui Zhao, Philip Torr, Victor Lee, Kyle Cranmer, Prabhat and Frank Wood

Abstract

Probabilistic programming languages (PPLs) are receiving widespread attention for performing Bayesian inference in complex generative models. However, applications to science remain limited because of the impracticability of rewriting complex scientific simulators in a PPL, the computational cost of inference, and the lack of scalable implementations. To address these challenges, we present a novel PPL framework that couples directly to existing scientific simulators through a cross-platform probabilistic execution protocol and provides Markov chain Monte Carlo (MCMC) and deep-learning-based inference compilation (IC) engines for tractable inference. To guide IC inference, we perform distributed training of a dynamic 3DCNN–LSTM architecture with a PyTorch-MPI-based framework on 1,024 32-core CPU nodes of the Cori supercomputer with a global minibatch size of 128k, achieving a performance of 450 Tflop/s through enhancements to PyTorch. We demonstrate a Large Hadron Collider (LHC) use case with the C++ Sherpa simulator and achieve the largest-scale posterior inference in a Turing-complete PPL.
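To make the core idea concrete, the following is a minimal, self-contained sketch of probabilistic-program-style Bayesian inference: a generative model with a latent variable and an observation likelihood, queried via likelihood-weighted importance sampling. This is an illustrative toy, not the paper's framework (which couples to external simulators over a protocol and uses MCMC and learned IC proposals at scale); all function names and the Gaussian model here are assumptions chosen so the exact posterior is known.

```python
import math
import random

def sample_prior(rng):
    # Latent variable drawn from the prior p(theta) = Normal(0, 1).
    return rng.gauss(0.0, 1.0)

def log_likelihood(y, theta, sigma=0.5):
    # Observation model p(y | theta) = Normal(theta, sigma).
    return -0.5 * ((y - theta) / sigma) ** 2 - math.log(sigma * math.sqrt(2.0 * math.pi))

def posterior_mean(y, num_samples=50000, seed=0):
    # Likelihood-weighted importance sampling with the prior as proposal:
    # weight each prior sample by its likelihood, then take the weighted mean.
    rng = random.Random(seed)
    thetas = [sample_prior(rng) for _ in range(num_samples)]
    log_w = [log_likelihood(y, t) for t in thetas]
    m = max(log_w)  # subtract the max log-weight for numerical stability
    w = [math.exp(lw - m) for lw in log_w]
    return sum(wi * ti for wi, ti in zip(w, thetas)) / sum(w)

# For this conjugate Gaussian setup the exact posterior mean is
# y / (1 + sigma^2); with y = 1.0 and sigma = 0.5 that is 0.8.
print(posterior_mean(1.0))
```

In IC, the expensive part this sketch hides is the proposal: rather than sampling blindly from the prior, a neural network trained on simulator traces proposes latent values near the posterior, which is what makes inference tractable for real simulators.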

Address
New York, NY, USA
Book Title
Proceedings of the International Conference for High Performance Computing, Networking, Storage and Analysis
ISBN
9781450362290
Keywords
inference, probabilistic programming, deep learning, simulation
Location
Denver, Colorado
Publisher
Association for Computing Machinery
Series
SC ’19
Year
2019