Code for reproducing the experiments in the paper:
G. Papamakarios and I. Murray, Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation, NeurIPS 2016. [arXiv] [bibtex]
Folder containing four subfolders, one for each demo in the paper.
- mixture_of_gaussians_demo
    - mog_main.py --- sets up the model
    - mog_abc.py --- runs ABC (approximate Bayesian computation) methods; the simplest such baseline, rejection ABC, is sketched after this list
    - mog_mdn.py --- runs MDN (mixture density network) methods
    - mog_res.py --- collects and plots results
- bayesian_linear_regression_demo
    - blr_main.py --- sets up the model
    - blr_abc.py --- runs ABC methods
    - blr_mdn.py --- runs MDN methods
    - blr_res.py --- collects and plots results
- lotka_volterra_demo
    - lv_main.py --- sets up the model
    - lv_abc.py --- runs ABC methods
    - lv_mdn.py --- runs MDN methods
    - lv_res.py --- collects and plots results
- mg1_queue_demo
    - mg1_main.py --- sets up the model
    - mg1_abc.py --- runs ABC methods
    - mg1_mdn.py --- runs MDN methods
    - mg1_res.py --- collects and plots results
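
As orientation for the `*_abc.py` scripts, here is a minimal sketch of rejection ABC, the simplest baseline they run. The toy simulator, prior, distance, and tolerance below are illustrative stand-ins, not the repository's actual interfaces.

```python
import numpy as np

def rejection_abc(simulate, prior_sample, x_obs, eps, n_samples, rng):
    """Draw approximate posterior samples by accepting parameters whose
    simulated data lie within distance eps of the observed data x_obs."""
    accepted = []
    while len(accepted) < n_samples:
        theta = prior_sample(rng)               # draw parameters from the prior
        x = simulate(theta, rng)                # simulate data given parameters
        if np.linalg.norm(x - x_obs) < eps:     # keep theta if data is close enough
            accepted.append(theta)
    return np.array(accepted)

# Illustrative toy problem (not the repo's setup): infer the mean of a
# unit-variance Gaussian from a single observation.
rng = np.random.default_rng(0)
samples = rejection_abc(
    simulate=lambda th, rng: th + rng.normal(),      # x | theta ~ N(theta, 1)
    prior_sample=lambda rng: rng.normal(0.0, 3.0),   # theta ~ N(0, 9)
    x_obs=np.array(1.5), eps=0.1, n_samples=500, rng=rng)
print(samples.mean(), samples.std())
```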
Folder with utility classes and functions.
- pdf.py --- Gaussians and mixtures of Gaussians
- NeuralNet.py --- neural nets, with and without SVI (stochastic variational inference)
- mdn.py --- MDNs, with and without SVI; the MDN training loss is sketched after this list
- DataStream.py --- provides data minibatches for training (generator sketched below)
- LossFunction.py --- loss functions for training
- StepStrategy.py --- optimization algorithms, including Adam (update rule sketched below)
- Trainer.py --- trains a neural net or MDN, with or without SVI
- MarkovJumpProcess.py --- Markov jump processes, including Lotka-Volterra (Gillespie-style simulation sketched below)
- helper.py --- various helper functions
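
The paper's central model is the mixture density network (MDN): a neural net whose outputs parameterize a Gaussian mixture over the quantity of interest, trained by maximum likelihood. Below is a minimal sketch of that training loss for a 1-D target with K components; the array layout and function names are illustrative assumptions, not mdn.py's API.

```python
import numpy as np

def log_sum_exp(a, axis):
    """Numerically stable log(sum(exp(a))) along an axis, keeping dims."""
    m = a.max(axis=axis, keepdims=True)
    return m + np.log(np.exp(a - m).sum(axis=axis, keepdims=True))

def mdn_neg_log_lik(logits, means, log_stds, theta):
    """Mean negative log-likelihood of targets theta (shape [N]) under a
    conditional 1-D Gaussian mixture with K components per data point.
    logits, means, log_stds: shape [N, K], as produced by the net's last layer."""
    log_w = logits - log_sum_exp(logits, axis=1)               # log mixing weights
    z = (theta[:, None] - means) / np.exp(log_stds)            # standardized residuals
    log_comp = -0.5 * z**2 - log_stds - 0.5 * np.log(2 * np.pi)  # per-component log-density
    return -log_sum_exp(log_w + log_comp, axis=1).mean()       # average over data points

# toy check: 100 targets, 3 components with random output-layer values
rng = np.random.default_rng(0)
N, K = 100, 3
loss = mdn_neg_log_lik(rng.normal(size=(N, K)), rng.normal(size=(N, K)),
                       rng.normal(size=(N, K)) * 0.1, rng.normal(size=N))
print(loss)
```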
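DataStream.py feeds minibatches to the trainer. A generator doing the same job might look like the following; the interface is hypothetical, not the module's actual class.

```python
import numpy as np

def minibatches(xs, ys, batch_size, rng):
    """Yield shuffled (x, y) minibatches covering one pass over the data."""
    idx = rng.permutation(len(xs))
    for start in range(0, len(xs), batch_size):
        batch = idx[start:start + batch_size]
        yield xs[batch], ys[batch]

rng = np.random.default_rng(0)
xs, ys = rng.normal(size=(10, 3)), rng.normal(size=10)
for xb, yb in minibatches(xs, ys, batch_size=4, rng=rng):
    print(xb.shape, yb.shape)   # (4, 3) (4,) ... final partial batch (2, 3) (2,)
```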
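StepStrategy.py includes Adam (Kingma & Ba, 2015). Here is a single Adam parameter update with the standard default hyperparameters, written as a free function rather than the repository's strategy class:

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update: exponential moving averages of the gradient (m) and
    its square (v) are updated, bias-corrected, and used to scale the step."""
    m = b1 * m + (1 - b1) * grad          # first moment (mean of gradients)
    v = b2 * v + (1 - b2) * grad**2       # second moment (uncentered variance)
    m_hat = m / (1 - b1**t)               # bias correction for early iterations
    v_hat = v / (1 - b2**t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

# usage: minimize f(w) = ||w||^2, whose gradient is 2w
w, m, v = np.ones(5), np.zeros(5), np.zeros(5)
for t in range(1, 2001):
    w, m, v = adam_step(w, 2 * w, m, v, t, lr=0.05)
print(np.linalg.norm(w))   # close to 0
```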
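MarkovJumpProcess.py simulates Markov jump processes such as the Lotka-Volterra predator-prey model used in the paper. Below is a minimal Gillespie-style simulation with the standard four predator-prey reactions; the function signature and the example rates are illustrative assumptions, not the module's actual class.

```python
import numpy as np

def simulate_lotka_volterra(x0, y0, rates, t_end, rng):
    """Gillespie simulation of the Lotka-Volterra Markov jump process.
    State: x predators, y prey. Reactions with rates (r1, r2, r3, r4):
    predator born (r1*x*y), predator dies (r2*x), prey born (r3*y),
    prey eaten (r4*x*y). Returns the piecewise-constant path (t, x, y)."""
    r1, r2, r3, r4 = rates
    x, y, t = x0, y0, 0.0
    path = [(t, x, y)]
    while t < t_end:
        props = np.array([r1 * x * y, r2 * x, r3 * y, r4 * x * y])  # propensities
        total = props.sum()
        if total == 0:                          # absorbed: no reaction can fire
            break
        t += rng.exponential(1.0 / total)       # waiting time to next reaction
        k = rng.choice(4, p=props / total)      # which reaction fires
        dx, dy = [(1, 0), (-1, 0), (0, 1), (0, -1)][k]
        x, y = x + dx, y + dy
        path.append((t, x, y))
    return np.array(path)

rng = np.random.default_rng(1)
path = simulate_lotka_volterra(50, 100, rates=(0.01, 0.5, 1.0, 0.01),
                               t_end=5.0, rng=rng)
print(path.shape)
```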