Edward provides a testbed for rapid experimentation and research with probabilistic models. Here we show how to apply this process to diverse learning tasks.
Bayesian linear regression
A fundamental model for supervised learning.
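As a taste of the underlying model, here is a minimal NumPy sketch of the conjugate Bayesian linear regression posterior (a hand-rolled illustration of the math, not Edward's API; the toy data and the prior/noise precisions `alpha` and `beta` are assumptions for the example):

```python
import numpy as np

# Hypothetical toy data for illustration only.
rng = np.random.default_rng(0)
N, D = 50, 3
X = rng.normal(size=(N, D))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=N)

# With prior w ~ N(0, alpha^-1 I) and noise precision beta, the posterior
# over w is Gaussian with covariance Sigma and mean as below.
alpha, beta = 1.0, 100.0
Sigma = np.linalg.inv(alpha * np.eye(D) + beta * X.T @ X)
mean = beta * Sigma @ X.T @ y
print(mean)  # posterior mean; should lie close to true_w
```

The closed form exists only for this conjugate Gaussian case; Edward's value is that it handles the non-conjugate variants with the same modeling language.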
Gaussian process classification
Learning a distribution over functions for supervised classification.
Mixture models
Unsupervised learning by clustering data points.
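To make the clustering idea concrete, here is a minimal EM fit of a two-component 1-D Gaussian mixture in plain NumPy (a sketch of the technique, not Edward's probabilistic-programming API; the data and initial parameters are assumptions for the example):

```python
import numpy as np

# Hypothetical data: two well-separated clusters.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-3.0, 0.5, 200), rng.normal(3.0, 0.5, 200)])

mu = np.array([-1.0, 1.0])    # initial component means
sigma = np.array([1.0, 1.0])  # initial standard deviations
pi = np.array([0.5, 0.5])     # mixing weights

for _ in range(50):
    # E-step: responsibility of each component for each point
    dens = pi * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / sigma
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate parameters from responsibility-weighted data
    nk = resp.sum(axis=0)
    mu = (resp * x[:, None]).sum(axis=0) / nk
    sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    pi = nk / len(x)

print(np.sort(mu))  # recovered means approach -3 and 3
```

Edward's tutorial instead treats the mixture as a probabilistic model and infers cluster assignments with its generic inference algorithms rather than hand-written EM updates.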
Latent space models
Analyzing connectivity patterns in neural data.
Mixture density networks
A neural density estimator for solving inverse problems.
Generative adversarial networks
Building a deep generative model of MNIST digits.
Probabilistic decoder
A model of latent codes in information theory.
Inference networks
How to amortize computation for training and testing models.
Bayesian neural network
Bayesian analysis with neural networks.
If you’re interested in contributing a tutorial, check out the contributing page. For more background and notation, see the pages below.