API and Documentation

Model

A probabilistic model is a joint distribution \(p(\mathbf{x}, \mathbf{z})\) of data \(\mathbf{x}\) and latent variables \(\mathbf{z}\). For background, see the Probability Models tutorial.

In Edward, we specify models using a simple language of random variables. A random variable \(\mathbf{x}\) is an object parameterized by tensors \(\theta^*\), where the number of random variables in one object is determined by the dimensions of its parameters.

import tensorflow as tf

from edward.models import Normal, Exponential

# univariate normal
Normal(mu=tf.constant(0.0), sigma=tf.constant(1.0))
# vector of 5 univariate normals
Normal(mu=tf.zeros(5), sigma=tf.ones(5))
# 2 x 3 matrix of Exponentials
Exponential(lam=tf.ones([2, 3]))

For multivariate distributions, the multivariate dimension is the innermost (right-most) dimension of the parameters.

from edward.models import Dirichlet, MultivariateNormalFull

# K-dimensional Dirichlet
Dirichlet(alpha=tf.constant([0.1] * K))
# vector of 5 K-dimensional multivariate normals
MultivariateNormalFull(mu=tf.zeros([5, K]), sigma=...)
# 2 x 5 matrix of K-dimensional multivariate normals
MultivariateNormalFull(mu=tf.zeros([2, 5, K]), sigma=...)
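The batch/event split above can be illustrated with a minimal plain-Python sketch (not Edward's API): a hypothetical helper `split_shape` that treats the right-most dimensions of a parameter shape as the event (multivariate) dimension and the remaining dimensions as a batch of independent random variables.

```python
def split_shape(param_shape, event_ndims=1):
    """Split a parameter shape into (batch_shape, event_shape).

    The right-most `event_ndims` dimensions form the event
    (multivariate) dimension; the remaining leading dimensions
    index independent random variables in the batch.
    """
    if event_ndims == 0:
        return list(param_shape), []
    return list(param_shape[:-event_ndims]), list(param_shape[-event_ndims:])

K = 3
# K-dimensional Dirichlet: a single K-dimensional event
print(split_shape([K]))        # ([], [3])
# vector of 5 K-dimensional multivariate normals
print(split_shape([5, K]))     # ([5], [3])
# 2 x 5 matrix of K-dimensional multivariate normals
print(split_shape([2, 5, K]))  # ([2, 5], [3])
```

Scalar-event distributions such as the univariate normal correspond to `event_ndims=0`, so every parameter dimension counts toward the batch.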

Random variables are equipped with methods such as log_prob(), \(\log p(\mathbf{x}\mid\theta^*)\); mean(), \(\mathbb{E}_{p(\mathbf{x}\mid\theta^*)}[\mathbf{x}]\); and sample(), \(\mathbf{x}^*\sim p(\mathbf{x}\mid\theta^*)\). Further, each random variable is associated with a tensor \(\mathbf{x}^*\) in the computational graph, which represents a single sample \(\mathbf{x}^*\sim p(\mathbf{x}\mid\theta^*)\).

This design makes it easy to parameterize random variables with complex deterministic structure, such as deep neural networks or a diverse set of math operations, and provides compatibility with third-party libraries that also build on TensorFlow. It also enables compositions of random variables to capture complex stochastic structure: operations on a random variable act on its sample tensor \(\mathbf{x}^*\).

from edward.models import Normal

x = Normal(mu=tf.zeros(10), sigma=tf.ones(10))
y = tf.constant(5.0)
x + y, x - y, x * y, x / y
tf.tanh(x * y)
tf.gather(x, 2)  # 3rd normal rv in the vector
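The semantics above can be sketched in plain Python (an illustration only, not Edward's implementation): a toy random variable that draws a single sample on creation and exposes log_prob(), mean(), and sample().

```python
import math
import random

class ToyNormal:
    """Minimal sketch of a univariate normal random variable
    (illustration only; not Edward's implementation)."""

    def __init__(self, mu=0.0, sigma=1.0):
        self.mu, self.sigma = mu, sigma
        # like Edward, associate the object with one draw x* on creation
        self.value = self.sample()

    def log_prob(self, x):
        # log p(x | mu, sigma) for a normal density
        z = (x - self.mu) / self.sigma
        return -0.5 * z * z - math.log(self.sigma) - 0.5 * math.log(2 * math.pi)

    def mean(self):
        # E[x] under p(x | mu, sigma)
        return self.mu

    def sample(self):
        # x* ~ p(x | mu, sigma)
        return random.gauss(self.mu, self.sigma)

x = ToyNormal(mu=0.0, sigma=1.0)
print(round(x.log_prob(0.0), 4))  # -0.9189, i.e. -0.5 * log(2*pi)
```

In Edward proper, these methods are inherited from the underlying TensorFlow distribution, and the associated sample lives as a tensor in the computational graph rather than as a Python float.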

In the compositionality page, we describe how to build models by composing random variables.


For a list of random variables supported in Edward, see the TensorFlow distributions API. Edward random variables build on top of them, inheriting the same arguments and class methods. Additional methods are also available, detailed below.

class edward.models.RandomVariable(*args, **kwargs)[source]

Base class for random variables.

A random variable is an object parameterized by tensors. It is equipped with methods such as the log-density, mean, and sample.

It also wraps a tensor, where the tensor corresponds to a sample from the random variable. This enables operations on the TensorFlow graph, allowing random variables to be used in conjunction with other TensorFlow ops.

Notes

RandomVariable assumes use in a multiple inheritance setting. The child class must first inherit RandomVariable, then second inherit a class in tf.contrib.distributions. With Python’s method resolution order, this implies the following during initialization (using distributions.Bernoulli as an example):

  1. Start the __init__() of the child class, which passes all *args, **kwargs to RandomVariable.
  2. This in turn passes all *args, **kwargs to distributions.Bernoulli, completing the __init__() of distributions.Bernoulli.
  3. Complete the __init__() of RandomVariable, which calls self.sample(), relying on the method from distributions.Bernoulli.
  4. Complete the __init__() of the child class.

Methods from both RandomVariable and distributions.Bernoulli populate the namespace of the child class. Methods from RandomVariable will take higher priority if there are conflicts.
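The multiple-inheritance pattern can be sketched with toy classes (illustration only; `ToyBernoulli` is a stand-in for the real distribution class, and the deterministic sample() exists only to keep the sketch reproducible):

```python
class ToyBernoulli:
    """Stand-in for the distribution base class (illustration only)."""
    def __init__(self, p):
        self.p = p

    def sample(self):
        # deterministic "draw" so the sketch is reproducible
        return 1 if self.p >= 0.5 else 0

class RandomVariable:
    def __init__(self, *args, **kwargs):
        # steps 1-2: forward all arguments up the MRO to the
        # distribution class, completing its __init__()
        super().__init__(*args, **kwargs)
        # step 3: with the distribution initialized, draw the wrapped
        # sample, relying on sample() from the distribution class
        self._value = self.sample()

    def value(self):
        return self._value

# the child class inherits RandomVariable first, the distribution second
class Bernoulli(RandomVariable, ToyBernoulli):
    pass

x = Bernoulli(p=0.9)
print(x.value())  # 1
```

Because `RandomVariable` comes first in the MRO, its methods shadow same-named methods on the distribution class, matching the conflict rule described above.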

Examples

>>> p = tf.constant(0.5)
>>> x = Bernoulli(p=p)
>>>
>>> z1 = tf.constant([[2.0, 8.0]])
>>> z2 = tf.constant([[1.0, 2.0]])
>>> x = Bernoulli(p=tf.matmul(z1, z2, transpose_b=True))
>>>
>>> mu = Normal(mu=tf.constant(0.0), sigma=tf.constant(1.0))
>>> x = Normal(mu=mu, sigma=tf.constant(1.0))


Methods

value()[source]

Get tensor that the random variable corresponds to.

get_ancestors(collection=None)[source]

Get ancestor random variables.

get_children(collection=None)[source]

Get child random variables.

get_descendants(collection=None)[source]

Get descendant random variables.

get_parents(collection=None)[source]

Get parent random variables.

get_siblings(collection=None)[source]

Get sibling random variables.

get_variables(collection=None)[source]

Get TensorFlow variables that the random variable depends on.
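The traversal methods above walk the model's dependency structure. A minimal sketch of the idea in plain Python (illustration only; Edward discovers these edges from the TensorFlow graph rather than from explicit parent links):

```python
class Node:
    """Toy random-variable node with explicit parent links."""
    def __init__(self, name, parents=()):
        self.name = name
        self.parents = list(parents)

    def get_parents(self):
        # direct dependencies of this node
        return self.parents

    def get_ancestors(self):
        # transitive closure of get_parents()
        seen = []
        stack = list(self.parents)
        while stack:
            node = stack.pop()
            if node not in seen:
                seen.append(node)
                stack.extend(node.parents)
        return seen

# dependency chain: mu -> x -> y
mu = Node("mu")
x = Node("x", parents=[mu])
y = Node("y", parents=[x])
print([n.name for n in y.get_ancestors()])  # ['x', 'mu']
```

get_children(), get_descendants(), and get_siblings() follow the same pattern in the opposite or lateral direction over the graph.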

class edward.models.Empirical[source]

Empirical random variable, parameterized by a collection of samples.

class edward.models.PointMass[source]

Point mass random variable, placing all probability on a single value.