Module: ed.inferences

Defined in edward/inferences/__init__.py.

Classes

class BiGANInference: Adversarially Learned Inference (Dumoulin et al., 2017) or Bidirectional Generative Adversarial Networks (Donahue, Krähenbühl, & Darrell, 2017).

class GANInference: Parameter estimation with GAN-style training (Goodfellow et al., 2014).
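
As an illustrative sketch (not part of this page), typical GANInference usage under Edward 1.x with TensorFlow 1.x; the toy data, the one-layer generator, and the discriminator architecture are all hypothetical:

```python
import edward as ed
import numpy as np
import tensorflow as tf
from edward.models import Normal

# Hypothetical "real" data: 100 two-dimensional points.
x_data = np.random.randn(100, 2).astype(np.float32)

# Generator: a linear map of Gaussian noise. Edward random variables
# behave like TensorFlow tensors, so they can feed into tf.layers.
eps = Normal(loc=tf.zeros([100, 2]), scale=tf.ones([100, 2]))
x = tf.layers.dense(eps, 2)

def discriminator(x):
  # Maps samples to real-valued logits; GANInference calls this on
  # both real and generated samples.
  h = tf.layers.dense(x, 16, activation=tf.nn.relu)
  return tf.layers.dense(h, 1)

inference = ed.GANInference(data={x: x_data}, discriminator=discriminator)
inference.run(n_iter=1000)
```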

class Gibbs: Gibbs sampling (Geman & Geman, 1984).
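
A minimal Gibbs sketch on a conjugate Beta-Bernoulli model, where the complete conditional of pi is tractable; the data and variable names are illustrative:

```python
import edward as ed
import numpy as np
import tensorflow as tf
from edward.models import Bernoulli, Beta, Empirical

x_data = np.array([0, 1, 0, 0, 0, 0, 0, 0, 0, 1])

# Beta-Bernoulli model; conjugacy makes the complete conditional of pi
# available in closed form, which Gibbs requires.
pi = Beta(1.0, 1.0)
x = Bernoulli(probs=pi, sample_shape=10)

# Empirical approximation storing 500 posterior samples of pi.
qpi = Empirical(params=tf.Variable(tf.zeros(500)))

inference = ed.Gibbs({pi: qpi}, data={x: x_data})
inference.run()
```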

class HMC: Hamiltonian Monte Carlo, also known as hybrid Monte Carlo (Neal, 2011).
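
A minimal HMC sketch on a toy normal-mean model, assuming Edward 1.x; the Empirical variable stores the sampler's output, and the step settings are arbitrary:

```python
import edward as ed
import numpy as np
import tensorflow as tf
from edward.models import Empirical, Normal

x_data = np.random.randn(50).astype(np.float32)

# Toy model: unknown mean with a standard normal prior.
mu = Normal(loc=0.0, scale=1.0)
x = Normal(loc=mu, scale=1.0, sample_shape=50)

# Empirical approximation holding T samples of mu.
T = 5000
qmu = Empirical(params=tf.Variable(tf.zeros(T)))

inference = ed.HMC({mu: qmu}, data={x: x_data})
inference.run(step_size=0.01, n_steps=10)
```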

class ImplicitKLqp: Variational inference with implicit probabilistic models (Tran, Ranganath, & Blei, 2017).

class Inference: Abstract base class for inference. All inference algorithms in Edward inherit from this class.

class KLpq: Variational inference with the KL divergence KL(p || q), from the posterior p to the approximation q.

class KLqp: Variational inference with the KL divergence KL(q || p), minimized by automatically selecting among black-box gradient estimators.
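
A minimal KLqp sketch on a toy normal-mean model; KLpq and the Reparameterization*/Score* variants below share this constructor-and-run pattern, differing only in objective or gradient estimator. The data and variable names are illustrative:

```python
import edward as ed
import numpy as np
import tensorflow as tf
from edward.models import Normal

x_data = np.random.randn(50).astype(np.float32)

# Toy model: mu ~ Normal(0, 1); x | mu ~ Normal(mu, 1).
mu = Normal(loc=0.0, scale=1.0)
x = Normal(loc=mu, scale=1.0, sample_shape=50)

# Variational family q(mu), with free location and (positive) scale.
qmu = Normal(loc=tf.Variable(0.0),
             scale=tf.nn.softplus(tf.Variable(0.0)))

inference = ed.KLqp({mu: qmu}, data={x: x_data})
inference.run(n_iter=1000)
```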

class Laplace: Laplace approximation (Laplace, 1986).

class MAP: Maximum a posteriori.
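
A minimal MAP sketch; passing a list of latent variables is assumed here, in which case point-mass approximations are constructed internally:

```python
import edward as ed
import numpy as np
from edward.models import Normal

x_data = np.random.randn(50).astype(np.float32)

mu = Normal(loc=0.0, scale=1.0)
x = Normal(loc=mu, scale=1.0, sample_shape=50)

# A list of latent variables lets MAP build PointMass approximations
# for them internally.
inference = ed.MAP([mu], data={x: x_data})
inference.run(n_iter=500)
```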

class MetropolisHastings: Metropolis-Hastings (Hastings, 1970; Metropolis, Rosenbluth, Rosenbluth, Teller, & Teller, 1953).
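
A minimal Metropolis-Hastings sketch with a random-walk proposal; the proposal scale of 0.5 is arbitrary, and the toy model matches the sketches above:

```python
import edward as ed
import numpy as np
import tensorflow as tf
from edward.models import Empirical, Normal

x_data = np.random.randn(50).astype(np.float32)

mu = Normal(loc=0.0, scale=1.0)
x = Normal(loc=mu, scale=1.0, sample_shape=50)

T = 5000
qmu = Empirical(params=tf.Variable(tf.zeros(T)))

# Random-walk proposal centered at the current state of mu.
proposal_mu = Normal(loc=mu, scale=0.5)

inference = ed.MetropolisHastings({mu: qmu}, {mu: proposal_mu},
                                  data={x: x_data})
inference.run()
```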

class MonteCarlo: Abstract base class for Monte Carlo. Specific Monte Carlo methods inherit from this class.

class ReparameterizationEntropyKLqp: Variational inference with the KL divergence KL(q || p), using the reparameterization gradient and an analytic entropy term.

class ReparameterizationKLKLqp: Variational inference with the KL divergence KL(q || p), using the reparameterization gradient and an analytic KL term.

class ReparameterizationKLqp: Variational inference with the KL divergence KL(q || p), using the reparameterization gradient.

class ReplicaExchangeMC: Replica Exchange MCMC (Hukushima & Nemoto, 1996; Swendsen & Wang, 1986).

class SGHMC: Stochastic gradient Hamiltonian Monte Carlo (Chen, Fox, & Guestrin, 2014).

class SGLD: Stochastic gradient Langevin dynamics (Welling & Teh, 2011).
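
A minimal SGLD sketch; SGHMC is set up the same way with its own step parameters. The data size and step size here are arbitrary:

```python
import edward as ed
import numpy as np
import tensorflow as tf
from edward.models import Empirical, Normal

x_data = np.random.randn(500).astype(np.float32)

mu = Normal(loc=0.0, scale=1.0)
x = Normal(loc=mu, scale=1.0, sample_shape=500)

T = 5000
qmu = Empirical(params=tf.Variable(tf.zeros(T)))

# SGHMC differs only in the class name and its step parameters.
inference = ed.SGLD({mu: qmu}, data={x: x_data})
inference.run(step_size=0.01)
```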

class ScoreEntropyKLqp: Variational inference with the KL divergence KL(q || p), using the score function gradient and an analytic entropy term.

class ScoreKLKLqp: Variational inference with the KL divergence KL(q || p), using the score function gradient and an analytic KL term.

class ScoreKLqp: Variational inference with the KL divergence KL(q || p), using the score function gradient.

class ScoreRBKLqp: Variational inference with the KL divergence KL(q || p), using the score function gradient with Rao-Blackwellization.

class VariationalInference: Abstract base class for variational inference. Specific variational inference methods inherit from this class.

class WGANInference: Parameter estimation with GAN-style training using the Wasserstein distance (Arjovsky, Chintala, & Bottou, 2017).

class WakeSleep: Wake-Sleep algorithm (Hinton, Dayan, Frey, & Neal, 1995).

Functions

complete_conditional(...): Returns the conditional distribution of a RandomVariable given its Markov blanket, as a new RandomVariable.
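
A minimal complete_conditional sketch on a conjugate Beta-Bernoulli model; the observed values are illustrative:

```python
import edward as ed
import numpy as np
from edward.models import Bernoulli, Beta

# Conjugate Beta-Bernoulli model.
pi = Beta(1.0, 1.0)
x = Bernoulli(probs=pi, sample_shape=10)

# The complete conditional p(pi | x) is again a Beta random variable.
pi_cond = ed.complete_conditional(pi)

# Sample from p(pi | x = data) by feeding observed values for x.
sess = ed.get_session()
print(sess.run(pi_cond, {x: np.array([0, 1, 0, 0, 0, 0, 0, 0, 0, 1])}))
```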

References

Arjovsky, M., Chintala, S., & Bottou, L. (2017). Wasserstein generative adversarial networks. In International conference on machine learning.

Chen, T., Fox, E. B., & Guestrin, C. (2014). Stochastic gradient Hamiltonian Monte Carlo. In International conference on machine learning.

Donahue, J., Krähenbühl, P., & Darrell, T. (2017). Adversarial feature learning. In International conference on learning representations.

Dumoulin, V., Belghazi, I., Poole, B., Lamb, A., Arjovsky, M., Mastropietro, O., & Courville, A. (2017). Adversarially Learned Inference. In International conference on learning representations.

Geman, S., & Geman, D. (1984). Stochastic relaxation, Gibbs distributions, and the Bayesian restoration of images. IEEE Transactions on Pattern Analysis and Machine Intelligence, 6(6), 721–741.

Goodfellow, I., Pouget-Abadie, J., Mirza, M., Xu, B., Warde-Farley, D., Ozair, S., Courville, A., & Bengio, Y. (2014). Generative adversarial nets. In Neural information processing systems.

Hastings, W. K. (1970). Monte Carlo sampling methods using Markov chains and their applications. Biometrika, 57(1), 97–109.

Hinton, G. E., Dayan, P., Frey, B. J., & Neal, R. M. (1995). The "wake-sleep" algorithm for unsupervised neural networks. Science, 268(5214), 1158–1161.

Hukushima, K., & Nemoto, K. (1996). Exchange Monte Carlo method and application to spin glass simulations. Journal of the Physical Society of Japan, 65(6), 1604–1608.

Laplace, P. S. (1986). Memoir on the probability of the causes of events. Statistical Science, 1(3), 364–378.

Metropolis, N., Rosenbluth, A. W., Rosenbluth, M. N., Teller, A. H., & Teller, E. (1953). Equation of state calculations by fast computing machines. The Journal of Chemical Physics, 21(6), 1087–1092.

Neal, R. M. (2011). MCMC using Hamiltonian dynamics. In Handbook of Markov chain Monte Carlo.

Swendsen, R. H., & Wang, J.-S. (1986). Replica Monte Carlo simulation of spin-glasses. Physical Review Letters, 57(21), 2607.

Tran, D., Ranganath, R., & Blei, D. M. (2017). Hierarchical implicit models and likelihood-free variational inference. In Neural information processing systems.

Welling, M., & Teh, Y. W. (2011). Bayesian learning via stochastic gradient Langevin dynamics. In International conference on machine learning.