
Pretty variational docs #2157


Merged 4 commits · May 9, 2017
65 changes: 38 additions & 27 deletions pymc3/variational/approximations.py
@@ -28,24 +28,25 @@ class MeanField(Approximation):
         mapping {model_variable -> local_variable (:math:`\\mu`, :math:`\\rho`)}
         Local Vars are used for Autoencoding Variational Bayes
         See (AEVB; Kingma and Welling, 2014) for details
-    model : PyMC3 model for inference
-    start : Point
+    model : :class:`Model`
+        PyMC3 model for inference
+    start : `Point`
         initial mean
-    cost_part_grad_scale : float or scalar tensor
+    cost_part_grad_scale : `scalar`
         Scaling score part of gradient can be useful near optimum for
         achieving better convergence properties. Common schedule is
         1 at the start and 0 in the end. So slow decay will be ok.
         See (Sticking the Landing; Geoffrey Roeder,
         Yuhuai Wu, David Duvenaud, 2016) for details
-    scale_cost_to_minibatch : bool, default False
-        Scale cost to minibatch instead of full dataset
+    scale_cost_to_minibatch : `bool`
+        Scale cost to minibatch instead of full dataset, default False
     seed : None or int
         leave None to use package global RandomStream or other
         valid value to create instance specific one

     References
     ----------
-    Geoffrey Roeder, Yuhuai Wu, David Duvenaud, 2016
+    - Geoffrey Roeder, Yuhuai Wu, David Duvenaud, 2016
         Sticking the Landing: A Simple Reduced-Variance Gradient for ADVI
         approximateinference.org/accepted/RoederEtAl2016.pdf
     """
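The "common schedule" described for `cost_part_grad_scale` (1 at the start, decaying toward 0) can be sketched as a plain function. This is only an illustration of one possible slow decay; the name `grad_scale_schedule` and the linear form are assumptions, not part of the pymc3 API:

```python
def grad_scale_schedule(iteration, total_iterations):
    """Linear decay from 1.0 to 0.0 -- one possible 'slow decay'
    schedule for cost_part_grad_scale (illustrative, not pymc3 API)."""
    remaining = 1.0 - iteration / total_iterations
    return max(0.0, remaining)
```

Early in optimization the score part of the gradient is kept at full weight; near convergence it is scaled out, which is the behaviour the Sticking the Landing reference motivates.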
@@ -121,10 +122,15 @@ class FullRank(Approximation):
     seed : None or int
         leave None to use package global RandomStream or other
         valid value to create instance specific one
+
+    Other Parameters
+    ----------------
+    gpu_compat : bool
+        use GPU compatible version or not

     References
     ----------
-    Geoffrey Roeder, Yuhuai Wu, David Duvenaud, 2016
+    - Geoffrey Roeder, Yuhuai Wu, David Duvenaud, 2016
         Sticking the Landing: A Simple Reduced-Variance Gradient for ADVI
         approximateinference.org/accepted/RoederEtAl2016.pdf
     """
@@ -211,17 +217,17 @@ def from_mean_field(cls, mean_field, gpu_compat=False):

         Parameters
         ----------
-        mean_field : MeanField
+        mean_field : :class:`MeanField`
             approximation to start with

-        Flags
-        -----
-        gpu_compat : bool
+        Other Parameters
+        ----------------
+        gpu_compat : `bool`
             use GPU compatible version or not

         Returns
         -------
-        FullRank
+        :class:`FullRank`
         """
         full_rank = object.__new__(cls)  # type: FullRank
         full_rank.gpu_compat = gpu_compat
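Conceptually, `from_mean_field` seeds the full-rank posterior with the mean-field solution: the diagonal of the full-rank Cholesky factor starts as the mean-field standard deviations, so the initial covariance is exactly the diagonal one. A numpy sketch of that idea; the softplus `rho`-to-sd mapping is an assumption about the parametrization, and none of these array names are pymc3 internals:

```python
import numpy as np

rho = np.array([-1.0, 0.0, 1.0])   # mean-field rho parameters (assumed parametrization)
sigma = np.log1p(np.exp(rho))      # softplus: rho -> standard deviation
L = np.diag(sigma)                 # full-rank Cholesky factor seeded from the diagonal
cov = L @ L.T                      # initially identical to the mean-field covariance
```

Off-diagonal entries of `L` start at zero and are then free to move during further optimization.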
@@ -247,15 +253,16 @@ class Empirical(Approximation):

     Parameters
     ----------
-    trace : MultiTrace
+    trace : :class:`MultiTrace`
     local_rv : dict[var->tuple]
         Experimental for Empirical Approximation
         mapping {model_variable -> local_variable (:math:`\\mu`, :math:`\\rho`)}
         Local Vars are used for Autoencoding Variational Bayes
         See (AEVB; Kingma and Welling, 2014) for details
-    scale_cost_to_minibatch : bool, default False
-        Scale cost to minibatch instead of full dataset
-    model : PyMC3 model
+    scale_cost_to_minibatch : `bool`
+        Scale cost to minibatch instead of full dataset, default False
+    model : :class:`Model`
+        PyMC3 model for inference
     seed : None or int
         leave None to use package global RandomStream or other
         valid value to create instance specific one
@@ -356,23 +363,26 @@ def from_noise(cls, size, jitter=.01, local_rv=None,

         Parameters
         ----------
-        size : number of initial particles
-        jitter : initial sd
-        local_rv : dict
+        size : `int`
+            number of initial particles
+        jitter : `float`
+            initial sd
+        local_rv : `dict`
             mapping {model_variable -> local_variable}
             Local Vars are used for Autoencoding Variational Bayes
             See (AEVB; Kingma and Welling, 2014) for details
-        start : initial point
-        model : pm.Model
-            PyMC3 Model
+        start : `Point`
+            initial point
+        model : :class:`Model`
+            PyMC3 model for inference
         seed : None or int
             leave None to use package global RandomStream or other
             valid value to create instance specific one
         kwargs : other kwargs passed to init

         Returns
         -------
-        Empirical
+        :class:`Empirical`
         """
         hist = cls(None, local_rv=local_rv, model=model, seed=seed, **kwargs)
         if start is None:
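The `from_noise` construction amounts to scattering `size` particles around the starting point with standard deviation `jitter`. A minimal numpy sketch of that step, with illustrative array names (not pymc3 internals):

```python
import numpy as np

rng = np.random.default_rng(42)
start = np.zeros(3)          # initial point (flattened parameter vector)
size, jitter = 100, 0.01     # number of initial particles, initial sd
# each row is one particle: start plus small Gaussian noise
histogram = start + jitter * rng.standard_normal((size, start.shape[0]))
```

The resulting matrix plays the role of the empirical histogram that the approximation later samples from.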
@@ -394,15 +404,16 @@ def sample_approx(approx, draws=100, include_transformed=True):

     Parameters
     ----------
-    approx : Approximation
-    draws : int
+    approx : :class:`Approximation`
+        Approximation to sample from
+    draws : `int`
         Number of random samples.
-    include_transformed : bool
+    include_transformed : `bool`
         If True, transformed variables are also sampled. Default is True.

     Returns
     -------
-    trace : pymc3.backends.base.MultiTrace
+    trace : :class:`pymc3.backends.base.MultiTrace`
         Samples drawn from variational posterior.
     """
     if not isinstance(approx, Approximation):
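For the Empirical approximation, drawing posterior samples boils down to resampling rows of the stored particle histogram with replacement. A numpy sketch of that idea, with illustrative names (not pymc3 internals):

```python
import numpy as np

rng = np.random.default_rng(0)
histogram = rng.standard_normal((500, 2))  # 500 stored particles, 2 flattened parameters
draws = 100
idx = rng.integers(0, histogram.shape[0], size=draws)  # row indices, with replacement
samples = histogram[idx]                   # draws from the variational posterior
```

The real `sample_approx` additionally maps the flat rows back to named model variables and wraps the result in a `MultiTrace`.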