Releases: blei-lab/edward
1.2.0
- Version released in sync with release of the preprint, "Deep Probabilistic Programming".
Documentation
- Website documentation and API are improved (#381, #382, #383).
- Gitter channel is added (#400).
- Added docstrings to random variables (#394).
Miscellaneous
- `copy` is disabled for Queue operations (#384).
- All `VariationalInference` methods must use `build_loss_and_gradients` (#385).
- Logging is improved for `VariationalInference` (#337).
- Fixed logging issue during inference (#391).
- Fixed `copy` function to work with lists of `RandomVariable` (#401).
- Fixed bug with Theano `NameError` during inference (#395).
Acknowledgements
- Thanks go to Gilles Boulianne (@bouliagi), Nick Foti (@nfoti), Jeremy Kerfs (@jkerfs), Alp Kucukelbir (@akucukelbir), John Pearson (@jmxpearson), and @redst4r.
We are also grateful to all who filed issues or helped resolve them, asked and answered questions, and were part of inspiring discussions.
1.1.6
1.1.5
Models
- `RandomVariable`s now accept an optional `value` argument, enabling use of random variables that don't currently have sampling, such as `Poisson` (#326).
- Documentation on model compositionality is added. [Webpage]
Inference
- Inference compositionality is added, enabling algorithms such as Expectation-Maximization and message passing (#330). [Webpage]
- Data subsampling is added, enabling proper local and global variable scaling for stochastic optimization (#327). [Webpage]
- Documentation on inference classes is added. [Webpage]
- `VariationalInference` has new defaults for a TensorFlow variable list as argument (#336).
- Type and shape checking is improved during `__init__` of `Inference`.
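The data-subsampling bullet above can be sketched generically (plain Python; this is not Edward's API, and all names below are illustrative): scaling a minibatch sum by N/M yields an unbiased estimate of the full-data sum, which is what makes proper local and global variable scaling possible for stochastic optimization.

```python
import random

# Generic sketch of the scaling idea behind data subsampling (#327);
# NOT Edward's API. An unbiased estimator of sum_i f(x_i) over N points
# scales a minibatch sum over M points by N / M.
random.seed(0)
N = 1000
data = [float(i) for i in range(N)]
full_sum = sum(data)

M = 100
estimates = []
for _ in range(200):
    batch = random.sample(data, M)          # draw a minibatch without replacement
    estimates.append((N / M) * sum(batch))  # scale up to the full-data size

avg_estimate = sum(estimates) / len(estimates)
# Averaged over many minibatches, the scaled estimate approaches full_sum.
```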
Miscellaneous
- Fixed an issue where a new Div node is created every Monte Carlo update (#318).
- Travis build is now functioning properly (#324).
- Coveralls is now functioning properly (#342).
- `tf.placeholder` can now be used instead of `ed.placeholder`.
- Website tutorials, documentation, and API are generally more polished.
- Fixed an issue where computation was incorrectly shared among inferences (#348).
- `scipy` is now an optional rather than mandatory dependency (#344).
Deprecated Features
NOTE: Several features in Edward are now deprecated (#344):
- model wrappers, including `PythonModel`, `PyMC3Model`, and `StanModel`, in favor of Edward's native language;
- the `edward.stats` module, in favor of random variables in `edward.models`;
- `MFVI`, in favor of `KLqp`;
- `ed.placeholder`, in favor of TensorFlow's `tf.placeholder`.
Edward will continue to support these features for one or two more versions; they will be removed in a future release.
Acknowledgements
- Thanks go to Alp Kucukelbir (@akucukelbir), Dawen Liang (@dawenl), John Pearson (@jmxpearson), Hayate Iso (@isohyt), Marmaduke Woodman (@maedoc), and Matthew Hoffman (@matthewdhoffman).
We are also grateful to all who filed issues or helped resolve them, asked and answered questions, and were part of inspiring discussions.
1.1.4
- Small miscellaneous bug fixes.
- Website's API and documentation pages are overhauled.
- A white paper for Edward is released [arXiv:1610.09787].
1.1.3
Models
- New random variables and methods are added (#256, #274). For example, random variables such as `Mixture`, `QuantizedDistribution`, and `WishartCholesky`, and methods such as `survival_function()`.
- Random variables and methods are now automatically generated from `tf.contrib.distributions` (#276). Edward random variables are minimal and adapt to the TensorFlow version.
Inference
Monte Carlo
- Significant infrastructure for Monte Carlo is added (#254, #255). This makes it easy to develop new Monte Carlo methods.
- Metropolis-Hastings is implemented (#255).
- Hamiltonian Monte Carlo is implemented (#269).
- Stochastic gradient Langevin dynamics is implemented (#272).
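The algorithms above share a common structure; Metropolis-Hastings, for instance, can be sketched in plain Python (illustrative only, not Edward's implementation) by sampling a standard Normal target with a random-walk proposal:

```python
import math
import random

# Generic Metropolis-Hastings sketch (the algorithm named above, NOT
# Edward's implementation): sample from a standard Normal target using
# a symmetric Normal random-walk proposal.
random.seed(0)

def log_p(x):
    # Unnormalized log-density of the standard Normal target.
    return -0.5 * x * x

x = 0.0
samples = []
for _ in range(5000):
    proposal = x + random.gauss(0.0, 1.0)
    # Accept with probability min(1, p(proposal) / p(x)).
    if math.log(random.random()) < log_p(proposal) - log_p(x):
        x = proposal
    samples.append(x)

mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
# The chain's sample mean and variance approach 0 and 1.
```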
Variational inference
- Black box-style methods are refactored internally (#249).
Documentation
- The website tutorials are placed in a directory and have clean links (#263, #264).
- Initial progress is made on iPython notebook versions of the tutorials (#261).
- The website API is revamped (#268). Everything is now LaTeX-sourced, and the Delving In page is moved to the frontpage of the API.
Miscellaneous
- Printing behavior of random variables is changed (#276).
- `edward.criticisms` is its own subpackage (#258).
- The TensorFlow dependency is now `>=0.11.0rc0` (#274).
Acknowledgements
- Thanks go to Alp Kucukelbir (@akucukelbir), Bhargav Srinivasa (@bhargavvader), and Justin Bayer (@bayerj).
We are also grateful to all who filed issues or helped resolve them, asked and answered questions, and were part of inspiring discussions.
1.1.2
Functionality
- A new modeling language is added, which exposes model structure to the user. This enables development of both model-specific and generic inference algorithms (#239).
- All of inference and criticism is updated to support the new language and also be backward-compatible with the model wrappers (#239).
Documentation
- All of the website is updated to reflect the new modeling language (#252).
- Several existing tutorials now use the modeling language instead of a model wrapper (#252).
Examples
- The `examples/` directory is restructured (#251).
- Many examples with the modeling language are added.
- Toy demonstrations of several probabilistic programming concepts are added.
Miscellaneous
- The TensorFlow dependency is now `>=0.10.0`.
- Momentum optimizer argument is fixed (#246).
1.1.1
Functionality
- The API for inference and criticism is changed. It is a more intuitive interface that allows for multiple sets of latent variables (#192).
- The API for variational models is changed (#237). The user must explicitly define the parameters that he or she wishes to train; this allows for more flexibility in how to initialize and train variational parameters.
- `edward.models` is refactored to incorporate all random variables in `tf.contrib.distributions` (#237). This speeds up computation, is more robust, and supports additional distributions and distribution methods.
- `edward.stats` is refactored to have its main internals reside in `tf.contrib.distributions` (#238). This speeds up computation, is more robust, and supports additional distributions and distribution methods.
Documentation
- All of the website is updated to reflect the new API changes.
- The contributing page is revamped.
Examples
- An inference networks tutorial is added.
- A mixture density networks tutorial is added.
Testing
- `py.test` is now the testing tool of choice.
- Code now follows all of PEP8, with the exception of two-space indenting following TensorFlow's style guide (#214, #215, #216, #217, #218, #219, #220, #221, #223, #225, #227, #228, #229, #230).
- Travis automates checking for PEP8.
- Minimal TensorBoard support is added. Specifically, one can now visualize the computational graph used during inference.
Miscellaneous
- The TensorFlow dependency is now `>=0.10.0rc0`.
- `ed.__version__` displays Edward's version.
- `ed.set_seed()` is more robust, checking to see if any random ops were created prior to setting the seed.
1.1.0
Functionality
- Three ways to read data are supported, ranging from storing data in memory within TensorFlow's computational graph, to manually feeding data, to reading data from files. (see #170)
- Support for Python 3 is added.
- The naming scheme for various attributes is made consistent. (see #162 (comment))
Documentation
- The website is given a complete overhaul, now with getting started and delving in pages, in-depth tutorials, and an API describing the design of Edward and autogenerated doc for each function in Edward. (see #149)
Examples
- Importance-weighted variational inference is added.
- Latent space model is added.
1.0.9
- There is now one data object to rule them all: a Python dictionary. (see #156)
- Distribution objects can be of arbitrary shape. For example, a 5 x 2 matrix of Normal random variables is declared with `x = Normal([5, 2])`. (see #138)
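The shape semantics above can be sketched without Edward (plain Python, illustrative only): a 5 x 2 matrix of Normal random variables corresponds to 10 independent draws arranged with that shape.

```python
import random

# Plain-Python sketch of the shape idea above (illustrative; NOT Edward's
# API): a 5 x 2 matrix of standard-Normal random variables is 10
# independent draws arranged with shape (5, 2).
random.seed(0)
rows, cols = 5, 2
x = [[random.gauss(0.0, 1.0) for _ in range(cols)] for _ in range(rows)]
```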
Documentation
- All of Edward is documented. (see #148)
- Edward now follows TensorFlow style guidelines.
- A tutorial on black box variational inference is available. (see #153)
Miscellaneous
- We now use the special functions and their automatic differentiation available in TensorFlow, e.g., `tf.lgamma`, `tf.digamma`, `tf.lbeta`.
- Sampling via NumPy/SciPy is done using a `tf.py_func` wrapper, speeding up sampling and avoiding internal overhead from the previous solution. (see #160)
- Sampling via reparameterizable distributions now follows the convention of `tf.contrib.distributions`. (see #161)
- Fixed a bug where a class copy of the `layers` object in `Variational` was made. (see #119)
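The reparameterization convention mentioned above can be sketched generically (plain Python; not Edward's code): a Normal(mu, sigma) draw is expressed as a deterministic transform of a parameter-free noise draw, so the parameters remain differentiable inputs.

```python
import random

# Generic sketch of the reparameterization trick (NOT Edward's code):
# a Normal(mu, sigma) draw is written as mu + sigma * eps with
# eps ~ Normal(0, 1), keeping mu and sigma as differentiable inputs.
random.seed(0)
mu, sigma = 2.0, 0.5
draws = [mu + sigma * random.gauss(0.0, 1.0) for _ in range(20000)]
mean = sum(draws) / len(draws)
var = sum((d - mean) ** 2 for d in draws) / len(draws)
# The sample mean and variance approach mu and sigma**2.
```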
1.0.8
- Distributions can now be specified with parameters, simplifying use of inference networks, alternative parameterizations, and much of the internals for developing new inference algorithms. (see #126)
- The TensorFlow session is now a global variable and can simply be accessed with `get_session()`. (see #117)
- Added a Laplace approximation.
- Added a utility function to calculate the Hessian of TensorFlow tensors.
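The quantity that Hessian utility computes can be sketched numerically in plain Python (central finite differences; illustrative only, not Edward's TensorFlow-based implementation):

```python
# Numerical sketch of what a Hessian utility computes (plain Python;
# Edward's utility instead operates symbolically on TensorFlow tensors).
# For f(x) = x0**2 + 3*x0*x1 the exact Hessian is [[2, 3], [3, 0]].
def numerical_hessian(f, x, h=1e-3):
    n = len(x)
    H = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            # Central difference for the mixed partial d2f / dxi dxj.
            xpp = list(x); xpp[i] += h; xpp[j] += h
            xpm = list(x); xpm[i] += h; xpm[j] -= h
            xmp = list(x); xmp[i] -= h; xmp[j] += h
            xmm = list(x); xmm[i] -= h; xmm[j] -= h
            H[i][j] = (f(xpp) - f(xpm) - f(xmp) + f(xmm)) / (4.0 * h * h)
    return H

f = lambda v: v[0] ** 2 + 3.0 * v[0] * v[1]
H = numerical_hessian(f, [1.0, 2.0])
# H is approximately [[2, 3], [3, 0]].
```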