
Releases: tensorflow/probability

TensorFlow Probability 0.10.0-rc0

15 Apr 19:28
db388e6
Pre-release

This is the RC0 release candidate of the TensorFlow Probability 0.10 release. It is tested against TensorFlow 2.2.0-rc3.

TensorFlow Probability 0.9

15 Jan 05:30
356cfdd

Release notes

This is the 0.9 release of TensorFlow Probability. It is tested and stable against TensorFlow version 2.1.0.

NOTE: The 0.9 releases of TensorFlow Probability will be the last to support Python 2. Future versions of TensorFlow Probability will require Python 3.5 or later.

Change notes

  • Distributions

    • Add Pixel CNN++ distribution.
    • Breaking change: Remove deprecated behavior of Poisson.rate and Poisson.log_rate.
    • Breaking change: Remove deprecated behavior of logits, probs properties.
    • Add _default_event_space_bijector to distributions.
    • Add validation that samples are within the support of the distribution.
    • Support positional and keyword args to JointDistribution.prob and JointDistribution.log_prob (see the sketch after this list).
    • Support OrderedDict dtype in JointDistributionNamed.
    • tfd.BatchReshape is tape-safe
    • More accurate survival function and CDF for the generalized Pareto distribution.
    • Added Plackett-Luce distribution over permutations.
    • Fix long-standing bug with cdf, survival_function, and quantile for TransformedDistributions having decreasing bijectors.
    • Export the DoubleMaxwell distribution.
    • Add method for analytic Bayesian linear regression with LinearOperators.
  • Bijectors

    • Breaking change: Scalar bijectors must implement _is_increasing if using cdf/survival_function/quantile on TransformedDistribution. This supports resolution of a long-standing bug, e.g. tfb.Scale(scale=-1.)(tfd.HalfNormal(0,1)).cdf was incorrect.
    • Deprecate tfb.masked_autoregressive_default_template.
    • Fixed inverse numerical stability bug in tfb.Softfloor
    • Tape-safe Reshape bijector.
  • MCMC

    • Optimize tfp.mcmc.ReplicaExchangeMC by replacing TF control flow.
    • ReplicaExchangeMC can now trace exchange proposals/acceptances.
    • Correct implementation of log_accept_ratio in NUTS
    • Return non-cumulative leapfrogs_taken in the NUTS kernel results.
    • Make unrolled NUTS reproducible.
    • Bug fix of Generalized U-turn in NUTS.
    • Reduce NUTS test flakiness.
    • Fix convergence test for NUTS.
    • Switch back to the original U-turn criterion from Hoffman & Gelman (2014).
    • Make autobatched NUTS reproducible.
  • STS

    • Update example "Structural Time Series Modeling Case Studies" to TF2.0 API.
    • Add fast path for sampling STS LocalLevel models.
    • Support posterior sampling in linear Gaussian state space models.
    • Add a fast path for Kalman smoothing with scalar latents.
    • Add option to disallow drift in STS Seasonal models.
  • Breaking change: Removed a number of functions, methods, and classes that were deprecated in TensorFlow Probability 0.8.0 or earlier.

    • Remove deprecated trainable_distributions_lib.
    • Remove deprecated property Dirichlet.total_concentration.
    • Remove deprecated tfb.AutoregressiveLayer -- use tfb.AutoregressiveNetwork.
    • Remove deprecated tfp.distributions.* methods.
    • Remove deprecated tfp.distributions.moving_mean_variance.
    • Remove two deprecated tfp.vi functions.
    • Remove deprecated tfp.distributions.SeedStream -- use tfp.util.SeedStream.
    • Remove deprecated properties of tfd.Categorical.
  • Other

    • Add make_rank_polymorphic utility, which lifts a callable to a vectorized callable.
    • Dormand-Prince solver supports nested structures. Implemented adjoint sensitivity method for Dormand-Prince solver gradients.
    • Run Travis tests against latest tf-estimator-nightly.
    • Support gast 0.3+.
    • Add tfp.vi.build_factored_surrogate_posterior utility for automatic black-box variational inference.
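
As an illustration of the new positional/keyword-argument support for JointDistribution.log_prob mentioned in the Distributions notes above, here is a minimal sketch; the toy model, component names, and parameter values are invented for this example.

    import tensorflow_probability as tfp

    tfd = tfp.distributions

    # Toy model: a scale, a location, and an observation depending on both.
    model = tfd.JointDistributionNamed(dict(
        scale=tfd.HalfNormal(1.),
        loc=tfd.Normal(0., 1.),
        x=lambda loc, scale: tfd.Normal(loc, scale),
    ))

    sample = model.sample()                # dict keyed by 'scale', 'loc', 'x'
    lp_structure = model.log_prob(sample)  # passing the structure still works
    lp_kwargs = model.log_prob(**sample)   # 0.9: keyword args are accepted too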

Huge thanks to all the contributors to this release!

  • Aditya Grover
  • Alexey Radul
  • Anudhyan Boral
  • Arthur Lui
  • Billy Lamberta
  • Brian Patton
  • Christopher Suter
  • Colemak
  • Dan Moldovan
  • Dave Moore
  • Dmitrii Kochkov
  • Edward Loper
  • Emily Fertig
  • Ian Langmore
  • Jacob Burnim
  • Joshua V. Dillon
  • Junpeng Lao
  • Katherine Wu
  • Kibeom Kim
  • Kristian Hartikainen
  • Mark Daoust
  • Pavel Sountsov
  • Peter Hawkins
  • refraction-ray
  • RJ Skerry-Ryan
  • Sanket Kamthe
  • Sergei Lebedev
  • Sharad Vikram
  • Srinivas Vasudevan
  • Yanhua Sun
  • Yash Katariya
  • Zachary Nado

TensorFlow Probability 0.8

01 Oct 15:39
b696815

Release notes

This is the 0.8 release of TensorFlow Probability. It is tested and stable against TensorFlow version 2.0.0 and 1.15.0rc1.

Change notes

  • GPU-friendly "unrolled" NUTS: tfp.mcmc.NoUTurnSampler

    • Open-source the unrolled implementation of the No U-Turn Sampler.
    • Switch back to the original U-turn criterion from Hoffman & Gelman (2014).
    • Bug fix in unrolled NUTS to make sure it does not lose shape for event_shape=1.
    • Bug fix of the U-turn check in unrolled NUTS at the tree extension.
    • Refactor the U-turn check in unrolled NUTS.
    • Fix dynamic shape bug in unrolled NUTS.
    • Move unrolled NUTS into mcmc, with additional cleanup.
    • Make sure the unrolled NUTS sampler handles scalar target_log_probs correctly.
    • Change the U-turn check in unrolled NUTS to use a tf.while_loop.
    • Implement multinomial sampling across the tree (instead of slice sampling) in unrolled NUTS.
    • Expose additional diagnostics in previous_kernel_results in unrolled NUTS so that it works with *_step_size_adaptation.
  • MCMC

    • Modify the shape handling in DualAveragingStepSizeAdaptation so that it works with non-scalar event_shape.
    • Support structured samples in tfp.monte_carlo.expectation.
    • Minor fix to the docstring example in leapfrog_integrator.
  • VI

    • Add utilities for fitting variational distributions.
    • Improve Csiszar divergence support for joint variational distributions.
    • Ensure that joint distributions are correctly recognized as reparameterizable by monte_carlo_csiszar_f_divergence.
    • Rename monte_carlo_csiszar_f_divergence to monte_carlo_variational_loss.
    • Refactor tfp.vi.csiszar_vimco_helper to expose useful leave-one-out statistical tools.
  • Distributions

    • Added tfp.distributions.GeneralizedPareto
    • Multinomial and DirichletMultinomial samplers are now reproducible.
    • HMM samples are now reproducible.
    • Clean up unneeded conversion to tensor in quantile().
    • Added support for dynamic num_steps in HiddenMarkovModel
    • Added implementation of quantile() for exponential distributions.
    • Fix entropy of Categorical distribution when logits contains -inf.
    • Annotate float-valued Deterministic distributions as reparameterized.
    • Establish patterns which ensure that TFP objects are "GradientTape Safe."
    • "GradientTape-safe" distributions: FiniteDiscrete, VonMises, Binomial, Dirichlet, Multinomial, DirichletMultinomial, Categorical, Deterministic
    • Add tfp.util.DeferredTensor to delay Tensor operations on tf.Variables (also works for tf.Tensors).
    • Add probs_parameter, logits_parameter member functions to Categorical-like distributions. Going forward, users should prefer these functions over the probs/logits properties, since a property may be None if the distribution was not parameterized with it (see the sketch after this list).
  • Bijectors

    • Add log_scale parameter to AffineScalar bijector.
    • Added tfp.bijectors.RationalQuadraticSpline.
    • Add Softfloor bijector. (Note: known inverse bug, fix in progress.)
    • Allow using an arbitrary bijector in RealNVP for the coupling.
    • Allow using an arbitrary bijector in MaskedAutoregressiveFlow for the coupling.
  • Experimental auto-batching system: tfp.experimental.auto_batching

    • Open-source the program-counter-based auto-batching system.
    • Added tfp.experimental.auto_batching, an experimental system to recover batch parallelism across recursive function invocations.
    • Autobatched NUTS supports batching across consecutive trajectories.
    • Add support for field references to autobatching.
    • Increase the amount of Python syntax that "just works" in autobatched functions.
    • Add pop-push fusion optimization to the autobatching system (tail-call optimization also landed recently but missed a release note).
    • Open-source the auto-batched implementation of the No U-Turn Sampler.
  • STS

    • Support TF2/Eager-mode fitting of STS models, and deprecate build_factored_variational_loss.
    • Use dual averaging step size adaptation for STS HMC fitting.
    • Add support for imputing missing values in structural time series models.
    • Standardize parameter scales during STS inference.
  • Layers

    • Add WeightNorm layer wrapper.
    • Fix gradients flowing through variables in the old style variational layers.
    • tf.keras.models.save_model and model.save now default to saving in the TensorFlow SavedModel format.
  • Stats/Math

    • Add calibration metrics to tfp.stats.
    • Add output_gradients argument to value_and_gradient.
    • Add Geyer initial positive sequence truncation criterion to tfp.mcmc.effective_sample_size.
    • Resolve shape inconsistencies in PSDKernels API.
    • Support dynamic-shaped results in tfp.math.minimize.
    • ODE: Implement the Adjoint Method for gradients with respect to the initial state.
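
To illustrate the probs_parameter/logits_parameter methods noted in the Distributions section above, a minimal sketch follows; the specific logits values are made up for this example.

    import tensorflow_probability as tfp

    tfd = tfp.distributions

    d = tfd.Categorical(logits=[-1.0, 0.0, 1.0])

    # The `probs` property may be None here because the distribution was
    # constructed from logits; the *_parameter() methods always return a Tensor.
    probs = d.probs_parameter()    # softmax of the logits
    logits = d.logits_parameter()  # the logits as given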

Huge thanks to all the contributors to this release!

  • Alexey Radul
  • Anudhyan Boral
  • Arthur Lui
  • Brian Patton
  • Christopher Suter
  • Colin Carroll
  • Dan Moldovan
  • Dave Moore
  • Edward Loper
  • Emily Fertig
  • Gaurav Jain
  • Ian Langmore
  • Igor Ganichev
  • Jacob Burnim
  • Jeff Pollock
  • Joshua V. Dillon
  • Junpeng Lao
  • Katherine Wu
  • Mark Daoust
  • Matthieu Coquet
  • Parsiad Azimzadeh
  • Pavel Sountsov
  • Pavithra Vijay
  • PJ Trainor
  • prabhu prakash kagitha
  • prakashkagitha
  • Reed Wanderman-Milne
  • refraction-ray
  • Rif A. Saurous
  • RJ Skerry-Ryan
  • Saurabh Saxena
  • Sharad Vikram
  • Sigrid Keydana
  • skeydan
  • Srinivas Vasudevan
  • Yash Katariya
  • Zachary Nado

TensorFlow Probability 0.8.0-rc0

30 Aug 04:27
cf6a559
Pre-release

This is the RC0 release candidate of the TensorFlow Probability 0.8 release.

It is tested against TensorFlow 2.0.0-rc0

TensorFlow Probability 0.7

20 Jun 18:10
09929ea

Release notes

This is the 0.7 release of TensorFlow Probability. It is tested and stable against TensorFlow version 1.14.0.

Change notes

  • Internal optimizations to HMC leapfrog integrator.
  • Add FeatureTransformed, FeatureScaled, and KumaraswamyTransformed PSD kernels
  • Added tfp.debugging.benchmarking.benchmark_tf_function.
  • Added optional masking of observations for hidden_markov_model methods posterior_marginals and posterior_mode.
  • Fixed evaluation order of distributions within JointDistributionNamed
  • Rename tfb.AutoregressiveLayer to tfb.AutoregressiveNetwork.
  • Support kernel and bias constraints/regularizers/initializers in tfb.AutoregressiveLayer.
  • Created Backward Difference Formula (BDF) solver for stiff ODEs.
  • Update Cumsum bijector.
  • Add distribution layer for masked autoregressive flow in Keras.
  • Shorten repr, str Distribution strings by using "?" instead of "<unknown>" to represent None.
  • Implement FiniteDiscrete distribution
  • Add Cumsum bijector.
  • Make Seasonal STS more flexible so it can handle non-constant num_steps_per_season for each season.
  • In tfb.BatchNormalization, use the Keras layer instead of the compat.v1 layer.
  • Forward kwargs in MaskedAutoregressiveFlow.
  • Added tfp.math.pivoted_cholesky for low rank preconditioning.
  • Add tfp.distributions.JointDistributionCoroutine for specifying simple directed graphical models via Python generators.
  • Complete the example notebook demonstrating multilevel modeling using TFP.
  • Remove default None initializations for Beta and LogNormal parameters.
  • Bug fix in the init method of the RationalQuadratic kernel.
  • Add Binomial.sample method.
  • Add SparseLinearRegression structural time series component.
  • Remove TFP support for KL divergence calculation with tf.compat.v1.distributions, which have been deprecated for 6 months.
  • Added tfp.math.cholesky_concat (adds columns to a Cholesky decomposition).
  • Introduce SchurComplement PSD Kernel
  • Add EllipticalSliceSampler as an experimental MCMC kernel.
  • Remove intercepting/reuse of variables created within DistributionLambda.
  • Support missing observations in structural time series models.
  • Add Keras layer for masked autoregressive flows.
  • Add code block to show recommended style of using JointDistribution.
  • Added example notebook demonstrating multilevel modeling.
  • Correctly decorate the training block in the VI part of the JointDistribution example notebook.
  • Add tfp.distributions.Sample for specifying plates in tfd.JointDistribution*.
  • Enable save/load of Keras models with DistributionLambda layers.
  • Add example notebook showing how to use JointDistributionSequential for small-to-medium Bayesian graphical models.
  • Add NaN propagation to tfp.stats.percentile.
  • Add tfp.distributions.JointDistributionSequential for specifying simple directed graphical models (see the sketch after this list).
  • Enable save/load of models with IndependentX or MixtureX layers.
  • Extend monte_carlo_csiszar_f_divergence so it also works with JointDistribution.
  • Fix typo in value_and_gradient docstring.
  • Add SimpleStepSizeAdaptation, deprecate step_size_adaptation_fn.
  • batch_interp_regular_nd_grid added to tfp.math
  • Adds IteratedSigmoidCentered bijector to unconstrain unit simplex.
  • Add option to constrain seasonal effects to zero-sum in STS models, and enable by default.
  • Add two-sample multivariate equality in distribution.
  • Fix broadcasting errors when forecasting STS models with batch shape.
  • Adds batch slicing support to most distributions in tfp.distributions.
  • Add tfp.layers.VariationalGaussianProcess.
  • Added posterior_mode to HiddenMarkovModel
  • Add VariationalGaussianProcess distribution.
  • Adds slicing of distributions batch axes as dist[..., :2, tf.newaxis, 3]
  • Add tfp.layers.VariableLayer for making a Keras model which ignores inputs.
  • tfp.math.matrix_rank.
  • Add KL divergence between two blockwise distributions.
  • tf.function decorate tfp.bijectors.
  • Add Blockwise distribution for concatenating different distribution families.
  • Add and begin using a utility for varying random seeds in tests when desired.
  • Add two-sample calibrated statistical test for equality of CDFs, incl. support for duplicate samples.
  • Deprecate the obsolete moving_mean_variance; use assign_moving_mean_variance and manage the variables explicitly.
  • Migrate Variational SGD Optimizer to TF 2.0
  • Migrate SGLD Optimizer to TF 2.0
  • TF2 migration
  • Make all MCMC tests TF2-compatible.
  • Expose HMC parameters via kernel results.
  • Implement a new version of sample_chain with optional tracing.
  • Make MCMC diagnostic tests Eager/TF2 compatible.
  • Implement Categorical to Discrete Values bijector, which maps integer x (0<=x<K) to values[x], where values is a predefined 1D tensor with size K.
  • Run dense, conv variational layer tests in eager mode.
  • Add Empirical distribution to Edward2 (already exists as a TFP distribution).
  • Ensure Gumbel distribution does not produce inf samples.
  • Hid tensor shapes from operators in HMM tests
  • Added Empirical distribution
  • Add the Blockwise bijector.
  • Add MixtureNormal and MixtureLogistic distribution layers.
  • Experimental support for implicit reparameterization gradients in MixtureSameFamily
  • Fix parameter broadcasting in DirichletMultinomial.
  • Add tfp.math.clip_by_value_preserve_gradient.
  • Rename InverseGamma rate parameter to scale, to match its semantics.
  • Added option 'input_output_cholesky' to LKJ distribution.
  • Add a semi-local linear trend STS model component.
  • Added Proximal Hessian Sparse Optimizer (a variant of Newton-Raphson).
  • find_bins(x, edges, ...) added to tfp.stats.
  • Disable explicit caching in masked_autoregressive in eager mode.
  • Add a local level STS model component.
  • Docfix: Fix constraint on valid range of reinterpreted_batch_dims for Independent.
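
A minimal sketch of the JointDistributionSequential API noted above; the two-variable model and its parameter values are purely illustrative.

    import tensorflow_probability as tfp

    tfd = tfp.distributions

    # z ~ Normal(0, 1);  x | z ~ Normal(z, 0.5)
    model = tfd.JointDistributionSequential([
        tfd.Normal(loc=0., scale=1.),
        lambda z: tfd.Normal(loc=z, scale=0.5),
    ])

    z, x = model.sample()              # samples come back as a list
    log_prob = model.log_prob([z, x])  # joint log-density of both parts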

Huge thanks to all the contributors to this release!

  • Alexey Radul
  • Anudhyan Boral
  • axch
  • Brian Patton
  • cclauss
  • Chikanaga Tomoyuki
  • Christopher Suter
  • Clive Chan
  • Dave Moore
  • Gaurav Jain
  • harrismirza
  • Harris Mirza
  • Ian Langmore
  • Jacob Burnim
  • Janosh Riebesell
  • Jeff Pollock
  • Jiri Simsa
  • joeyhaohao
  • johndebugger
  • Joshua V. Dillon
  • Juan A. Navarro Pérez
  • Junpeng Lao
  • Matej Rizman
  • Matthew O'Kelly
  • MG92
  • Nicola De Cao
  • Parsiad Azimzadeh
  • Pavel Sountsov
  • Philip Pham
  • PJ Trainor
  • Rif A. Saurous
  • Sergei Lebedev
  • Sigrid Keydana
  • Sophia Gu
  • Srinivas Vasudevan
  • ykkawana

TensorFlow Probability 0.7.0-rc0

30 May 01:24
a793b96
Pre-release

This is the 0.7.0-rc0 release of TensorFlow Probability. It is tested and stable against TensorFlow versions 1.14-rc0 and 2.0.0-alpha.

TensorFlow Probability 0.6.0

27 Feb 01:54
2650d40

Release notes

This is the 0.6 release of TensorFlow Probability. It is
tested and stable against TensorFlow version 1.13.1.

Change notes

  • Adds tfp.positive_semidefinite_kernels.RationalQuadratic
  • Support float64 in tfpl.MultivariateNormalTriL.
  • Add IndependentLogistic and IndependentPoisson distribution layers.
  • Add make_value_setter interceptor to set values of Edward2 random variables.
  • Implementation of Kalman Smoother, as a member function of LinearGaussianStateSpaceModel.
  • Bijector caching is enabled only in one direction when executing in eager mode. May cause some performance regression in eager mode if repeatedly computing forward(x) or inverse(y) with the same x or y value.
  • Handle rank-0/empty event_shape in tfpl.Independent{Bernoulli,Normal}.
  • Run additional tests in eager mode.
  • quantiles(x, n, ...) added to tfp.stats.
  • Make tensorflow_probability compatible with TensorFlow 2.0 TensorShape indexing.
  • Use scipy.special functions when testing KL divergence for Chi, Chi2.
  • Add methods to create forecasts from STS models.
  • Add a MixtureSameFamily distribution layer.
  • Add Chi distribution.
  • Fix doc typo tfp.Distribution -> tfd.Distribution.
  • Add Gumbel-Gumbel KL divergence.
  • Add HalfNormal-HalfNormal KL divergence.
  • Add Chi2-Chi2 KL divergence unit tests.
  • Add Exponential-Exponential KL divergence unit tests.
  • Add sampling test for Normal-Normal KL divergence.
  • Add an IndependentNormal distribution layer.
  • Added posterior_marginals to HiddenMarkovModel
  • Add Pareto-Pareto KL divergence.
  • Add LinearRegression component for structural time series models.
  • Add dataset ops to the graph (or create kernels in eager execution) during Python Dataset object creation instead of at Iterator creation time.
  • Text messages HMC benchmark.
  • Add example notebook encoding a switching Poisson process as an HMM for multiple changepoint detection.
  • Require num_adaptation_steps argument to make_simple_step_size_update_policy.
  • s/eight_hmc_schools/eight_schools_hmc/ in printed benchmark string.
  • Add tfp.layers.DistributionLambda to enable plumbing tfd.Distribution instances through Keras models.
  • Adding tfp.math.batch_interp_regular_1d_grid.
  • Update description of fill_triangular to include an in-depth example.
  • Enable bijector/distribution composition, e.g., tfb.Exp(tfd.Normal(0,1)) (see the sketch after this list).
  • linear and midpoint interpolation added to tfp.stats.percentile.
  • Make distributions include only the bijectors they use.
  • tfp.math.interp_regular_1d_grid added
  • tfp.stats.correlation added (Pearson correlation).
  • Update list of edward2 RVs to include recently added Distributions.
  • Density of continuous Uniform distribution includes the upper endpoint.
  • Add support for batched inputs in tfp.glm.fit_sparse.
  • interp_regular_1d_grid added to tfp.math.
  • Added HiddenMarkovModel distribution.
  • Add Student's T Process.
  • Optimize LinearGaussianStateSpaceModel by avoiding matrix ops when the observations are statically known to be scalar.
  • stddev, cholesky added to tfp.stats.
  • Add methods to fit structural time series models to data with variational inference and HMC.
  • Add Expm1 bijector (Y = Exp(X) - 1).
  • New stats namespace: covariance and variance added to tfp.stats.
  • Make all available MCMC kernels compatible with TransformedTransitionKernel.
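
A hedged sketch of the bijector/distribution composition noted above, written with the bijector's call form; the distribution and parameter values are illustrative only.

    import tensorflow_probability as tfp

    tfd = tfp.distributions
    tfb = tfp.bijectors

    # Applying a bijector to a distribution yields a TransformedDistribution;
    # exp of a standard normal gives a log-normal.
    log_normal = tfb.Exp()(tfd.Normal(loc=0., scale=1.))

    samples = log_normal.sample(3)
    log_probs = log_normal.log_prob(samples)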

Huge thanks to all the contributors to this release!

  • Adam Wood
  • Alexey Radul
  • Anudhyan Boral
  • Ashish Saxena
  • Billy Lamberta
  • Brian Patton
  • Christopher Suter
  • Cyril Chimisov
  • Dave Moore
  • Eugene Zhulenev
  • Griffin Tabor
  • Ian Langmore
  • Jacob Burnim
  • Jakub Arnold
  • Jiahao Yao
  • Jihun
  • Jiming Ye
  • Joshua V. Dillon
  • Juan A. Navarro Pérez
  • Julius Kunze
  • Julius Plenz
  • Kristian Hartikainen
  • Kyle Beauchamp
  • Matej Rizman
  • Pavel Sountsov
  • Peter Roelants
  • Rif A. Saurous
  • Rohan Jain
  • Roman Ring
  • Rui Zhao
  • Sergio Guadarrama
  • Shuhei Iitsuka
  • Shuming Hu
  • Srinivas Vasudevan
  • Tabor473
  • ValentinMouret
  • Youngwook Kim
  • Yuki Nagae

TensorFlow Probability 0.6.0-rc1

15 Feb 22:05
8128cb1
Pre-release

This is the 0.6.0-rc1 release candidate of TensorFlow Probability. It is tested against TensorFlow 1.13.0-rc2.

TensorFlow Probability 0.6.0-rc0

25 Jan 23:39
0f8b902
Pre-release

This is the RC0 release candidate of the TensorFlow Probability 0.6 release.

It is tested against TensorFlow 1.13.0-rc0

TensorFlow Probability 0.5.0

06 Nov 18:03
d1c0f64

Release notes

This is the 0.5.0 release of TensorFlow Probability. It's tested and stable against TensorFlow 1.12.

Packaging Change

As of this release, we no longer package a separate GPU-specific build. Users can select the version of TensorFlow they wish to use (CPU or GPU), and TensorFlow Probability will work with both.

As a result, we no longer explicitly list a TensorFlow dependency in our package requirements (since we can't know which version the user will want). If TFP is installed with no TensorFlow package present, or with an unsupported TensorFlow version, we will issue an ImportError at time of import.

Distributions & Bijectors

  • All Distributions have been relocated from tf.distributions to tfp.distributions (the ones in TF are deprecated and will be deleted in TF 2.0).
  • Add Triangular distribution (see the sketch after this list).
  • Add Zipf distribution.
  • Add NormalCDF Bijector.
  • Add Multivariate Student's t-distribution.
  • Add RationalQuadratic kernel.
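
A minimal sketch of the new Triangular distribution noted above; the low/high/peak values are arbitrary.

    import tensorflow_probability as tfp

    tfd = tfp.distributions

    # Triangular density on [low, high] with its mode at `peak`.
    dist = tfd.Triangular(low=0., high=10., peak=3.)

    samples = dist.sample(5)
    log_probs = dist.log_prob(samples)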

Documentation & Examples

  • Add example showing how to fit GLMM using Variational Inference.
  • Introduce Gaussian process latent variable model colab.
  • Introduce Gaussian process regression example colab
  • Add notebook showcasing GLM algorithms and deriving some results about GLMs that those algorithms leverage.

Huge thanks to all the contributors to this release!

  • Akshay Modi
  • Alexey Radul
  • Anudhyan Boral
  • Ashish Saxena
  • Ben Zinberg
  • Billy Lamberta
  • Brian Patton
  • Christopher Suter
  • Dave Moore
  • Ian Langmore
  • Joshua V. Dillon
  • Kristian Hartikainen
  • Malcolm Reynolds
  • Pavel Sountsov
  • Srinivas Vasudevan
  • Xiaojing Wang
  • Yifei Feng