Commit
re-execute blackbox numpy notebook (#496)
* re-execute blackbox numpy notebook

* simplify and fix final comparison
OriolAbril authored Feb 16, 2023
1 parent cccc98e commit 14e044d
Showing 2 changed files with 128 additions and 233 deletions.
322 changes: 120 additions & 202 deletions examples/case_studies/blackbox_external_likelihood_numpy.ipynb

Large diffs are not rendered by default.

39 changes: 8 additions & 31 deletions examples/case_studies/blackbox_external_likelihood_numpy.myst.md
@@ -368,7 +368,7 @@ with pm.Model() as pymodel:
     theta = pt.as_tensor_variable([m, c])
     # use a Normal distribution
-    pm.Normal("likelihood", mu=(m * x + c), sd=sigma, observed=data)
+    y = pm.Normal("likelihood", mu=(m * x + c), sigma=sigma, observed=data)
     idata = pm.sample()
@@ -412,38 +412,14 @@ az.plot_pair(idata, **pair_kwargs, ax=ax);
We can now check that the gradient Op works as expected. First, just create and call the `LogLikeGrad` class, which should return the gradient directly (note that we have to create a [PyTensor function](http://deeplearning.net/software/pytensor/library/compile/function.html) to convert the output of the Op to an array). Secondly, we call the gradient from `LogLikeWithGrad` by using the [PyTensor tensor gradient](http://deeplearning.net/software/pytensor/library/gradient.html#pytensor.gradient.grad) function. Finally, we will check the gradient returned by the PyMC model for a Normal distribution, which should be the same as the log-likelihood function we defined. In all cases we evaluate the gradients at the true values of the model function (the straight line) that was created.

```{code-cell} ipython3
-# test the gradient Op by direct call
-pytensor.config.compute_test_value = "ignore"
-pytensor.config.exception_verbosity = "high"
+ip = pymodel.initial_point()
+print(f"Evaluating dlogp of model at point\n {ip}")
-var = pt.dvector()
-test_grad_op = LogLikeGrad(data, x, sigma)
-test_grad_op_func = pytensor.function([var], test_grad_op(var))
-grad_vals = test_grad_op_func([mtrue, ctrue])
+grad_vals_custom = opmodel.compile_dlogp()(ip)
+grad_vals_pymc = pymodel.compile_dlogp()(ip)
-print(f'Gradient returned by "LogLikeGrad": {grad_vals}')
-# test the gradient called through LogLikeWithGrad
-test_gradded_op = LogLikeWithGrad(my_loglike, data, x, sigma)
-test_gradded_op_grad = pt.grad(test_gradded_op(var), var)
-test_gradded_op_grad_func = pytensor.function([var], test_gradded_op_grad)
-grad_vals_2 = test_gradded_op_grad_func([mtrue, ctrue])
-print(f'Gradient returned by "LogLikeWithGrad": {grad_vals_2}')
-# test the gradient that PyMC uses for the Normal log likelihood
-test_model = pm.Model()
-with test_model:
-    m = pm.Uniform("m", lower=-10.0, upper=10.0)
-    c = pm.Uniform("c", lower=-10.0, upper=10.0)
-    pm.Normal("likelihood", mu=(m * x + c), sigma=sigma, observed=data)
-gradfunc = test_model.logp_dlogp_function([m, c], dtype=None)
-gradfunc.set_extra_values({"m_interval__": mtrue, "c_interval__": ctrue})
-grad_vals_pymc = gradfunc(np.array([mtrue, ctrue]))[1]  # get dlogp values
-print(f'Gradient returned by PyMC "Normal" distribution: {grad_vals_pymc}')
+print(f'\nGradient of model using a custom "LogLikeWithGrad":\n {grad_vals_custom}')
+print(f'Gradient of model using a PyMC "Normal" distribution:\n {grad_vals_pymc}')
```
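The simplified comparison via `compile_dlogp` can also be cross-checked independently with central finite differences in plain numpy. Below is a minimal, self-contained sketch of that idea, re-defining an illustrative straight-line model and Gaussian log-likelihood in the spirit of the notebook (these re-definitions and the seed are assumptions for this sketch, not the notebook's own objects):

```python
import numpy as np


def my_model(theta, x):
    # straight line: theta = [m, c]
    m, c = theta
    return m * x + c


def my_loglike(theta, x, data, sigma):
    # Gaussian log-likelihood, up to an additive constant
    model = my_model(theta, x)
    return -0.5 * np.sum(((data - model) / sigma) ** 2)


def analytic_dlogp(theta, x, data, sigma):
    # exact d/dm and d/dc of the log-likelihood above
    resid = (data - my_model(theta, x)) / sigma**2
    return np.array([np.sum(resid * x), np.sum(resid)])


def numeric_dlogp(theta, x, data, sigma, eps=1e-6):
    # central finite differences, one parameter at a time
    theta = np.asarray(theta, dtype=float)
    grad = np.empty_like(theta)
    for i in range(theta.size):
        up, down = theta.copy(), theta.copy()
        up[i] += eps
        down[i] -= eps
        grad[i] = (
            my_loglike(up, x, data, sigma) - my_loglike(down, x, data, sigma)
        ) / (2 * eps)
    return grad


rng = np.random.default_rng(716743)
mtrue, ctrue, sigma = 0.4, 3.0, 1.0
x = np.linspace(0.0, 9.0, 50)
data = my_model([mtrue, ctrue], x) + rng.normal(scale=sigma, size=x.size)

print(analytic_dlogp([mtrue, ctrue], x, data, sigma))
print(numeric_dlogp([mtrue, ctrue], x, data, sigma))
```

The two printed gradients should agree to several decimal places; a large discrepancy would indicate a bug in either the log-likelihood or its hand-coded gradient.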

We could also do some profiling to compare performance between implementations. The {ref}`profiling` notebook shows how to do it.
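For a quick standalone timing (short of the full profiling workflow the {ref}`profiling` notebook covers), Python's `timeit` module is enough to get a per-call cost for the numpy log-likelihood. This sketch assumes an illustrative re-definition of `my_loglike` and noiseless data, chosen only so the example is self-contained:

```python
import timeit

import numpy as np


def my_loglike(theta, x, data, sigma):
    # numpy version of the "black box" log-likelihood (up to a constant)
    m, c = theta
    return -0.5 * np.sum(((data - (m * x + c)) / sigma) ** 2)


x = np.linspace(0.0, 9.0, 50)
data = 0.4 * x + 3.0  # noiseless data, for timing purposes only

n_calls = 10_000
total = timeit.timeit(lambda: my_loglike([0.4, 3.0], x, data, 1.0), number=n_calls)
print(f"{total / n_calls * 1e6:.2f} us per my_loglike call")
```

Timing both the custom Op's logp and PyMC's built-in Normal this way gives a rough sense of the overhead a black-box likelihood adds per evaluation.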
@@ -454,6 +430,7 @@ We could also do some profiling to compare performance between implementations.

* Adapted from [Jørgen Midtbø](https://github.com/jorgenem/)'s [example](https://discourse.pymc.io/t/connecting-pymc-to-external-code-help-with-understanding-pytensor-custom-ops/670) by Matt Pitkin both as a [blogpost](http://mattpitkin.github.io/samplers-demo/pages/pymc-blackbox-likelihood/) and as an example notebook to this gallery in August, 2018 ([pymc#3169](https://github.com/pymc-devs/pymc/pull/3169) and [pymc#3177](https://github.com/pymc-devs/pymc/pull/3177))
* Updated by [Oriol Abril](https://github.com/OriolAbril) on December 2021 to drop the Cython dependency from the original notebook and use numpy instead ([pymc-examples#28](https://github.com/pymc-devs/pymc-examples/pull/28))
+* Re-executed by Oriol Abril with pymc 5.0.0 ([pymc-examples#496](https://github.com/pymc-devs/pymc-examples/pull/496))

+++

