From 08f576c3b6a2d76d5e45187d2c5b30f6f7c85419 Mon Sep 17 00:00:00 2001
From: Virgile Andreani
Date: Tue, 4 Jun 2024 02:04:05 -0400
Subject: [PATCH] Fix broken links

---
 docs/source/404.md                          | 1 -
 docs/source/contributing/developer_guide.md | 8 ++++----
 docs/source/learn/books.md                  | 6 +++---
 3 files changed, 7 insertions(+), 8 deletions(-)

diff --git a/docs/source/404.md b/docs/source/404.md
index 889347d509c..1c13a8e1348 100644
--- a/docs/source/404.md
+++ b/docs/source/404.md
@@ -10,4 +10,3 @@ Click on the navigation bar on top of the page
 to go to the right section of the default docs, or alternatively:
 
 * Go to the current [PyMC website homepage](https://www.pymc.io/)
-* Go to the homepage of [PyMC 3.x documentation](https://www.pymc.io/projects/docs/en/v3/)
diff --git a/docs/source/contributing/developer_guide.md b/docs/source/contributing/developer_guide.md
index cc18efc61b2..c79cfa774c0 100644
--- a/docs/source/contributing/developer_guide.md
+++ b/docs/source/contributing/developer_guide.md
@@ -546,7 +546,7 @@ And the transition kernel (i.e., ``.astep()``) takes an array as input and outpu
 
 For example, see the [MH sampler](https://github.com/pymc-devs/pymc/blob/89f6fcf751774fb50016561dc448a87fba7ed3aa/pymc/step_methods/metropolis.py#L235-L289).
 This is of course very different compared to the transition kernel in e.g. TFP, which is a tenor in tensor out function.
-Moreover, transition kernels in TFP do not flatten the tensors, see eg docstring of [tensorflow\_probability/python/mcmc/random\_walk\_metropolis.py](https://github.com/tensorflow/probability/blob/master/tensorflow_probability/python/mcmc/random_walk_metropolis.py):
+Moreover, transition kernels in TFP do not flatten the tensors, see eg docstring of [tensorflow\_probability/python/mcmc/random\_walk\_metropolis.py](https://github.com/tensorflow/probability/blob/main/tensorflow_probability/python/mcmc/random_walk_metropolis.py):
 
 ```python
 new_state_fn: Python callable which takes a list of state parts and a
@@ -648,7 +648,7 @@ As in the batch random generation, we want to generate (n\_sample, ) + RV.shape
 
 In some cases, where we broadcast RV1 and RV2 to create a RV3 that has one more batch shape, we get error (even worse, wrong answer with silent error).
 The good news is, we are fixing these errors with the amazing works from [lucianopaz](https://github.com/lucianopaz) and others.
-The challenge and some summary of the solution could be found in Luciano's [blog post](https://lucianopaz.github.io/2019/08/19/pymc-shape-handling/)
+The challenge and some summary of the solution could be found in Luciano's [blog post](https://lucianopaz.github.io/2019/08/19/pymc3-shape-handling/)
 
 ```python
 with pm.Model() as m:
@@ -666,8 +666,8 @@ There are also other error related random sample generation (e.g., [Mixture is c
 
 ### Extending PyMC
 - Custom Inference method
-  - [Inferencing Linear Mixed Model with EM.ipynb](https://github.com/junpenglao/Planet_Sakaar_Data_Science/blob/master/Ports/Inferencing%20Linear%20Mixed%20Model%20with%20EM.ipynb)
-  - [Laplace approximation in pymc.ipynb](https://github.com/junpenglao/Planet_Sakaar_Data_Science/blob/master/Ports/Laplace%20approximation%20in%20pymc.ipynb)
+  - [Inferencing Linear Mixed Model with EM.ipynb](https://github.com/junpenglao/Planet_Sakaar_Data_Science/blob/main/Ports/Inferencing%20Linear%20Mixed%20Model%20with%20EM.ipynb)
+  - [Laplace approximation in pymc.ipynb](https://github.com/junpenglao/Planet_Sakaar_Data_Science/blob/main/Ports/Laplace%20approximation%20in%20pymc3.ipynb)
 - Connecting it to other library within a model
   - Using "black box" likelihood function by creating a custom PyTensor Op.
   - Using emcee
diff --git a/docs/source/learn/books.md b/docs/source/learn/books.md
index 62ff38b7c48..63151fcb8db 100644
--- a/docs/source/learn/books.md
+++ b/docs/source/learn/books.md
@@ -16,7 +16,7 @@ Hands on approach with PyMC and ArviZ focusing on the practice of applied statis
 :::
 
 :::{grid-item-card} Bayesian Methods for Hackers
-:img-top: https://camo.githubusercontent.com/4a0aca82ca82efab71747d00db30f3a68de98e82/687474703a2f2f692e696d6775722e636f6d2f36444b596250622e706e673f31
+:img-top: https://www.pearson.com/hipassets/assets/hip/images/bigcovers/0133902838.jpg
 By Cameron Davidson-Pilon
 
 The "hacker" in the title means learn-as-you-code. This hands-on introduction teaches intuitive definitions of the Bayesian approach to statistics, worklflow and decision-making by applying them using PyMC.
@@ -28,7 +28,7 @@ The "hacker" in the title means learn-as-you-code. This hands-on introduction t
 :::
 
 :::{grid-item-card} Bayesian Analysis with Python
-:img-top: https://aloctavodia.github.io/img/BAP.jpg
+:img-top: https://aloctavodia.github.io/img/BAP.png
 
 By Osvaldo Martin
 
@@ -55,7 +55,7 @@ Principled introduction to Bayesian data analysis, with practical exercises. The
 :::
 
 :::{grid-item-card} Statistical Rethinking
-:img-top: http://xcelab.net/rm/wp-content/uploads/2012/01/9781482253443-191x300.jpg
+:img-top: https://xcelab.net/rm/sr2edcover-1-187x300.png
 
 By Richard McElreath