Commit

Deploying to gh-pages from @ fa4723f πŸš€
slinderman committed Jun 20, 2023
1 parent 5823bf2 commit 5331340
Showing 399 changed files with 21,249 additions and 24,536 deletions.
Binary file modified .doctrees/api.doctree
Binary file modified .doctrees/environment.pickle
Binary file modified .doctrees/index.doctree
Binary file modified .doctrees/notebooks/hmm/autoregressive_hmm.doctree
Binary file modified .doctrees/notebooks/hmm/casino_hmm_inference.doctree
Binary file modified .doctrees/notebooks/hmm/casino_hmm_learning.doctree
Binary file modified .doctrees/notebooks/hmm/gaussian_hmm.doctree
Binary file modified .doctrees/notebooks/linear_gaussian_ssm/kf_linreg.doctree
Binary file modified .doctrees/notebooks/linear_gaussian_ssm/kf_tracking.doctree
Binary file modified .doctrees/notebooks/linear_gaussian_ssm/lgssm_hmc.doctree
Binary file modified .doctrees/notebooks/linear_gaussian_ssm/lgssm_learning.doctree
Binary file modified .doctrees/notebooks/nonlinear_gaussian_ssm/ekf_mlp.doctree
Binary file modified .doctrees/notebooks/nonlinear_gaussian_ssm/ekf_ukf_pendulum.doctree
Binary file added .doctrees/types.doctree
540 changes: 291 additions & 249 deletions README.html

Large diffs are not rendered by default.

558 changes: 300 additions & 258 deletions _modules/dynamax/generalized_gaussian_ssm/inference.html
542 changes: 292 additions & 250 deletions _modules/dynamax/generalized_gaussian_ssm/models.html
574 changes: 310 additions & 264 deletions _modules/dynamax/hidden_markov_model/inference.html
592 changes: 317 additions & 275 deletions _modules/dynamax/hidden_markov_model/models/abstractions.html
548 changes: 295 additions & 253 deletions _modules/dynamax/hidden_markov_model/models/arhmm.html
546 changes: 294 additions & 252 deletions _modules/dynamax/hidden_markov_model/models/bernoulli_hmm.html
546 changes: 294 additions & 252 deletions _modules/dynamax/hidden_markov_model/models/categorical_glm_hmm.html
546 changes: 294 additions & 252 deletions _modules/dynamax/hidden_markov_model/models/categorical_hmm.html
496 changes: 496 additions & 0 deletions _modules/dynamax/hidden_markov_model/models/gamma_hmm.html
570 changes: 306 additions & 264 deletions _modules/dynamax/hidden_markov_model/models/gaussian_hmm.html
550 changes: 296 additions & 254 deletions _modules/dynamax/hidden_markov_model/models/gmm_hmm.html
544 changes: 293 additions & 251 deletions _modules/dynamax/hidden_markov_model/models/linreg_hmm.html
542 changes: 292 additions & 250 deletions _modules/dynamax/hidden_markov_model/models/logreg_hmm.html
542 changes: 292 additions & 250 deletions _modules/dynamax/hidden_markov_model/models/multinomial_hmm.html
544 changes: 293 additions & 251 deletions _modules/dynamax/hidden_markov_model/models/poisson_hmm.html
826 changes: 504 additions & 322 deletions _modules/dynamax/linear_gaussian_ssm/inference.html
554 changes: 298 additions & 256 deletions _modules/dynamax/linear_gaussian_ssm/models.html
653 changes: 391 additions & 262 deletions _modules/dynamax/nonlinear_gaussian_ssm/inference_ekf.html
584 changes: 322 additions & 262 deletions _modules/dynamax/nonlinear_gaussian_ssm/inference_ukf.html
542 changes: 292 additions & 250 deletions _modules/dynamax/nonlinear_gaussian_ssm/models.html
550 changes: 296 additions & 254 deletions _modules/dynamax/parameters.html
576 changes: 309 additions & 267 deletions _modules/dynamax/ssm.html
555 changes: 301 additions & 254 deletions _modules/dynamax/utils/utils.html
539 changes: 291 additions & 248 deletions _modules/index.html
File renamed without changes.
4 changes: 4 additions & 0 deletions _sources/api.rst.txt β†’ _sources/api.rst
@@ -74,6 +74,10 @@ default to weak priors without any stickiness.
:show-inheritance:
:members: initialize

+.. autoclass:: dynamax.hidden_markov_model.GammaHMM
+:show-inheritance:
+:members: initialize

.. autoclass:: dynamax.hidden_markov_model.GaussianHMM
:show-inheritance:
:members: initialize
1 change: 1 addition & 0 deletions _sources/index.rst.txt β†’ _sources/index.rst
@@ -229,6 +229,7 @@ API documentation
:maxdepth: 3
:caption: API Documentation

+types
api


@@ -628,7 +628,7 @@
"source": [
"## Conclusion\n",
"\n",
-"This notebook showed how to construct a simple categorical HMM, initialize its parameters, sample data, and use it to perform state inference (i.e. filtering, smoothing, and finding the most likely states). However, often we do not know the model paramters and instead need to estimate them from data. The next notebook shows how to do exactly that."
+"This notebook showed how to construct a simple categorical HMM, initialize its parameters, sample data, and use it to perform state inference (i.e. filtering, smoothing, and finding the most likely states). However, often we do not know the model parameters and instead need to estimate them from data. The next notebook shows how to do exactly that."
]
}
],
@@ -11,7 +11,7 @@
"This notebook continues the \"occasionally dishonest casino\" example from the preceding notebook.\n",
"There, we assumed we knew the parameters of the model: the probability of switching between fair and loaded dice and the probabilities of the different outcomes (1,...,6) for each die. \n",
"\n",
-"Here, our goal is **learn these parameters from data**. We will sample data from the model as before, but now we will estimate the parameters using eith stochastic gradient descent (SGD) or expectation-maximization (EM).\n",
+"Here, our goal is **learn these parameters from data**. We will sample data from the model as before, but now we will estimate the parameters using either stochastic gradient descent (SGD) or expectation-maximization (EM).\n",
"\n",
"The figure below shows the _graphical model_, complete with the parameter nodes.\n",
"<p align=\"center\">\n",
@@ -31,7 +31,7 @@
"\\end{align*}\n",
"The hyperparameters can be specified in the [`CategoricalHMM`](https://probml.github.io/dynamax/api.html#dynamax.hidden_markov_model.CategoricalHMM) constructor..\n",
"\n",
-"The **learning objective** is to find parameters tham maximize the marginal probability,\n",
+"The **learning objective** is to find parameters that maximize the marginal probability,\n",
"\\begin{align*}\n",
"\\theta^\\star &= \\text{arg max}_{\\theta} \\; p(\\theta \\mid y_{1:T}) \\\\\n",
"&= \\text{arg max}_{\\theta} \\; p(\\theta, y_{1:T})\n",
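The objective edited in this hunk maximizes the marginal probability of the data, which for an HMM is evaluated with the forward algorithm. As a hedged illustration (toy NumPy sketch with hypothetical parameters, not dynamax's actual JAX implementation), the quantity being maximized can be computed like this:

```python
import numpy as np

def hmm_log_marginal(initial_probs, transition_matrix, emission_probs, emissions):
    """Log marginal likelihood log p(y_{1:T}) of a categorical HMM,
    computed with the scaled forward algorithm."""
    # alpha[k] is proportional to p(z_t = k, y_{1:t})
    alpha = initial_probs * emission_probs[:, emissions[0]]
    log_marginal = 0.0
    for y in emissions[1:]:
        c = alpha.sum()                 # scaling constant, avoids underflow
        log_marginal += np.log(c)
        alpha = (alpha / c) @ transition_matrix * emission_probs[:, y]
    return log_marginal + np.log(alpha.sum())
```

A quick sanity check: with uniform initial, transition, and emission probabilities over six outcomes, the marginal likelihood of any length-T sequence is (1/6)^T.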
@@ -397,7 +397,7 @@
}
],
"source": [
-"# Print the paramters after learning\n",
+"# Print the parameters after learning\n",
"print(\"Full batch gradient descent params:\")\n",
"print_params(fbgd_params)\n",
"print(\"\")\n",
@@ -560,29 +560,29 @@
]
},
{
+"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
-"Not only does EM converge much faster on this example (here, in only a handful of iterations), it also converges to a better estimate of the parameters. Indeed, it essentially matches the loss obtained by the parameters that truly generated the data. We see that its parameter estimates are nearly the same as the true parameters."
+"Not only does EM converge much faster on this example (here, in only a handful of iterations), it also converges to a better estimate of the parameters. Indeed, it essentially matches the loss obtained by the parameters that truly generated the data. We see that its parameter estimates are nearly the same as the true parameters, up to label switching. \n",
+"\n",
+"(Label switching refers to the fact that the generated parameters assume state 1 corresponds to the loaded die, whereas the learned parameters assume this is state 0; since these solutions have the same likelihood, and since the prior is also symmetrical, there are two equally good posterior modes, and EM will just find one of them. When you compare inferred parameters or states between models, you may need to use our [find_permutation](https://probml.github.io/dynamax/api.html#dynamax.utils.utils.find_permutation) function to find the best correspondence between discrete latent labels.)"
]
},
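The relabeling idea behind `find_permutation`, referenced in the added paragraph, can be sketched in plain NumPy. This brute-force version is a hypothetical stand-in (practical only for a handful of states): it tries every permutation of the inferred labels and keeps the one that best overlaps the true labels.

```python
import itertools
import numpy as np

def best_permutation(true_states, inferred_states, num_states):
    """Brute-force sketch of label alignment: choose the relabeling of
    inferred states that maximizes agreement with the true states."""
    best_perm, best_overlap = None, -1
    for perm in itertools.permutations(range(num_states)):
        relabeled = np.array(perm)[inferred_states]   # map label k -> perm[k]
        overlap = int(np.sum(relabeled == true_states))
        if overlap > best_overlap:
            best_perm, best_overlap = perm, overlap
    return np.array(best_perm)

true_z = np.array([0, 0, 1, 1, 0])
inferred_z = np.array([1, 1, 0, 0, 1])   # same segmentation, labels flipped
perm = best_permutation(true_z, inferred_z, num_states=2)
relabeled_z = perm[inferred_z]           # agrees with true_z
```

dynamax's actual `find_permutation` solves the same matching problem efficiently; the variable names and toy sequences here are illustrative only.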
{
"cell_type": "code",
-"execution_count": 13,
+"execution_count": 1,
"metadata": {},
"outputs": [
{
-"name": "stdout",
-"output_type": "stream",
-"text": [
-"initial probs:\n",
-"[0.628 0.372]\n",
-"transition matrix:\n",
-"[[0.909 0.091]\n",
-" [0.053 0.947]]\n",
-"emission probs:\n",
-"[[0.110 0.106 0.101 0.110 0.105 0.468]\n",
-" [0.171 0.173 0.171 0.164 0.164 0.157]]\n"
+"ename": "NameError",
+"evalue": "name 'print_params' is not defined",
+"output_type": "error",
+"traceback": [
+"\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
+"\u001b[0;31mNameError\u001b[0m Traceback (most recent call last)",
+"\u001b[1;32m/Users/kpmurphy/github/dynamax/docs/notebooks/hmm/casino_hmm_learning.ipynb Cell 27\u001b[0m in \u001b[0;36m<cell line: 1>\u001b[0;34m()\u001b[0m\n\u001b[0;32m----> <a href='vscode-notebook-cell:/Users/kpmurphy/github/dynamax/docs/notebooks/hmm/casino_hmm_learning.ipynb#X35sZmlsZQ%3D%3D?line=0'>1</a>\u001b[0m print_params(em_params)\n",
+"\u001b[0;31mNameError\u001b[0m: name 'print_params' is not defined"
+]
}
],
@@ -613,7 +613,7 @@
"provenance": []
},
"kernelspec": {
-"display_name": "Python 3.9.6",
+"display_name": "Python 3",
"language": "python",
"name": "python3"
},
@@ -631,7 +631,7 @@
},
"vscode": {
"interpreter": {
-"hash": "92401d49bd83f70620e540b668b7091047c2ca041ed5ff6cc3954130de9aa1fc"
+"hash": "31f2aee4e71d21fbe5cf8b01ff0e069b9275f58929596ceb00d14d90e3e16cd6"
}
},
"widgets": {
@@ -160,7 +160,9 @@
"\n",
"# Specify parameters of the HMM\n",
"initial_probs = jnp.ones(true_num_states) / true_num_states\n",
-"transition_matrix = 0.8 * jnp.eye(true_num_states) + 0.2 * jnp.roll(jnp.eye(true_num_states), 1, axis=1)\n",
+"transition_matrix = 0.80 * jnp.eye(true_num_states) \\\n",
+" + 0.15 * jnp.roll(jnp.eye(true_num_states), 1, axis=1) \\\n",
+" + 0.05 / true_num_states\n",
"emission_means = jnp.column_stack([\n",
" jnp.cos(jnp.linspace(0, 2 * jnp.pi, true_num_states + 1))[:-1],\n",
" jnp.sin(jnp.linspace(0, 2 * jnp.pi, true_num_states + 1))[:-1],\n",
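The replacement transition matrix in this hunk mixes strong self-transitions, weaker cyclic next-state transitions, and a small uniform component. A NumPy rendering of the same expression (with a hypothetical `num_states`; the notebook uses `true_num_states`) confirms each row still sums to one:

```python
import numpy as np

num_states = 5   # hypothetical stand-in for the notebook's true_num_states
transition_matrix = (0.80 * np.eye(num_states)
                     + 0.15 * np.roll(np.eye(num_states), 1, axis=1)
                     + 0.05 / num_states)
# Each row gets 0.80 on the diagonal (stay), 0.15 one step to the right
# (advance cyclically), and 0.05 spread uniformly across all states,
# so every row sums to 0.80 + 0.15 + 0.05 = 1.
```

The small uniform term also makes every transition probability strictly positive, unlike the old 0.8/0.2 version.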
@@ -13,6 +13,7 @@
]
},
{
+"attachments": {},
"cell_type": "markdown",
"id": "17072519",
"metadata": {},
@@ -64,7 +65,7 @@
"0 & 0 & 1 & 0\n",
" \\end{pmatrix}\n",
" }_{H}\n",
-" \n",
+" \;\n",
"\\underbrace{\\begin{pmatrix} u_t\\\\ \\dot{u}_t \\\\ v_t \\\\ \\dot{v}_t \\end{pmatrix}}_{z_t} \n",
" + r_t\n",
"\\end{align*}\n",
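In the tracking model whose equation this hunk touches, the state is z_t = (u_t, u'_t, v_t, v'_t) and the observation matrix H picks out the two position coordinates. Assuming that standard position-observation form (the scraped equation only shows one row of H), a minimal NumPy sketch with hypothetical values:

```python
import numpy as np

# Observation matrix H: the latent state is (u, u_dot, v, v_dot) and only
# the positions u and v are observed, so H selects components 0 and 2.
H = np.array([[1., 0., 0., 0.],
              [0., 0., 1., 0.]])

z = np.array([2.0, 0.5, -1.0, 0.3])   # hypothetical state vector
y = H @ z                             # noiseless observation: [2.0, -1.0]
```

Adding the measurement noise r_t on top of `y` recovers the observation equation in the text.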
@@ -586,7 +587,7 @@
"provenance": []
},
"kernelspec": {
-"display_name": "Python 3.9.6 ('dynamax')",
+"display_name": "Python 3.8.10 64-bit",
"language": "python",
"name": "python3"
},
@@ -600,11 +601,11 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
-"version": "3.9.6"
+"version": "3.8.10"
},
"vscode": {
"interpreter": {
-"hash": "92401d49bd83f70620e540b668b7091047c2ca041ed5ff6cc3954130de9aa1fc"
+"hash": "31f2aee4e71d21fbe5cf8b01ff0e069b9275f58929596ceb00d14d90e3e16cd6"
}
}
},
@@ -109,8 +109,8 @@
" label=\"smoothed\" if i == 0 else None)[0]\n",
" plt.fill_between(\n",
" jnp.arange(num_timesteps),\n",
-" spc * i + smoothed_emissions[:, i] - 2 * jnp.sqrt(smoothed_emissions_std[:, i]),\n",
-" spc * i + smoothed_emissions[:, i] + 2 * jnp.sqrt(smoothed_emissions_std[:, i]),\n",
+" spc * i + smoothed_emissions[:, i] - 2 * smoothed_emissions_std[:, i],\n",
+" spc * i + smoothed_emissions[:, i] + 2 * smoothed_emissions_std[:, i],\n",
" color=ln.get_color(),\n",
" alpha=0.25,\n",
" )\n",
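The fix in this hunk drops `jnp.sqrt`: `smoothed_emissions_std` already holds standard deviations (not variances), so the roughly 95% band is simply mean plus or minus two standard deviations. A toy NumPy illustration with hypothetical values:

```python
import numpy as np

# smoothed_emissions_std holds standard deviations, so no extra square
# root is needed when forming the ~95% band around the smoothed mean.
mean = np.array([0.0, 1.0, 2.0])
std = np.array([0.5, 0.5, 1.0])
lower = mean - 2 * std   # [-1.0, 0.0, 0.0]
upper = mean + 2 * std   # [ 1.0, 2.0, 4.0]
```

Taking a square root of a standard deviation, as the old code did, would have shrunk the band whenever std < 1 and widened it whenever std > 1.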
@@ -758,7 +758,7 @@
],
"metadata": {
"kernelspec": {
-"display_name": "Python 3.9.6 ('dynamax')",
+"display_name": "base",
"language": "python",
"name": "python3"
},
@@ -772,12 +772,12 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
-"version": "3.9.6"
+"version": "3.8.5 (default, Sep 4 2020, 02:22:02) \n[Clang 10.0.0 ]"
},
"orig_nbformat": 4,
"vscode": {
"interpreter": {
-"hash": "92401d49bd83f70620e540b668b7091047c2ca041ed5ff6cc3954130de9aa1fc"
+"hash": "40d3a090f54c6569ab1632332b64b2c03c39dcf918b08424e98f38b5ae0af88f"
}
}
},
@@ -157,8 +157,8 @@
" label=\"smoothed\" if i == 0 else None)[0]\n",
" plt.fill_between(\n",
" jnp.arange(num_timesteps),\n",
-" spc * i + smoothed_emissions[:, i] - 2 * jnp.sqrt(smoothed_emissions_std[i]),\n",
-" spc * i + smoothed_emissions[:, i] + 2 * jnp.sqrt(smoothed_emissions_std[i]),\n",
+" spc * i + smoothed_emissions[:, i] - 2 * smoothed_emissions_std[i],\n",
+" spc * i + smoothed_emissions[:, i] + 2 * smoothed_emissions_std[i],\n",
" color=ln.get_color(),\n",
" alpha=0.25,\n",
" )\n",
@@ -350,7 +350,7 @@
"provenance": []
},
"kernelspec": {
-"display_name": "Python 3.9.6 ('dynamax')",
+"display_name": "Python 3",
"language": "python",
"name": "python3"
},
@@ -364,11 +364,11 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
-"version": "3.9.6"
+"version": "3.9.6 (default, Jun 29 2021, 05:25:02) \n[Clang 12.0.5 (clang-1205.0.22.9)]"
},
"vscode": {
"interpreter": {
-"hash": "92401d49bd83f70620e540b668b7091047c2ca041ed5ff6cc3954130de9aa1fc"
+"hash": "aee8b7b246df8f9039afb4144a1f6fd8d2ca17a180786b69acc140d282b71a49"
}
}
},

0 comments on commit 5331340
