@@ -134,7 +134,7 @@ therefore can "focus on what matters" at the current optimization step. In simpl
Compute the gradient of the MLP w.r.t. the optimization parameters, and use this gradient to perform an (Adam-) gradient descent step.
-We provide a Colab Notebook that shows a simple example and how ZeroGrads outperforms its competitors.
+We provide a Colab Notebook that shows a simple example and how ZeroGrads outperforms its competitors.
@@ -185,14 +185,14 @@ previously published gradient estimator PRDPT, here denoted
We use a weighted mixture of MSE and KLD as the training loss and backpropagate this loss through the non-diff. spline
renderer (i.e., from the rendered spline to the VAE weights) using our proposed method ZeroGrads. The digits below
are samples from the latent space of the trained VAE, in a variety of styles, which are easily editable in post-processing
- thanks to the spline formulation.
+ thanks to the spline formulation.
+
-
The magic that makes our approach scale to high dimensions is the surrogate's hysteresis, which reduces the gradient
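To make the loop in this hunk concrete — fit a local surrogate to the black-box loss, then take an Adam step on the surrogate's gradient — here is a minimal numpy sketch. It is an illustration only, not the authors' implementation: it substitutes a refitted linear surrogate for the persistent MLP of the paper (so it omits the hysteresis discussed above), and the toy objective, sampling radius, and Adam constants are all assumptions.

```python
import numpy as np

def f(theta):
    # Illustrative non-differentiable black-box objective: a quadratic
    # with its inputs snapped to a 0.1 grid, so autograd would fail here.
    return np.sum((np.round(theta * 10) / 10 - 2.0) ** 2)

rng = np.random.default_rng(0)
dim, lr, b1, b2, eps = 4, 0.1, 0.9, 0.999, 1e-8
theta = np.zeros(dim)
m, v = np.zeros(dim), np.zeros(dim)

for t in range(1, 301):
    # 1) Sample the objective around the current parameters.
    pts = theta + 0.1 * rng.standard_normal((32, dim))
    vals = np.array([f(p) for p in pts])
    # 2) Fit a local linear surrogate f(x) ~ w.x + b by least squares;
    #    its slope w serves as the surrogate gradient at theta.
    A = np.hstack([pts, np.ones((32, 1))])
    w = np.linalg.lstsq(A, vals, rcond=None)[0][:dim]
    # 3) Standard Adam step on theta using the surrogate gradient.
    m = b1 * m + (1 - b1) * w
    v = b2 * v + (1 - b2) * w ** 2
    theta -= lr * (m / (1 - b1 ** t)) / (np.sqrt(v / (1 - b2 ** t)) + eps)
```

Despite never differentiating `f` itself, the loop drives `theta` toward the minimizer, since the least-squares slope approximates the gradient of a smoothed version of the objective.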
diff --git a/_site/feed.xml b/_site/feed.xml
index 8bb78c9..b3221d6 100644
--- a/_site/feed.xml
+++ b/_site/feed.xml
@@ -1,4 +1,4 @@
-Jekyll2024-05-13T11:04:51+01:00http://localhost:4000/feed.xmlMichael FischerMichael Fischer, PhD Student in AI and Computer Graphics at University College London (UCL).Michael FischerWelcome to Jekyll!2019-04-18T20:34:30+01:002019-04-18T20:34:30+01:00http://localhost:4000/blog/welcome-to-jekyll<p>You’ll find this post in your <code class="language-plaintext highlighter-rouge">_posts</code> directory. Go ahead and edit it and re-build the site to see your changes. You can rebuild the site in many different ways, but the most common way is to run <code class="language-plaintext highlighter-rouge">jekyll serve</code>, which launches a web server and auto-regenerates your site when a file is updated.</p>
+Jekyll2024-05-13T12:47:17+01:00http://localhost:4000/feed.xmlMichael FischerMichael Fischer, PhD Student in AI and Computer Graphics at University College London (UCL).Michael FischerWelcome to Jekyll!2019-04-18T20:34:30+01:002019-04-18T20:34:30+01:00http://localhost:4000/blog/welcome-to-jekyll<p>You’ll find this post in your <code class="language-plaintext highlighter-rouge">_posts</code> directory. Go ahead and edit it and re-build the site to see your changes. You can rebuild the site in many different ways, but the most common way is to run <code class="language-plaintext highlighter-rouge">jekyll serve</code>, which launches a web server and auto-regenerates your site when a file is updated.</p>
<p>To add new posts, simply add a file in the <code class="language-plaintext highlighter-rouge">_posts</code> directory that follows the convention <code class="language-plaintext highlighter-rouge">YYYY-MM-DD-name-of-post.ext</code> and includes the necessary front matter. Take a look at the source for this post to get an idea about how it works.</p>
diff --git a/_site/zerograds/index.html b/_site/zerograds/index.html
index 549fb58..c699c5c 100644
--- a/_site/zerograds/index.html
+++ b/_site/zerograds/index.html
@@ -400,9 +400,9 @@
Compute the gradient of the MLP w.r.t. the optimization parameters, and use this gradient to perform an (Adam-) gradient descent step.
-We provide a Colab Notebook that shows a simple example and how ZeroGrads outperforms its competitors.
+We provide a Colab Notebook that shows a simple example and how ZeroGrads outperforms its competitors.
@@ -515,14 +515,14 @@
Supplemental
We use a weighted mixture of MSE and KLD as the training loss and backpropagate this loss through the non-diff. spline
renderer (i.e., from the rendered spline to the VAE weights) using our proposed method ZeroGrads. The digits below
are samples from the latent space of the trained VAE, in a variety of styles, which are easily editable in post-processing
- thanks to the spline formulation.
+ thanks to the spline formulation.
+
-
The magic that makes our approach scale to high dimensions is the surrogate's hysteresis, which reduces the gradient