This repository has been archived by the owner on Aug 8, 2024. It is now read-only.

Commit dfcec94
Deploying to main from @ ivy-llc/ivy@642e471 🚀
ivy-seed committed Apr 22, 2024
1 parent e57f657 commit dfcec94
Showing 15 changed files with 25 additions and 25 deletions.
Binary file modified ivy/.doctrees/docs/stateful/ivy.stateful.layers.doctree
Binary file modified ivy/.doctrees/environment.pickle
Binary file modified ivy/.doctrees/index.doctree
6 changes: 3 additions & 3 deletions ivy/docs/functional/ivy/ivy.functional.ivy.meta.html
@@ -1420,7 +1420,7 @@ <h1>Meta<a class="headerlink" href="#meta" title="Link to this heading">#</a></h
<li><p><strong>variables</strong> (<a class="reference internal" href="../../data_classes/data_classes/ivy.data_classes.container.html#ivy.data_classes.container.container.Container" title="ivy.data_classes.container.container.Container"><code class="xref py py-class docutils literal notranslate"><span class="pre">Container</span></code></a>) – Variables to be optimized during the meta step</p></li>
<li><p><strong>inner_grad_steps</strong> (<code class="xref py py-class docutils literal notranslate"><span class="pre">int</span></code>) – Number of gradient steps to perform during the inner loop.</p></li>
<li><p><strong>inner_learning_rate</strong> (<code class="xref py py-class docutils literal notranslate"><span class="pre">float</span></code>) – The learning rate of the inner loop.</p></li>
- <li><p><strong>inner_optimization_step</strong> (<code class="xref py py-data docutils literal notranslate"><span class="pre">Callable</span></code>, default: <code class="docutils literal notranslate"><span class="pre">&lt;function</span> <span class="pre">gradient_descent_update</span> <span class="pre">at</span> <span class="pre">0x7f1616789120&gt;</span></code>) – The function used for the inner loop optimization.
+ <li><p><strong>inner_optimization_step</strong> (<code class="xref py py-data docutils literal notranslate"><span class="pre">Callable</span></code>, default: <code class="docutils literal notranslate"><span class="pre">&lt;function</span> <span class="pre">gradient_descent_update</span> <span class="pre">at</span> <span class="pre">0x7f57273c9120&gt;</span></code>) – The function used for the inner loop optimization.
Default is ivy.gradient_descent_update.</p></li>
<li><p><strong>inner_batch_fn</strong> (<code class="xref py py-data docutils literal notranslate"><span class="pre">Optional</span></code>[<code class="xref py py-data docutils literal notranslate"><span class="pre">Callable</span></code>], default: <code class="docutils literal notranslate"><span class="pre">None</span></code>) – Function to apply to the task sub-batch, before passing to the inner_cost_fn.
Default is <code class="docutils literal notranslate"><span class="pre">None</span></code>.</p></li>
@@ -1474,7 +1474,7 @@ <h1>Meta<a class="headerlink" href="#meta" title="Link to this heading">#</a></h
<li><p><strong>variables</strong> (<a class="reference internal" href="../../data_classes/data_classes/ivy.data_classes.container.html#ivy.data_classes.container.container.Container" title="ivy.data_classes.container.container.Container"><code class="xref py py-class docutils literal notranslate"><span class="pre">Container</span></code></a>) – Variables to be optimized during the meta step</p></li>
<li><p><strong>inner_grad_steps</strong> (<code class="xref py py-class docutils literal notranslate"><span class="pre">int</span></code>) – Number of gradient steps to perform during the inner loop.</p></li>
<li><p><strong>inner_learning_rate</strong> (<code class="xref py py-class docutils literal notranslate"><span class="pre">float</span></code>) – The learning rate of the inner loop.</p></li>
- <li><p><strong>inner_optimization_step</strong> (<code class="xref py py-data docutils literal notranslate"><span class="pre">Callable</span></code>, default: <code class="docutils literal notranslate"><span class="pre">&lt;function</span> <span class="pre">gradient_descent_update</span> <span class="pre">at</span> <span class="pre">0x7f1616789120&gt;</span></code>) – The function used for the inner loop optimization.
+ <li><p><strong>inner_optimization_step</strong> (<code class="xref py py-data docutils literal notranslate"><span class="pre">Callable</span></code>, default: <code class="docutils literal notranslate"><span class="pre">&lt;function</span> <span class="pre">gradient_descent_update</span> <span class="pre">at</span> <span class="pre">0x7f57273c9120&gt;</span></code>) – The function used for the inner loop optimization.
Default is ivy.gradient_descent_update.</p></li>
<li><p><strong>inner_batch_fn</strong> (<code class="xref py py-data docutils literal notranslate"><span class="pre">Optional</span></code>[<code class="xref py py-data docutils literal notranslate"><span class="pre">Callable</span></code>], default: <code class="docutils literal notranslate"><span class="pre">None</span></code>) – Function to apply to the task sub-batch, before passing to the inner_cost_fn.
Default is <code class="docutils literal notranslate"><span class="pre">None</span></code>.</p></li>
@@ -1551,7 +1551,7 @@ <h1>Meta<a class="headerlink" href="#meta" title="Link to this heading">#</a></h
<li><p><strong>variables</strong> (<a class="reference internal" href="../../data_classes/data_classes/ivy.data_classes.container.html#ivy.data_classes.container.container.Container" title="ivy.data_classes.container.container.Container"><code class="xref py py-class docutils literal notranslate"><span class="pre">Container</span></code></a>) – Variables to be optimized.</p></li>
<li><p><strong>inner_grad_steps</strong> (<code class="xref py py-class docutils literal notranslate"><span class="pre">int</span></code>) – Number of gradient steps to perform during the inner loop.</p></li>
<li><p><strong>inner_learning_rate</strong> (<code class="xref py py-class docutils literal notranslate"><span class="pre">float</span></code>) – The learning rate of the inner loop.</p></li>
- <li><p><strong>inner_optimization_step</strong> (<code class="xref py py-data docutils literal notranslate"><span class="pre">Callable</span></code>, default: <code class="docutils literal notranslate"><span class="pre">&lt;function</span> <span class="pre">gradient_descent_update</span> <span class="pre">at</span> <span class="pre">0x7f1616789120&gt;</span></code>) – The function used for the inner loop optimization. It takes the learnable
+ <li><p><strong>inner_optimization_step</strong> (<code class="xref py py-data docutils literal notranslate"><span class="pre">Callable</span></code>, default: <code class="docutils literal notranslate"><span class="pre">&lt;function</span> <span class="pre">gradient_descent_update</span> <span class="pre">at</span> <span class="pre">0x7f57273c9120&gt;</span></code>) – The function used for the inner loop optimization. It takes the learnable
weights,the derivative of the cost with respect to the weights, and the learning
rate as arguments, and returns the updated variables.
Default is <cite>gradient_descent_update</cite>.</p></li>
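Every hunk above changes only the memory address inside the rendered default value. Sphinx autodoc renders a callable default via Python's `repr()`, which embeds the object's `id()`, so the address differs on every build even when nothing in the API changed. A minimal sketch of the effect (this `gradient_descent_update` is a hypothetical stand-in, not the ivy implementation):

```python
def gradient_descent_update(w, dcdw, lr):
    # Hypothetical stand-in for ivy.gradient_descent_update:
    # one plain gradient-descent step on the weights.
    return w - lr * dcdw

# repr() of a plain Python function embeds its memory address,
# so it changes between interpreter runs -- and between doc builds.
print(repr(gradient_descent_update))
# <function gradient_descent_update at 0x...>  (address varies per run)
```

This is why a docs redeploy can touch these pages with no upstream change; one common mitigation is Sphinx's `autodoc_preserve_defaults = True`, which renders the default's source expression instead of its runtime repr.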
@@ -1423,7 +1423,7 @@ <h1>fomaml_step<a class="headerlink" href="#fomaml-step" title="Link to this hea
<li><p><strong>variables</strong> (<a class="reference internal" href="../../../data_classes/data_classes/ivy.data_classes.container.html#ivy.data_classes.container.container.Container" title="ivy.data_classes.container.container.Container"><code class="xref py py-class docutils literal notranslate"><span class="pre">Container</span></code></a>) – Variables to be optimized during the meta step</p></li>
<li><p><strong>inner_grad_steps</strong> (<code class="xref py py-class docutils literal notranslate"><span class="pre">int</span></code>) – Number of gradient steps to perform during the inner loop.</p></li>
<li><p><strong>inner_learning_rate</strong> (<code class="xref py py-class docutils literal notranslate"><span class="pre">float</span></code>) – The learning rate of the inner loop.</p></li>
- <li><p><strong>inner_optimization_step</strong> (<code class="xref py py-data docutils literal notranslate"><span class="pre">Callable</span></code>, default: <code class="docutils literal notranslate"><span class="pre">&lt;function</span> <span class="pre">gradient_descent_update</span> <span class="pre">at</span> <span class="pre">0x7f1616789120&gt;</span></code>) – The function used for the inner loop optimization.
+ <li><p><strong>inner_optimization_step</strong> (<code class="xref py py-data docutils literal notranslate"><span class="pre">Callable</span></code>, default: <code class="docutils literal notranslate"><span class="pre">&lt;function</span> <span class="pre">gradient_descent_update</span> <span class="pre">at</span> <span class="pre">0x7f57273c9120&gt;</span></code>) – The function used for the inner loop optimization.
Default is ivy.gradient_descent_update.</p></li>
<li><p><strong>inner_batch_fn</strong> (<code class="xref py py-data docutils literal notranslate"><span class="pre">Optional</span></code>[<code class="xref py py-data docutils literal notranslate"><span class="pre">Callable</span></code>], default: <code class="docutils literal notranslate"><span class="pre">None</span></code>) – Function to apply to the task sub-batch, before passing to the inner_cost_fn.
Default is <code class="docutils literal notranslate"><span class="pre">None</span></code>.</p></li>
@@ -1423,7 +1423,7 @@ <h1>maml_step<a class="headerlink" href="#maml-step" title="Link to this heading
<li><p><strong>variables</strong> (<a class="reference internal" href="../../../data_classes/data_classes/ivy.data_classes.container.html#ivy.data_classes.container.container.Container" title="ivy.data_classes.container.container.Container"><code class="xref py py-class docutils literal notranslate"><span class="pre">Container</span></code></a>) – Variables to be optimized during the meta step</p></li>
<li><p><strong>inner_grad_steps</strong> (<code class="xref py py-class docutils literal notranslate"><span class="pre">int</span></code>) – Number of gradient steps to perform during the inner loop.</p></li>
<li><p><strong>inner_learning_rate</strong> (<code class="xref py py-class docutils literal notranslate"><span class="pre">float</span></code>) – The learning rate of the inner loop.</p></li>
- <li><p><strong>inner_optimization_step</strong> (<code class="xref py py-data docutils literal notranslate"><span class="pre">Callable</span></code>, default: <code class="docutils literal notranslate"><span class="pre">&lt;function</span> <span class="pre">gradient_descent_update</span> <span class="pre">at</span> <span class="pre">0x7f1616789120&gt;</span></code>) – The function used for the inner loop optimization.
+ <li><p><strong>inner_optimization_step</strong> (<code class="xref py py-data docutils literal notranslate"><span class="pre">Callable</span></code>, default: <code class="docutils literal notranslate"><span class="pre">&lt;function</span> <span class="pre">gradient_descent_update</span> <span class="pre">at</span> <span class="pre">0x7f57273c9120&gt;</span></code>) – The function used for the inner loop optimization.
Default is ivy.gradient_descent_update.</p></li>
<li><p><strong>inner_batch_fn</strong> (<code class="xref py py-data docutils literal notranslate"><span class="pre">Optional</span></code>[<code class="xref py py-data docutils literal notranslate"><span class="pre">Callable</span></code>], default: <code class="docutils literal notranslate"><span class="pre">None</span></code>) – Function to apply to the task sub-batch, before passing to the inner_cost_fn.
Default is <code class="docutils literal notranslate"><span class="pre">None</span></code>.</p></li>
@@ -1420,7 +1420,7 @@ <h1>reptile_step<a class="headerlink" href="#reptile-step" title="Link to this h
<li><p><strong>variables</strong> (<a class="reference internal" href="../../../data_classes/data_classes/ivy.data_classes.container.html#ivy.data_classes.container.container.Container" title="ivy.data_classes.container.container.Container"><code class="xref py py-class docutils literal notranslate"><span class="pre">Container</span></code></a>) – Variables to be optimized.</p></li>
<li><p><strong>inner_grad_steps</strong> (<code class="xref py py-class docutils literal notranslate"><span class="pre">int</span></code>) – Number of gradient steps to perform during the inner loop.</p></li>
<li><p><strong>inner_learning_rate</strong> (<code class="xref py py-class docutils literal notranslate"><span class="pre">float</span></code>) – The learning rate of the inner loop.</p></li>
- <li><p><strong>inner_optimization_step</strong> (<code class="xref py py-data docutils literal notranslate"><span class="pre">Callable</span></code>, default: <code class="docutils literal notranslate"><span class="pre">&lt;function</span> <span class="pre">gradient_descent_update</span> <span class="pre">at</span> <span class="pre">0x7f1616789120&gt;</span></code>) – The function used for the inner loop optimization. It takes the learnable
+ <li><p><strong>inner_optimization_step</strong> (<code class="xref py py-data docutils literal notranslate"><span class="pre">Callable</span></code>, default: <code class="docutils literal notranslate"><span class="pre">&lt;function</span> <span class="pre">gradient_descent_update</span> <span class="pre">at</span> <span class="pre">0x7f57273c9120&gt;</span></code>) – The function used for the inner loop optimization. It takes the learnable
weights,the derivative of the cost with respect to the weights, and the learning
rate as arguments, and returns the updated variables.
Default is <cite>gradient_descent_update</cite>.</p></li>
2 changes: 1 addition & 1 deletion ivy/docs/helpers/ivy_tests.test_ivy.helpers.globals.html
@@ -1409,7 +1409,7 @@
<p>Should not be used inside any of the test functions.</p>
<dl class="py data">
<dt class="sig sig-object py" id="ivy_tests.test_ivy.helpers.globals.CURRENT_FRONTEND_CONFIG">
- <span class="sig-prename descclassname"><span class="pre">ivy_tests.test_ivy.helpers.globals.</span></span><span class="sig-name descname"><span class="pre">CURRENT_FRONTEND_CONFIG</span></span><em class="property"><span class="p"><span class="pre">:</span></span><span class="w"> </span><span class="pre">&lt;object</span> <span class="pre">object</span> <span class="pre">at</span> <span class="pre">0x7f160a001f90&gt;</span></em><a class="headerlink" href="#ivy_tests.test_ivy.helpers.globals.CURRENT_FRONTEND_CONFIG" title="Link to this definition">#</a></dt>
+ <span class="sig-prename descclassname"><span class="pre">ivy_tests.test_ivy.helpers.globals.</span></span><span class="sig-name descname"><span class="pre">CURRENT_FRONTEND_CONFIG</span></span><em class="property"><span class="p"><span class="pre">:</span></span><span class="w"> </span><span class="pre">&lt;object</span> <span class="pre">object</span> <span class="pre">at</span> <span class="pre">0x7f571ac39f90&gt;</span></em><a class="headerlink" href="#ivy_tests.test_ivy.helpers.globals.CURRENT_FRONTEND_CONFIG" title="Link to this definition">#</a></dt>
<dd></dd></dl>

<dl class="py exception">
