Commit 9d297ae

Deploying to main from @ ivy-llc/ivy@d5247ae 🚀

ivy-dev-bot committed Sep 24, 2024
1 parent 696cc54 commit 9d297ae
Showing 17 changed files with 28 additions and 28 deletions.
Binary file modified .doctrees/docs/functional/ivy/ivy.functional.ivy.meta.doctree
Binary file modified .doctrees/docs/stateful/ivy.stateful.layers.doctree
Binary file modified .doctrees/docs/stateful/ivy.stateful.utilities.doctree
Binary file modified .doctrees/environment.pickle
Binary file modified .doctrees/index.doctree
6 changes: 3 additions & 3 deletions docs/functional/ivy/ivy.functional.ivy.meta.html
Original file line number Diff line number Diff line change
@@ -1334,7 +1334,7 @@ <h1>Meta<a class="headerlink" href="#meta" title="Link to this heading">#</a></h
<li><p><strong>variables</strong> (<a class="reference internal" href="../../data_classes/data_classes/ivy.data_classes.container.html#ivy.data_classes.container.container.Container" title="ivy.data_classes.container.container.Container"><code class="xref py py-class docutils literal notranslate"><span class="pre">Container</span></code></a>) – Variables to be optimized during the meta step</p></li>
<li><p><strong>inner_grad_steps</strong> (<code class="xref py py-class docutils literal notranslate"><span class="pre">int</span></code>) – Number of gradient steps to perform during the inner loop.</p></li>
<li><p><strong>inner_learning_rate</strong> (<code class="xref py py-class docutils literal notranslate"><span class="pre">float</span></code>) – The learning rate of the inner loop.</p></li>
- <li><p><strong>inner_optimization_step</strong> (<code class="xref py py-data docutils literal notranslate"><span class="pre">Callable</span></code>, default: <code class="docutils literal notranslate"><span class="pre">&lt;function</span> <span class="pre">gradient_descent_update</span> <span class="pre">at</span> <span class="pre">0x7ffbbced48b0&gt;</span></code>) – The function used for the inner loop optimization.
+ <li><p><strong>inner_optimization_step</strong> (<code class="xref py py-data docutils literal notranslate"><span class="pre">Callable</span></code>, default: <code class="docutils literal notranslate"><span class="pre">&lt;function</span> <span class="pre">gradient_descent_update</span> <span class="pre">at</span> <span class="pre">0x7fd7f95ac8b0&gt;</span></code>) – The function used for the inner loop optimization.
Default is ivy.gradient_descent_update.</p></li>
<li><p><strong>inner_batch_fn</strong> (<code class="xref py py-data docutils literal notranslate"><span class="pre">Optional</span></code>[<code class="xref py py-data docutils literal notranslate"><span class="pre">Callable</span></code>], default: <code class="docutils literal notranslate"><span class="pre">None</span></code>) – Function to apply to the task sub-batch, before passing to the inner_cost_fn.
Default is <code class="docutils literal notranslate"><span class="pre">None</span></code>.</p></li>
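The `<function gradient_descent_update at 0x…>` text in the rendered default above is just Python's `repr()` of the default callable, and the hex value is the function object's process-specific memory address. That is the entire change in each hunk of this commit: a docs rebuild in a fresh interpreter produced new addresses. A minimal sketch of the mechanism (the function body here is an illustrative stand-in, not the real `ivy.gradient_descent_update`):

```python
import re

def gradient_descent_update(w, dcdw, lr):
    # Illustrative stand-in for ivy.gradient_descent_update:
    # a single plain gradient-descent step.
    return w - lr * dcdw

# Sphinx documents a callable default via repr(), which embeds the
# object's memory address -- different on every interpreter run.
default_repr = repr(gradient_descent_update)
print(default_repr)  # e.g. <function gradient_descent_update at 0x7f...>

assert re.match(r"<function .*gradient_descent_update at 0x[0-9a-f]+>", default_repr)
```

This is why the only diff in each file below is the address embedded in the rendered default value.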
@@ -1388,7 +1388,7 @@ <h1>Meta<a class="headerlink" href="#meta" title="Link to this heading">#</a></h
<li><p><strong>variables</strong> (<a class="reference internal" href="../../data_classes/data_classes/ivy.data_classes.container.html#ivy.data_classes.container.container.Container" title="ivy.data_classes.container.container.Container"><code class="xref py py-class docutils literal notranslate"><span class="pre">Container</span></code></a>) – Variables to be optimized during the meta step</p></li>
<li><p><strong>inner_grad_steps</strong> (<code class="xref py py-class docutils literal notranslate"><span class="pre">int</span></code>) – Number of gradient steps to perform during the inner loop.</p></li>
<li><p><strong>inner_learning_rate</strong> (<code class="xref py py-class docutils literal notranslate"><span class="pre">float</span></code>) – The learning rate of the inner loop.</p></li>
- <li><p><strong>inner_optimization_step</strong> (<code class="xref py py-data docutils literal notranslate"><span class="pre">Callable</span></code>, default: <code class="docutils literal notranslate"><span class="pre">&lt;function</span> <span class="pre">gradient_descent_update</span> <span class="pre">at</span> <span class="pre">0x7ffbbced48b0&gt;</span></code>) – The function used for the inner loop optimization.
+ <li><p><strong>inner_optimization_step</strong> (<code class="xref py py-data docutils literal notranslate"><span class="pre">Callable</span></code>, default: <code class="docutils literal notranslate"><span class="pre">&lt;function</span> <span class="pre">gradient_descent_update</span> <span class="pre">at</span> <span class="pre">0x7fd7f95ac8b0&gt;</span></code>) – The function used for the inner loop optimization.
Default is ivy.gradient_descent_update.</p></li>
<li><p><strong>inner_batch_fn</strong> (<code class="xref py py-data docutils literal notranslate"><span class="pre">Optional</span></code>[<code class="xref py py-data docutils literal notranslate"><span class="pre">Callable</span></code>], default: <code class="docutils literal notranslate"><span class="pre">None</span></code>) – Function to apply to the task sub-batch, before passing to the inner_cost_fn.
Default is <code class="docutils literal notranslate"><span class="pre">None</span></code>.</p></li>
@@ -1465,7 +1465,7 @@ <h1>Meta<a class="headerlink" href="#meta" title="Link to this heading">#</a></h
<li><p><strong>variables</strong> (<a class="reference internal" href="../../data_classes/data_classes/ivy.data_classes.container.html#ivy.data_classes.container.container.Container" title="ivy.data_classes.container.container.Container"><code class="xref py py-class docutils literal notranslate"><span class="pre">Container</span></code></a>) – Variables to be optimized.</p></li>
<li><p><strong>inner_grad_steps</strong> (<code class="xref py py-class docutils literal notranslate"><span class="pre">int</span></code>) – Number of gradient steps to perform during the inner loop.</p></li>
<li><p><strong>inner_learning_rate</strong> (<code class="xref py py-class docutils literal notranslate"><span class="pre">float</span></code>) – The learning rate of the inner loop.</p></li>
- <li><p><strong>inner_optimization_step</strong> (<code class="xref py py-data docutils literal notranslate"><span class="pre">Callable</span></code>, default: <code class="docutils literal notranslate"><span class="pre">&lt;function</span> <span class="pre">gradient_descent_update</span> <span class="pre">at</span> <span class="pre">0x7ffbbced48b0&gt;</span></code>) – The function used for the inner loop optimization. It takes the learnable
+ <li><p><strong>inner_optimization_step</strong> (<code class="xref py py-data docutils literal notranslate"><span class="pre">Callable</span></code>, default: <code class="docutils literal notranslate"><span class="pre">&lt;function</span> <span class="pre">gradient_descent_update</span> <span class="pre">at</span> <span class="pre">0x7fd7f95ac8b0&gt;</span></code>) – The function used for the inner loop optimization. It takes the learnable
weights, the derivative of the cost with respect to the weights, and the learning
rate as arguments, and returns the updated variables.
Default is <cite>gradient_descent_update</cite>.</p></li>
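The contract described above — the step function receives the learnable weights, the derivative of the cost with respect to the weights, and the learning rate, and returns the updated variables — can be sketched with plain floats (the real ivy functions operate on `Container`s of arrays; the names and bodies below are illustrative):

```python
def gradient_descent_update(w, dcdw, lr):
    # Updated variables = weights minus learning rate times gradient.
    return w - lr * dcdw

def inner_loop(w, grad_fn, inner_grad_steps, inner_learning_rate,
               inner_optimization_step=gradient_descent_update):
    # Perform inner_grad_steps optimization steps at inner_learning_rate,
    # mirroring the inner loop the parameters above configure.
    for _ in range(inner_grad_steps):
        w = inner_optimization_step(w, grad_fn(w), inner_learning_rate)
    return w

# Minimize (w - 3)^2 from w = 0: the gradient is 2 * (w - 3).
grad = lambda w: 2.0 * (w - 3.0)
print(inner_loop(0.0, grad, 1, 0.1))    # ~0.6 after one step
print(inner_loop(0.0, grad, 100, 0.1))  # close to the minimum at 3.0
```

Passing a different `inner_optimization_step` with the same three-argument signature swaps the inner optimizer without touching the loop.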
@@ -1337,7 +1337,7 @@ <h1>fomaml_step<a class="headerlink" href="#fomaml-step" title="Link to this hea
<li><p><strong>variables</strong> (<a class="reference internal" href="../../../data_classes/data_classes/ivy.data_classes.container.html#ivy.data_classes.container.container.Container" title="ivy.data_classes.container.container.Container"><code class="xref py py-class docutils literal notranslate"><span class="pre">Container</span></code></a>) – Variables to be optimized during the meta step</p></li>
<li><p><strong>inner_grad_steps</strong> (<code class="xref py py-class docutils literal notranslate"><span class="pre">int</span></code>) – Number of gradient steps to perform during the inner loop.</p></li>
<li><p><strong>inner_learning_rate</strong> (<code class="xref py py-class docutils literal notranslate"><span class="pre">float</span></code>) – The learning rate of the inner loop.</p></li>
- <li><p><strong>inner_optimization_step</strong> (<code class="xref py py-data docutils literal notranslate"><span class="pre">Callable</span></code>, default: <code class="docutils literal notranslate"><span class="pre">&lt;function</span> <span class="pre">gradient_descent_update</span> <span class="pre">at</span> <span class="pre">0x7ffbbced48b0&gt;</span></code>) – The function used for the inner loop optimization.
+ <li><p><strong>inner_optimization_step</strong> (<code class="xref py py-data docutils literal notranslate"><span class="pre">Callable</span></code>, default: <code class="docutils literal notranslate"><span class="pre">&lt;function</span> <span class="pre">gradient_descent_update</span> <span class="pre">at</span> <span class="pre">0x7fd7f95ac8b0&gt;</span></code>) – The function used for the inner loop optimization.
Default is ivy.gradient_descent_update.</p></li>
<li><p><strong>inner_batch_fn</strong> (<code class="xref py py-data docutils literal notranslate"><span class="pre">Optional</span></code>[<code class="xref py py-data docutils literal notranslate"><span class="pre">Callable</span></code>], default: <code class="docutils literal notranslate"><span class="pre">None</span></code>) – Function to apply to the task sub-batch, before passing to the inner_cost_fn.
Default is <code class="docutils literal notranslate"><span class="pre">None</span></code>.</p></li>
@@ -1337,7 +1337,7 @@ <h1>maml_step<a class="headerlink" href="#maml-step" title="Link to this heading
<li><p><strong>variables</strong> (<a class="reference internal" href="../../../data_classes/data_classes/ivy.data_classes.container.html#ivy.data_classes.container.container.Container" title="ivy.data_classes.container.container.Container"><code class="xref py py-class docutils literal notranslate"><span class="pre">Container</span></code></a>) – Variables to be optimized during the meta step</p></li>
<li><p><strong>inner_grad_steps</strong> (<code class="xref py py-class docutils literal notranslate"><span class="pre">int</span></code>) – Number of gradient steps to perform during the inner loop.</p></li>
<li><p><strong>inner_learning_rate</strong> (<code class="xref py py-class docutils literal notranslate"><span class="pre">float</span></code>) – The learning rate of the inner loop.</p></li>
- <li><p><strong>inner_optimization_step</strong> (<code class="xref py py-data docutils literal notranslate"><span class="pre">Callable</span></code>, default: <code class="docutils literal notranslate"><span class="pre">&lt;function</span> <span class="pre">gradient_descent_update</span> <span class="pre">at</span> <span class="pre">0x7ffbbced48b0&gt;</span></code>) – The function used for the inner loop optimization.
+ <li><p><strong>inner_optimization_step</strong> (<code class="xref py py-data docutils literal notranslate"><span class="pre">Callable</span></code>, default: <code class="docutils literal notranslate"><span class="pre">&lt;function</span> <span class="pre">gradient_descent_update</span> <span class="pre">at</span> <span class="pre">0x7fd7f95ac8b0&gt;</span></code>) – The function used for the inner loop optimization.
Default is ivy.gradient_descent_update.</p></li>
<li><p><strong>inner_batch_fn</strong> (<code class="xref py py-data docutils literal notranslate"><span class="pre">Optional</span></code>[<code class="xref py py-data docutils literal notranslate"><span class="pre">Callable</span></code>], default: <code class="docutils literal notranslate"><span class="pre">None</span></code>) – Function to apply to the task sub-batch, before passing to the inner_cost_fn.
Default is <code class="docutils literal notranslate"><span class="pre">None</span></code>.</p></li>
@@ -1334,7 +1334,7 @@ <h1>reptile_step<a class="headerlink" href="#reptile-step" title="Link to this h
<li><p><strong>variables</strong> (<a class="reference internal" href="../../../data_classes/data_classes/ivy.data_classes.container.html#ivy.data_classes.container.container.Container" title="ivy.data_classes.container.container.Container"><code class="xref py py-class docutils literal notranslate"><span class="pre">Container</span></code></a>) – Variables to be optimized.</p></li>
<li><p><strong>inner_grad_steps</strong> (<code class="xref py py-class docutils literal notranslate"><span class="pre">int</span></code>) – Number of gradient steps to perform during the inner loop.</p></li>
<li><p><strong>inner_learning_rate</strong> (<code class="xref py py-class docutils literal notranslate"><span class="pre">float</span></code>) – The learning rate of the inner loop.</p></li>
- <li><p><strong>inner_optimization_step</strong> (<code class="xref py py-data docutils literal notranslate"><span class="pre">Callable</span></code>, default: <code class="docutils literal notranslate"><span class="pre">&lt;function</span> <span class="pre">gradient_descent_update</span> <span class="pre">at</span> <span class="pre">0x7ffbbced48b0&gt;</span></code>) – The function used for the inner loop optimization. It takes the learnable
+ <li><p><strong>inner_optimization_step</strong> (<code class="xref py py-data docutils literal notranslate"><span class="pre">Callable</span></code>, default: <code class="docutils literal notranslate"><span class="pre">&lt;function</span> <span class="pre">gradient_descent_update</span> <span class="pre">at</span> <span class="pre">0x7fd7f95ac8b0&gt;</span></code>) – The function used for the inner loop optimization. It takes the learnable
weights, the derivative of the cost with respect to the weights, and the learning
rate as arguments, and returns the updated variables.
Default is <cite>gradient_descent_update</cite>.</p></li>
2 changes: 1 addition & 1 deletion docs/helpers/ivy_tests.test_ivy.helpers.globals.html
@@ -1323,7 +1323,7 @@
<p>Should not be used inside any of the test functions.</p>
<dl class="py data">
<dt class="sig sig-object py" id="ivy_tests.test_ivy.helpers.globals.CURRENT_FRONTEND_CONFIG">
- <span class="sig-prename descclassname"><span class="pre">ivy_tests.test_ivy.helpers.globals.</span></span><span class="sig-name descname"><span class="pre">CURRENT_FRONTEND_CONFIG</span></span><em class="property"><span class="p"><span class="pre">:</span></span><span class="w"> </span><span class="pre">&lt;object</span> <span class="pre">object</span> <span class="pre">at</span> <span class="pre">0x7ffbafe0dde0&gt;</span></em><a class="headerlink" href="#ivy_tests.test_ivy.helpers.globals.CURRENT_FRONTEND_CONFIG" title="Link to this definition">#</a></dt>
+ <span class="sig-prename descclassname"><span class="pre">ivy_tests.test_ivy.helpers.globals.</span></span><span class="sig-name descname"><span class="pre">CURRENT_FRONTEND_CONFIG</span></span><em class="property"><span class="p"><span class="pre">:</span></span><span class="w"> </span><span class="pre">&lt;object</span> <span class="pre">object</span> <span class="pre">at</span> <span class="pre">0x7fd7ec551de0&gt;</span></em><a class="headerlink" href="#ivy_tests.test_ivy.helpers.globals.CURRENT_FRONTEND_CONFIG" title="Link to this definition">#</a></dt>
<dd></dd></dl>
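The `<object object at 0x…>` value rendered above is the repr of a bare `object()`, a common module-level sentinel for "no config set yet", since no real value can be identical to it under `is`. Its repr also embeds a memory address, so it churns across builds for the same reason as the function defaults. A sketch of the pattern (names are illustrative, not the actual `ivy_tests` implementation):

```python
import re

# A bare object() makes a safe sentinel: it is unique under `is`
# comparison. The names below are illustrative only.
_NO_FRONTEND_CONFIG = object()

current_frontend_config = _NO_FRONTEND_CONFIG

def describe_config():
    # Identity check against the sentinel, never equality.
    if current_frontend_config is _NO_FRONTEND_CONFIG:
        return "no frontend config set"
    return repr(current_frontend_config)

print(describe_config())          # no frontend config set
print(repr(_NO_FRONTEND_CONFIG))  # e.g. <object object at 0x7f...>

assert re.match(r"<object object at 0x[0-9a-f]+>", repr(_NO_FRONTEND_CONFIG))
```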

<dl class="py exception">
0 comments on commit 9d297ae