
Commit

Generate Python docs from pytorch/pytorch@12ea12d
pytorchbot committed Jul 23, 2023
1 parent 0680f60 commit 8423273
Showing 41 changed files with 60 additions and 51 deletions.
2 changes: 1 addition & 1 deletion nightly/_modules/index.html
@@ -215,7 +215,7 @@
<div class="pytorch-left-menu-search">

<div class="version">
-<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+gita6b8c30) &#x25BC</a>
+<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+git12ea12d) &#x25BC</a>
</div>

2 changes: 1 addition & 1 deletion nightly/_modules/torch/_functorch/aot_autograd.html
@@ -215,7 +215,7 @@
<div class="pytorch-left-menu-search">

<div class="version">
-<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+gita6b8c30) &#x25BC</a>
+<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+git12ea12d) &#x25BC</a>
</div>

2 changes: 1 addition & 1 deletion nightly/_modules/torch/_functorch/compilers.html
@@ -215,7 +215,7 @@
<div class="pytorch-left-menu-search">

<div class="version">
-<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+gita6b8c30) &#x25BC</a>
+<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+git12ea12d) &#x25BC</a>
</div>

2 changes: 1 addition & 1 deletion nightly/_modules/torch/_functorch/deprecated.html
@@ -215,7 +215,7 @@
<div class="pytorch-left-menu-search">

<div class="version">
-<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+gita6b8c30) &#x25BC</a>
+<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+git12ea12d) &#x25BC</a>
</div>

33 changes: 21 additions & 12 deletions nightly/_modules/torch/_functorch/partitioners.html
@@ -215,7 +215,7 @@
<div class="pytorch-left-menu-search">

<div class="version">
-<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+gita6b8c30) &#x25BC</a>
+<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+git12ea12d) &#x25BC</a>
</div>

@@ -466,7 +466,7 @@ Source code for torch._functorch.partitioners
    return fwd_outputs, bwd_outputs


-def _extract_fwd_bwd_modules(joint_module: fx.GraphModule, saved_values, saved_sym_nodes=(), *, num_fwd_outputs):
+def _extract_fwd_bwd_modules(joint_module: fx.GraphModule, saved_values, saved_sym_nodes, *, num_fwd_outputs):
    fwd_outputs, bwd_outputs = _extract_fwd_bwd_outputs(joint_module, num_fwd_outputs=num_fwd_outputs)
    primal_inputs = list(filter(_is_primal, joint_module.graph.nodes))
    tangent_inputs = list(filter(_is_tangent, joint_module.graph.nodes))
@@ -535,9 +535,11 @@
        saved_symbols |= new_symbols

-    # Update saved_sym_nodes that are now reordered to have all bindings
-    # at front
-    saved_sym_nodes = saved_sym_nodes_binding + saved_sym_nodes_derived
+    # Update saved_sym_nodes that are now reordered to have all bindings at
+    # front. This can also be used later on to figure out the position of saved
+    # sym nodes in the output of fwd graph.
+    saved_sym_nodes.clear()
+    saved_sym_nodes.extend(saved_sym_nodes_binding + saved_sym_nodes_derived)

    # Now, we re-generate the fwd/bwd graphs.
    # NB: This might increase compilation time, but I doubt it matters
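The switch from rebinding `saved_sym_nodes` to `clear()`/`extend()` matters because `saved_sym_nodes` is the caller's own list, and the caller later reads its updated length (see the last hunk of this file, where `len(saved_sym_nodes)` is passed to `functionalize_rng_ops`). A minimal standalone sketch of the difference, using placeholder strings rather than real fx nodes:

def rebind(nodes):
    # Rebinding only changes the local name; the caller's list is untouched.
    nodes = ["s0", "s1"]

def mutate_in_place(nodes):
    # clear() + extend() mutate the very object the caller passed in.
    nodes.clear()
    nodes.extend(["s0", "s1"])

saved_sym_nodes = []
rebind(saved_sym_nodes)
assert len(saved_sym_nodes) == 0      # the caller would see a stale, empty list

mutate_in_place(saved_sym_nodes)
assert len(saved_sym_nodes) == 2      # len(saved_sym_nodes) at the call site is now meaningful
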
@@ -816,7 +818,7 @@
    return new_gm


-def functionalize_rng_ops(joint_module, fw_module, bw_module):
+def functionalize_rng_ops(joint_module, fw_module, bw_module, num_sym_nodes):
    # During user-driven activation checkpointing, we have to ensure that a rng
    # op in fwd yields the same output as the recomputed rng op in the bwd. To
    # do this, we use functionalize wrappers to wrap the random ops and share
@@ -827,7 +829,9 @@
    # Step 2 - Modify the fwd pass such that
    #   1) Replace rand with run_and_save_rng_state wrapper
    #   2) Replace the users of the original op with the output[1] of this op.
-    #   3) Collect all the rng_state - output[0] of each op, and make them output nodes.
+    #   3) Collect all the rng_state - output[0] of each op, and make them
+    #   output nodes. Special care needs to be taken here because fwd outputs
+    #   has symints at the very end.
    # Step 3 - Modify the bwd pass such that
    #   1) Add the input nodes just before the tangents for the stashed rng states
    #   2) Replace rand with run_with_save_rng_state wrappers
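For intuition, the Step 2/Step 3 wrappers described in these comments boil down to saving the RNG state next to the forward value and restoring it before the backward recomputation. A CPU-only, self-contained sketch of that idea (plain helper functions for illustration, not the actual wrappers the partitioner inserts):

import torch

def run_and_save_rng_state(op, *args):
    # Forward-side stand-in: output[0] is the stashed rng state, output[1] the op's value.
    state = torch.get_rng_state()
    return state, op(*args)

def run_with_rng_state(state, op, *args):
    # Backward-side stand-in: restore the stashed state so the recomputed op replays identically.
    torch.set_rng_state(state)
    return op(*args)

state, fwd_out = run_and_save_rng_state(torch.rand, 4)
bwd_out = run_with_rng_state(state, torch.rand, 4)
assert torch.equal(fwd_out, bwd_out)  # the recomputed rand matches the forward value
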
@@ -910,11 +914,15 @@
            bw_graph.erase_node(bw_node)


-    # Add the rng states in the output of the fwd graph
-    fw_output = [node for node in fw_module.graph.nodes if node.op == "output"][0]
-    outputs = fw_output.args[0] + fw_rng_state_outputs
+    # Add the rng states in the output of the fwd graph. AOT Autograd assumes
+    # that symints are at the end of forward graph outputs. So, insert the new
+    # rng states accordingly.
+    fw_output_node = [node for node in fw_module.graph.nodes if node.op == "output"][0]
+    fw_outputs = fw_output_node.args[0]
+    sym_node_start_idx = len(fw_outputs) - num_sym_nodes
+    outputs = fw_outputs[:sym_node_start_idx] + fw_rng_state_outputs + fw_outputs[sym_node_start_idx:]
    fw_module.graph.output(outputs)
-    fw_module.graph.erase_node(fw_output)
+    fw_module.graph.erase_node(fw_output_node)
    fw_module.recompile()
    bw_module.recompile()
    return fw_module, bw_module
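The splice in the new code preserves AOT Autograd's invariant that symints trail the forward outputs. A small worked example with illustrative placeholder names (not real fx nodes):

num_sym_nodes = 2
fw_outputs = ["out0", "saved_tensor", "s0", "s1"]      # the two symints sit at the very end
fw_rng_state_outputs = ["rng_state0", "rng_state1"]

sym_node_start_idx = len(fw_outputs) - num_sym_nodes   # == 2
outputs = (
    fw_outputs[:sym_node_start_idx]    # tensor outputs first
    + fw_rng_state_outputs             # rng states inserted before the symints
    + fw_outputs[sym_node_start_idx:]  # symints stay at the tail
)
assert outputs == ["out0", "saved_tensor", "rng_state0", "rng_state1", "s0", "s1"]
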
@@ -1202,13 +1210,14 @@
    # save_for_backward on tensors and stashes symints in autograd .ctx
    saved_sym_nodes = list(filter(lambda n: is_sym_node(n), saved_values))
    saved_values = list(filter(lambda n: not is_sym_node(n), saved_values))
+    # NB: saved_sym_nodes will be mutated to reflect the actual saved symbols
    fw_module, bw_module = _extract_fwd_bwd_modules(
        joint_module, saved_values, saved_sym_nodes=saved_sym_nodes, num_fwd_outputs=num_fwd_outputs)

    if graph_has_recomputable_ops:
        if graph_has_recomputable_rng_ops:
            fw_module, bw_module = functionalize_rng_ops(
-                joint_module, fw_module, bw_module
+                joint_module, fw_module, bw_module, len(saved_sym_nodes)
            )
        bw_module = reordering_to_mimic_autograd_engine(bw_module)

2 changes: 1 addition & 1 deletion nightly/aot_autograd.html
@@ -217,7 +217,7 @@
<div class="pytorch-left-menu-search">

<div class="version">
-<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+gita6b8c30) &#x25BC</a>
+<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+git12ea12d) &#x25BC</a>
</div>

2 changes: 1 addition & 1 deletion nightly/batch_norm.html
@@ -217,7 +217,7 @@
<div class="pytorch-left-menu-search">

<div class="version">
-<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+gita6b8c30) &#x25BC</a>
+<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+git12ea12d) &#x25BC</a>
</div>

2 changes: 1 addition & 1 deletion nightly/experimental.html
@@ -217,7 +217,7 @@
<div class="pytorch-left-menu-search">

<div class="version">
-<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+gita6b8c30) &#x25BC</a>
+<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+git12ea12d) &#x25BC</a>
</div>

2 changes: 1 addition & 1 deletion nightly/functorch.html
@@ -217,7 +217,7 @@
<div class="pytorch-left-menu-search">

<div class="version">
-<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+gita6b8c30) &#x25BC</a>
+<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+git12ea12d) &#x25BC</a>
</div>

@@ -217,7 +217,7 @@
<div class="pytorch-left-menu-search">

<div class="version">
-<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+gita6b8c30) &#x25BC</a>
+<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+git12ea12d) &#x25BC</a>
</div>

2 changes: 1 addition & 1 deletion nightly/generated/functorch.compile.aot_function.html
@@ -217,7 +217,7 @@
<div class="pytorch-left-menu-search">

<div class="version">
-<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+gita6b8c30) &#x25BC</a>
+<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+git12ea12d) &#x25BC</a>
</div>

2 changes: 1 addition & 1 deletion nightly/generated/functorch.compile.aot_module.html
@@ -217,7 +217,7 @@
<div class="pytorch-left-menu-search">

<div class="version">
-<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+gita6b8c30) &#x25BC</a>
+<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+git12ea12d) &#x25BC</a>
</div>

2 changes: 1 addition & 1 deletion nightly/generated/functorch.compile.default_partition.html
@@ -217,7 +217,7 @@
<div class="pytorch-left-menu-search">

<div class="version">
-<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+gita6b8c30) &#x25BC</a>
+<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+git12ea12d) &#x25BC</a>
</div>

@@ -217,7 +217,7 @@
<div class="pytorch-left-menu-search">

<div class="version">
-<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+gita6b8c30) &#x25BC</a>
+<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+git12ea12d) &#x25BC</a>
</div>

@@ -217,7 +217,7 @@
<div class="pytorch-left-menu-search">

<div class="version">
-<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+gita6b8c30) &#x25BC</a>
+<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+git12ea12d) &#x25BC</a>
</div>

2 changes: 1 addition & 1 deletion nightly/generated/functorch.compile.nop.html
@@ -217,7 +217,7 @@
<div class="pytorch-left-menu-search">

<div class="version">
-<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+gita6b8c30) &#x25BC</a>
+<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+git12ea12d) &#x25BC</a>
</div>

2 changes: 1 addition & 1 deletion nightly/generated/functorch.compile.ts_compile.html
@@ -217,7 +217,7 @@
<div class="pytorch-left-menu-search">

<div class="version">
-<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+gita6b8c30) &#x25BC</a>
+<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+git12ea12d) &#x25BC</a>
</div>

2 changes: 1 addition & 1 deletion nightly/generated/functorch.functionalize.html
@@ -217,7 +217,7 @@
<div class="pytorch-left-menu-search">

<div class="version">
-<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+gita6b8c30) &#x25BC</a>
+<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+git12ea12d) &#x25BC</a>
</div>

2 changes: 1 addition & 1 deletion nightly/generated/functorch.grad.html
@@ -217,7 +217,7 @@
<div class="pytorch-left-menu-search">

<div class="version">
-<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+gita6b8c30) &#x25BC</a>
+<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+git12ea12d) &#x25BC</a>
</div>

2 changes: 1 addition & 1 deletion nightly/generated/functorch.grad_and_value.html
@@ -217,7 +217,7 @@
<div class="pytorch-left-menu-search">

<div class="version">
-<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+gita6b8c30) &#x25BC</a>
+<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+git12ea12d) &#x25BC</a>
</div>

2 changes: 1 addition & 1 deletion nightly/generated/functorch.hessian.html
@@ -217,7 +217,7 @@
<div class="pytorch-left-menu-search">

<div class="version">
-<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+gita6b8c30) &#x25BC</a>
+<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+git12ea12d) &#x25BC</a>
</div>

2 changes: 1 addition & 1 deletion nightly/generated/functorch.jacfwd.html
@@ -217,7 +217,7 @@
<div class="pytorch-left-menu-search">

<div class="version">
-<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+gita6b8c30) &#x25BC</a>
+<a href='https://pytorch.org/functorch/versions.html'>nightly (2.1.0a0+git12ea12d) &#x25BC</a>
</div>
