
Commit 0fb10a7

Generate Python docs from pytorch/pytorch@a21f2ab
1 parent 3e72b2d commit 0fb10a7

File tree: 1,732 files changed (+2455 / -2398 lines)


docs/master/__config__.html

Lines changed: 1 addition & 1 deletion

@@ -194,7 +194,7 @@
 <div class="pytorch-left-menu-search">

 <div class="version">
-<a href='https://pytorch.org/docs/versions.html'>master (1.11.0a0+git125a559 ) &#x25BC</a>
+<a href='https://pytorch.org/docs/versions.html'>master (1.11.0a0+gita21f2ab ) &#x25BC</a>
 </div>

The same one-line version-string update (git125a559 -> gita21f2ab) appears in every regenerated page below; identical hunks are noted rather than repeated.

docs/master/_images/RReLU.png

-94 Bytes

docs/master/_modules/index.html

Lines changed: 1 addition & 1 deletion (same version-string update as docs/master/__config__.html)

docs/master/_modules/torch.html

Lines changed: 41 additions & 41 deletions
Large diffs are not rendered by default.

docs/master/_modules/torch/__config__.html

Lines changed: 1 addition & 1 deletion (same version-string update as docs/master/__config__.html)

docs/master/_modules/torch/_jit_internal.html

Lines changed: 3 additions & 3 deletions

@@ -193,7 +193,7 @@
(same version-string update as docs/master/__config__.html)

The two hunks below drop the viewcode-block wrapper around unused, so the generated source page no longer carries a [docs] back-link to torch.jit.unused. Decoded from the syntax-highlighted HTML, the change is:

@@ -900,7 +900,7 @@ Source code for torch._jit_internal
     return fn</div>


-<div class="viewcode-block" id="unused"><a class="viewcode-back" href="../../generated/torch.jit.unused.html#torch.jit.unused">[docs]</a>def unused(fn):
+def unused(fn):
     """
     This decorator indicates to the compiler that a function or method should
     be ignored and replaced with the raising of an exception. This allows you

@@ -947,7 +947,7 @@ Source code for torch._jit_internal
         return prop

     fn._torchscript_modifier = FunctionModifiers.UNUSED
-    return fn</div>
+    return fn

 # No op context manager from python side
 class _IgnoreContextManager(contextlib.AbstractContextManager):
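The decorator itself is unchanged and still behaves as its docstring describes: a method marked @torch.jit.unused is compiled as a stub that raises if called, so code TorchScript cannot compile stays out of the compiled graph. A minimal sketch, adapted from the decorator's own docstring example (MyModule and use_memory_efficient are illustrative names):

import torch

class MyModule(torch.nn.Module):
    def __init__(self, use_memory_efficient: bool = False):
        super().__init__()
        self.use_memory_efficient = use_memory_efficient

    @torch.jit.unused
    def memory_efficient(self, x):
        # Code that TorchScript cannot compile can live here; scripting
        # replaces this body with a stub that raises at call time.
        import pdb
        pdb.set_trace()
        return x + 10

    def forward(self, x):
        if self.use_memory_efficient:
            return self.memory_efficient(x)  # raises torch.jit.Error if taken
        return x + 10

m = torch.jit.script(MyModule(use_memory_efficient=False))
print(m(torch.zeros(2)))  # tensor([10., 10.]) -- the unused branch is never hit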

docs/master/_modules/torch/_lobpcg.html

Lines changed: 1 addition & 1 deletion (same version-string update as docs/master/__config__.html)

docs/master/_modules/torch/_lowrank.html

Lines changed: 1 addition & 1 deletion (same version-string update as docs/master/__config__.html)

docs/master/_modules/torch/_tensor.html

Lines changed: 17 additions & 17 deletions
Large diffs are not rendered by default.

docs/master/_modules/torch/_tensor_str.html

Lines changed: 1 addition & 1 deletion (same version-string update as docs/master/__config__.html)

docs/master/_modules/torch/_utils.html

Lines changed: 1 addition & 1 deletion (same version-string update as docs/master/__config__.html)

docs/master/_modules/torch/_vmap_internals.html

Lines changed: 18 additions & 8 deletions

@@ -193,7 +193,7 @@
(same version-string update as docs/master/__config__.html)

The substantive change in this regeneration is a new allow_none_pass_through flag on _unwrap_batched and _vmap. Decoded from the syntax-highlighted HTML, the hunks are:

@@ -479,8 +479,8 @@ Source code for torch._vmap_internals
 # Undos the batching (and any batch dimensions) associated with the `vmap_level`.
 def _unwrap_batched(
         batched_outputs: Union[Tensor, Tuple[Tensor, ...]],
-        out_dims: out_dims_t,
-        vmap_level: int, batch_size: int, func: Callable) -> Tuple:
+        out_dims: out_dims_t, vmap_level: int, batch_size: int, func: Callable,
+        allow_none_pass_through: bool = False) -> Tuple:
     num_outputs = _num_outputs(batched_outputs)
     out_dims_as_tuple = _as_tuple(
         out_dims, num_outputs,

@@ -493,8 +493,12 @@ Source code for torch._vmap_internals
     if isinstance(batched_outputs, Tensor):
         out_dim = out_dims_as_tuple[0]
         return torch._remove_batch_dim(batched_outputs, vmap_level, batch_size, out_dim)  # type: ignore[return-value]
-    return tuple(torch._remove_batch_dim(out, vmap_level, batch_size, out_dim)
-                 for out, out_dim in zip(batched_outputs, out_dims_as_tuple))
+    if allow_none_pass_through:
+        return tuple((torch._remove_batch_dim(out, vmap_level, batch_size, out_dim) if out is not None else None)
+                     for out, out_dim in zip(batched_outputs, out_dims_as_tuple))
+    else:
+        return tuple(torch._remove_batch_dim(out, vmap_level, batch_size, out_dim)
+                     for out, out_dim in zip(batched_outputs, out_dims_as_tuple))

 # Checks that `fn` returned one or more Tensors and nothing else.
 # NB: A python function that return multiple arguments returns a single tuple,

@@ -645,16 +649,22 @@ Source code for torch._vmap_internals
     return _vmap(func, in_dims, out_dims)</div>

 # A version of vmap but without the initial "experimental prototype" warning
-def _vmap(func: Callable, in_dims: in_dims_t = 0, out_dims: out_dims_t = 0) -> Callable:
+def _vmap(func: Callable, in_dims: in_dims_t = 0, out_dims: out_dims_t = 0, allow_none_pass_through: bool = False) -> Callable:
+    # The `allow_none_pass_through` argument is a temporary workaround may be removed.
+    # Currently it enables us to wrap the call in `autograd.grad` to the autograd engine,
+    # which may return None if any of the inputs are unused. See the issue discussing this:
+    # https://github.com/facebookresearch/functorch/issues/159.
     @functools.wraps(func)
     def wrapped(*args):
         _check_out_dims_is_int_or_int_tuple(out_dims, func)
         vmap_level = torch._C._vmapmode_increment_nesting()
         try:
             batched_inputs, batch_size = _create_batched_inputs(in_dims, args, vmap_level, func)
             batched_outputs = func(*batched_inputs)
-            _validate_outputs(batched_outputs, func)
-            return _unwrap_batched(batched_outputs, out_dims, vmap_level, batch_size, func)
+            if not allow_none_pass_through:
+                _validate_outputs(batched_outputs, func)
+            return _unwrap_batched(batched_outputs, out_dims, vmap_level, batch_size, func,
+                                   allow_none_pass_through=allow_none_pass_through)
         finally:
             torch._C._vmapmode_decrement_nesting()
     return wrapped
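Per the commit's own comment, the flag exists so _vmap can wrap calls that go through autograd.grad, which returns None for inputs that do not influence the output; without the flag, _validate_outputs would reject those Nones. A minimal self-contained sketch of just the pass-through logic; unwrap and the "+ 100" transform are hypothetical stand-ins for _unwrap_batched and torch._remove_batch_dim, not the real private API:

from typing import Optional, Tuple

def unwrap(outs: Tuple[Optional[int], ...],
           allow_none_pass_through: bool = False) -> Tuple[Optional[int], ...]:
    # "+ 100" stands in for torch._remove_batch_dim on each batched output.
    if allow_none_pass_through:
        # New behavior: None outputs (e.g. grads of unused inputs) are
        # forwarded untouched instead of being treated as an error.
        return tuple((o + 100 if o is not None else None) for o in outs)
    # Old behavior: every output is assumed to be a real value; a None
    # here would blow up, just as _validate_outputs used to reject it.
    return tuple(o + 100 for o in outs)

print(unwrap((1, 2, 3)))                                   # (101, 102, 103)
print(unwrap((1, None, 3), allow_none_pass_through=True))  # (101, None, 103)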

docs/master/_modules/torch/ao/quantization/fake_quantize.html

Lines changed: 1 addition & 1 deletion (same version-string update as docs/master/__config__.html)

docs/master/_modules/torch/ao/quantization/fuse_modules.html

Lines changed: 1 addition & 1 deletion (same version-string update as docs/master/__config__.html)

docs/master/_modules/torch/ao/quantization/observer.html

Lines changed: 1 addition & 1 deletion (same version-string update as docs/master/__config__.html)

docs/master/_modules/torch/ao/quantization/qconfig.html

Lines changed: 1 addition & 1 deletion (same version-string update as docs/master/__config__.html)

docs/master/_modules/torch/ao/quantization/quantize.html

Lines changed: 1 addition & 1 deletion (same version-string update as docs/master/__config__.html)

docs/master/_modules/torch/ao/quantization/stubs.html

Lines changed: 1 addition & 1 deletion (same version-string update as docs/master/__config__.html)

docs/master/_modules/torch/autocast_mode.html

Lines changed: 1 addition & 1 deletion (same version-string update as docs/master/__config__.html)
