
Commit 781291a

Generate Python docs from pytorch/pytorch@adf8b7e
1 parent 7b4e25e commit 781291a

819 files changed: +5905 additions, -1339 deletions



docs/master/__config__.html

Lines changed: 2 additions & 1 deletion
@@ -188,7 +188,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+49b090d &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+adf8b7e &#x25BC</a>
 </div>
 
 
@@ -270,6 +270,7 @@
 <li class="toctree-l1"><a class="reference internal" href="onnx.html">torch.onnx</a></li>
 <li class="toctree-l1"><a class="reference internal" href="optim.html">torch.optim</a></li>
 <li class="toctree-l1"><a class="reference internal" href="complex_numbers.html">Complex Numbers</a></li>
+<li class="toctree-l1"><a class="reference internal" href="pipeline.html">Pipeline Parallelism</a></li>
 <li class="toctree-l1"><a class="reference internal" href="quantization.html">Quantization</a></li>
 <li class="toctree-l1"><a class="reference internal" href="rpc.html">Distributed RPC Framework</a></li>
 <li class="toctree-l1"><a class="reference internal" href="random.html">torch.random</a></li>

docs/master/_images/no_pipe.png (binary image, 14.3 KB)

docs/master/_images/pipe.png (binary image, 37.1 KB)

docs/master/_modules/index.html

Lines changed: 4 additions & 1 deletion
@@ -187,7 +187,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+49b090d &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+adf8b7e &#x25BC</a>
 </div>
 
 
@@ -269,6 +269,7 @@
 <li class="toctree-l1"><a class="reference internal" href="../onnx.html">torch.onnx</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../optim.html">torch.optim</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../complex_numbers.html">Complex Numbers</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../pipeline.html">Pipeline Parallelism</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../quantization.html">Quantization</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../rpc.html">Distributed RPC Framework</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../random.html">torch.random</a></li>
@@ -410,6 +411,8 @@ <h1>All modules for which code is available</h1>
 <ul><li><a href="torch/distributed/autograd.html">torch.distributed.autograd</a></li>
 <li><a href="torch/distributed/distributed_c10d.html">torch.distributed.distributed_c10d</a></li>
 <li><a href="torch/distributed/optim/optimizer.html">torch.distributed.optim.optimizer</a></li>
+<li><a href="torch/distributed/pipeline/sync/pipe.html">torch.distributed.pipeline.sync.pipe</a></li>
+<li><a href="torch/distributed/pipeline/sync/skip/skippable.html">torch.distributed.pipeline.sync.skip.skippable</a></li>
 <li><a href="torch/distributed/rpc.html">torch.distributed.rpc</a></li>
 <ul><li><a href="torch/distributed/rpc/api.html">torch.distributed.rpc.api</a></li>
 <li><a href="torch/distributed/rpc/backend_registry.html">torch.distributed.rpc.backend_registry</a></li>
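The new `torch.distributed.pipeline.sync.pipe` module indexed above documents GPipe-style pipeline parallelism: a sequential model is split into stages, the input batch is split into micro-batches, and micro-batches flow through the stages so the stages can work concurrently. A torch-free sketch of the scheduling idea (the stage functions and `run_pipeline` helper are invented for illustration; the real `Pipe` wraps an `nn.Sequential` and manages devices and RPC for you):

```python
# Torch-free sketch of the micro-batching idea behind pipeline parallelism.
# In the real Pipe, each stage lives on its own device and micro-batches run
# staggered so stage k works on chunk i while stage k+1 works on chunk i-1;
# here we only model the splitting and per-stage flow, sequentially.

def run_pipeline(stages, batch, chunks):
    """Split `batch` into `chunks` micro-batches and push each micro-batch
    through every stage in order, then reassemble the outputs."""
    size = (len(batch) + chunks - 1) // chunks
    micro_batches = [batch[i:i + size] for i in range(0, len(batch), size)]
    outputs = []
    for mb in micro_batches:
        for stage in stages:          # each stage transforms the micro-batch
            mb = [stage(x) for x in mb]
        outputs.extend(mb)
    return outputs

# Two toy "stages" standing in for partitions of a model.
stages = [lambda x: x * 2, lambda x: x + 1]
print(run_pipeline(stages, [1, 2, 3, 4], chunks=2))  # [3, 5, 7, 9]
```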

docs/master/_modules/torch.html

Lines changed: 12 additions & 11 deletions
@@ -187,7 +187,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+49b090d &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+adf8b7e &#x25BC</a>
 </div>
 
 
@@ -269,6 +269,7 @@
 <li class="toctree-l1"><a class="reference internal" href="../onnx.html">torch.onnx</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../optim.html">torch.optim</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../complex_numbers.html">Complex Numbers</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../pipeline.html">Pipeline Parallelism</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../quantization.html">Quantization</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../rpc.html">Distributed RPC Framework</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../random.html">torch.random</a></li>
@@ -626,7 +627,7 @@ <h1>Source code for torch</h1><div class="highlight"><pre>
 <span class="k">return</span> <span class="n">module</span> <span class="o">+</span> <span class="n">class_name</span>
 
 
-<div class="viewcode-block" id="is_tensor"><a class="viewcode-back" href="../generated/torch.is_tensor.html#torch.is_tensor">[docs]</a><span class="k">def</span> <span class="nf">is_tensor</span><span class="p">(</span><span class="n">obj</span><span class="p">):</span>
+<span class="k">def</span> <span class="nf">is_tensor</span><span class="p">(</span><span class="n">obj</span><span class="p">):</span>
 <span class="sa">r</span><span class="sd">&quot;&quot;&quot;Returns True if `obj` is a PyTorch tensor.</span>
 
 <span class="sd"> Note that this function is simply doing ``isinstance(obj, Tensor)``.</span>
@@ -637,19 +638,19 @@ <h1>Source code for torch</h1><div class="highlight"><pre>
 <span class="sd"> Args:</span>
 <span class="sd"> obj (Object): Object to test</span>
 <span class="sd"> &quot;&quot;&quot;</span>
-<span class="k">return</span> <span class="nb">isinstance</span><span class="p">(</span><span class="n">obj</span><span class="p">,</span> <span class="n">torch</span><span class="o">.</span><span class="n">Tensor</span><span class="p">)</span></div>
+<span class="k">return</span> <span class="nb">isinstance</span><span class="p">(</span><span class="n">obj</span><span class="p">,</span> <span class="n">torch</span><span class="o">.</span><span class="n">Tensor</span><span class="p">)</span>
 
 
-<div class="viewcode-block" id="is_storage"><a class="viewcode-back" href="../generated/torch.is_storage.html#torch.is_storage">[docs]</a><span class="k">def</span> <span class="nf">is_storage</span><span class="p">(</span><span class="n">obj</span><span class="p">):</span>
+<span class="k">def</span> <span class="nf">is_storage</span><span class="p">(</span><span class="n">obj</span><span class="p">):</span>
 <span class="sa">r</span><span class="sd">&quot;&quot;&quot;Returns True if `obj` is a PyTorch storage object.</span>
 
 <span class="sd"> Args:</span>
 <span class="sd"> obj (Object): Object to test</span>
 <span class="sd"> &quot;&quot;&quot;</span>
-<span class="k">return</span> <span class="nb">type</span><span class="p">(</span><span class="n">obj</span><span class="p">)</span> <span class="ow">in</span> <span class="n">_storage_classes</span></div>
+<span class="k">return</span> <span class="nb">type</span><span class="p">(</span><span class="n">obj</span><span class="p">)</span> <span class="ow">in</span> <span class="n">_storage_classes</span>
 
 
-<span class="k">def</span> <span class="nf">set_default_tensor_type</span><span class="p">(</span><span class="n">t</span><span class="p">):</span>
+<div class="viewcode-block" id="set_default_tensor_type"><a class="viewcode-back" href="../generated/torch.set_default_tensor_type.html#torch.set_default_tensor_type">[docs]</a><span class="k">def</span> <span class="nf">set_default_tensor_type</span><span class="p">(</span><span class="n">t</span><span class="p">):</span>
 <span class="sa">r</span><span class="sd">&quot;&quot;&quot;Sets the default ``torch.Tensor`` type to floating point tensor type</span>
 <span class="sd"> ``t``. This type will also be used as default floating point type for</span>
 <span class="sd"> type inference in :func:`torch.tensor`.</span>
@@ -670,10 +671,10 @@ <h1>Source code for torch</h1><div class="highlight"><pre>
 <span class="sd"> &quot;&quot;&quot;</span>
 <span class="k">if</span> <span class="nb">isinstance</span><span class="p">(</span><span class="n">t</span><span class="p">,</span> <span class="n">_string_classes</span><span class="p">):</span>
 <span class="n">t</span> <span class="o">=</span> <span class="n">_import_dotted_name</span><span class="p">(</span><span class="n">t</span><span class="p">)</span>
-<span class="n">_C</span><span class="o">.</span><span class="n">_set_default_tensor_type</span><span class="p">(</span><span class="n">t</span><span class="p">)</span>
+<span class="n">_C</span><span class="o">.</span><span class="n">_set_default_tensor_type</span><span class="p">(</span><span class="n">t</span><span class="p">)</span></div>
 
 
-<span class="k">def</span> <span class="nf">set_default_dtype</span><span class="p">(</span><span class="n">d</span><span class="p">):</span>
+<div class="viewcode-block" id="set_default_dtype"><a class="viewcode-back" href="../generated/torch.set_default_dtype.html#torch.set_default_dtype">[docs]</a><span class="k">def</span> <span class="nf">set_default_dtype</span><span class="p">(</span><span class="n">d</span><span class="p">):</span>
 <span class="sa">r</span><span class="sd">&quot;&quot;&quot;Sets the default floating point dtype to :attr:`d`.</span>
 <span class="sd"> This dtype is:</span>
 
@@ -701,9 +702,9 @@ <h1>Source code for torch</h1><div class="highlight"><pre>
 <span class="sd"> torch.complex128</span>
 
 <span class="sd"> &quot;&quot;&quot;</span>
-<span class="n">_C</span><span class="o">.</span><span class="n">_set_default_dtype</span><span class="p">(</span><span class="n">d</span><span class="p">)</span>
+<span class="n">_C</span><span class="o">.</span><span class="n">_set_default_dtype</span><span class="p">(</span><span class="n">d</span><span class="p">)</span></div>
 
-<div class="viewcode-block" id="use_deterministic_algorithms"><a class="viewcode-back" href="../generated/torch.use_deterministic_algorithms.html#torch.use_deterministic_algorithms">[docs]</a><span class="k">def</span> <span class="nf">use_deterministic_algorithms</span><span class="p">(</span><span class="n">d</span><span class="p">):</span>
+<span class="k">def</span> <span class="nf">use_deterministic_algorithms</span><span class="p">(</span><span class="n">d</span><span class="p">):</span>
 <span class="sa">r</span><span class="sd">&quot;&quot;&quot; Sets whether PyTorch operations must use &quot;deterministic&quot;</span>
 <span class="sd"> algorithms. That is, algorithms which, given the same input, and when</span>
 <span class="sd"> run on the same software and hardware, always produce the same output.</span>
@@ -780,7 +781,7 @@ <h1>Source code for torch</h1><div class="highlight"><pre>
 <span class="sd"> d (:class:`bool`): If True, force operations to be deterministic.</span>
 <span class="sd"> If False, allow non-deterministic operations.</span>
 <span class="sd"> &quot;&quot;&quot;</span>
-<span class="n">_C</span><span class="o">.</span><span class="n">_set_deterministic_algorithms</span><span class="p">(</span><span class="n">d</span><span class="p">)</span></div>
+<span class="n">_C</span><span class="o">.</span><span class="n">_set_deterministic_algorithms</span><span class="p">(</span><span class="n">d</span><span class="p">)</span>
 
 <span class="k">def</span> <span class="nf">set_deterministic</span><span class="p">(</span><span class="n">d</span><span class="p">):</span>
 <span class="sa">r</span><span class="sd">&quot;&quot;&quot;This function is deprecated and will be removed in a future release.</span>
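The `is_tensor` and `is_storage` hunks above show that, per their own docstrings, both functions are plain type checks: `isinstance(obj, torch.Tensor)` versus exact-type membership in `_storage_classes`. A torch-free sketch of that pattern (the `Tensor` and `FloatStorage` classes here are stand-ins, not the real torch types):

```python
# Stand-in classes; in torch these are torch.Tensor and the storage classes.
class Tensor:
    pass

class FloatStorage:
    pass

_storage_classes = {FloatStorage}

def is_tensor(obj):
    """Mirrors torch.is_tensor: a simple isinstance check, so subclasses
    of Tensor also count."""
    return isinstance(obj, Tensor)

def is_storage(obj):
    """Mirrors torch.is_storage: exact-type membership, so subclasses of a
    storage class do NOT count. This asymmetry is worth noting."""
    return type(obj) in _storage_classes

print(is_tensor(Tensor()), is_storage(FloatStorage()), is_tensor(3))  # True True False
```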

docs/master/_modules/torch/__config__.html

Lines changed: 2 additions & 1 deletion
@@ -187,7 +187,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+49b090d &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+adf8b7e &#x25BC</a>
 </div>
 
 
@@ -269,6 +269,7 @@
 <li class="toctree-l1"><a class="reference internal" href="../../onnx.html">torch.onnx</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../optim.html">torch.optim</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../complex_numbers.html">Complex Numbers</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../pipeline.html">Pipeline Parallelism</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../quantization.html">Quantization</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../rpc.html">Distributed RPC Framework</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../random.html">torch.random</a></li>

docs/master/_modules/torch/_jit_internal.html

Lines changed: 6 additions & 5 deletions
@@ -187,7 +187,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+49b090d &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+adf8b7e &#x25BC</a>
 </div>
 
 
@@ -269,6 +269,7 @@
 <li class="toctree-l1"><a class="reference internal" href="../../onnx.html">torch.onnx</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../optim.html">torch.optim</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../complex_numbers.html">Complex Numbers</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../pipeline.html">Pipeline Parallelism</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../quantization.html">Quantization</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../rpc.html">Distributed RPC Framework</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../random.html">torch.random</a></li>
@@ -824,7 +825,7 @@ <h1>Source code for torch._jit_internal</h1><div class="highlight"><pre>
 <span class="k">return</span> <span class="n">fn</span></div>
 
 
-<span class="k">def</span> <span class="nf">unused</span><span class="p">(</span><span class="n">fn</span><span class="p">):</span>
+<div class="viewcode-block" id="unused"><a class="viewcode-back" href="../../generated/torch.jit.unused.html#torch.jit.unused">[docs]</a><span class="k">def</span> <span class="nf">unused</span><span class="p">(</span><span class="n">fn</span><span class="p">):</span>
 <span class="sd">&quot;&quot;&quot;</span>
 <span class="sd"> This decorator indicates to the compiler that a function or method should</span>
 <span class="sd"> be ignored and replaced with the raising of an exception. This allows you</span>
@@ -871,9 +872,9 @@ <h1>Source code for torch._jit_internal</h1><div class="highlight"><pre>
 <span class="k">return</span> <span class="n">prop</span>
 
 <span class="n">fn</span><span class="o">.</span><span class="n">_torchscript_modifier</span> <span class="o">=</span> <span class="n">FunctionModifiers</span><span class="o">.</span><span class="n">UNUSED</span>
-<span class="k">return</span> <span class="n">fn</span>
+<span class="k">return</span> <span class="n">fn</span></div>
 
-<div class="viewcode-block" id="ignore"><a class="viewcode-back" href="../../generated/torch.jit.ignore.html#torch.jit.ignore">[docs]</a><span class="k">def</span> <span class="nf">ignore</span><span class="p">(</span><span class="n">drop</span><span class="o">=</span><span class="kc">False</span><span class="p">,</span> <span class="o">**</span><span class="n">kwargs</span><span class="p">):</span>
+<span class="k">def</span> <span class="nf">ignore</span><span class="p">(</span><span class="n">drop</span><span class="o">=</span><span class="kc">False</span><span class="p">,</span> <span class="o">**</span><span class="n">kwargs</span><span class="p">):</span>
 <span class="sd">&quot;&quot;&quot;</span>
 <span class="sd"> This decorator indicates to the compiler that a function or method should</span>
 <span class="sd"> be ignored and left as a Python function. This allows you to leave code in</span>
@@ -964,7 +965,7 @@ <h1>Source code for torch._jit_internal</h1><div class="highlight"><pre>
 <span class="k">else</span><span class="p">:</span>
 <span class="n">fn</span><span class="o">.</span><span class="n">_torchscript_modifier</span> <span class="o">=</span> <span class="n">FunctionModifiers</span><span class="o">.</span><span class="n">IGNORE</span>
 <span class="k">return</span> <span class="n">fn</span>
-<span class="k">return</span> <span class="n">decorator</span></div>
+<span class="k">return</span> <span class="n">decorator</span>
 
 
 <span class="k">def</span> <span class="nf">_copy_to_script_wrapper</span><span class="p">(</span><span class="n">fn</span><span class="p">):</span>
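The `@unused` and `@ignore` hunks above both rely on the same mechanism: the decorator tags the function with a `_torchscript_modifier` attribute that the TorchScript compiler later inspects. A simplified, torch-free sketch of that marker-attribute pattern (the `FunctionModifiers` enum here is a stand-in for the real one in `torch._jit_internal`, and the real `ignore` also branches on `drop` and property handling, which is omitted):

```python
from enum import Enum

class FunctionModifiers(Enum):
    """Stand-in for torch._jit_internal.FunctionModifiers."""
    UNUSED = "unused"
    IGNORE = "ignore"

def unused(fn):
    # Tag the function; a compiler pass would later replace its body
    # with code that raises when called from compiled code.
    fn._torchscript_modifier = FunctionModifiers.UNUSED
    return fn

def ignore(drop=False, **kwargs):
    # Decorator factory: @ignore() returns the actual decorator.
    # The real implementation branches on `drop`; simplified here.
    def decorator(fn):
        fn._torchscript_modifier = FunctionModifiers.IGNORE
        return fn
    return decorator

@unused
def legacy_helper():
    raise RuntimeError("not meant to be compiled")

@ignore()
def python_only():
    return 42

print(python_only(), python_only._torchscript_modifier)
```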

docs/master/_modules/torch/_lobpcg.html

Lines changed: 2 additions & 1 deletion
@@ -187,7 +187,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+49b090d &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+adf8b7e &#x25BC</a>
 </div>
 
 
@@ -269,6 +269,7 @@
 <li class="toctree-l1"><a class="reference internal" href="../../onnx.html">torch.onnx</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../optim.html">torch.optim</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../complex_numbers.html">Complex Numbers</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../pipeline.html">Pipeline Parallelism</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../quantization.html">Quantization</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../rpc.html">Distributed RPC Framework</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../random.html">torch.random</a></li>

docs/master/_modules/torch/_lowrank.html

Lines changed: 2 additions & 1 deletion
@@ -187,7 +187,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+49b090d &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+adf8b7e &#x25BC</a>
 </div>
 
 
@@ -269,6 +269,7 @@
 <li class="toctree-l1"><a class="reference internal" href="../../onnx.html">torch.onnx</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../optim.html">torch.optim</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../complex_numbers.html">Complex Numbers</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../pipeline.html">Pipeline Parallelism</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../quantization.html">Quantization</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../rpc.html">Distributed RPC Framework</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../random.html">torch.random</a></li>

docs/master/_modules/torch/_tensor_str.html

Lines changed: 2 additions & 1 deletion
@@ -187,7 +187,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+49b090d &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.8.0a0+adf8b7e &#x25BC</a>
 </div>
 
 
@@ -269,6 +269,7 @@
 <li class="toctree-l1"><a class="reference internal" href="../../onnx.html">torch.onnx</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../optim.html">torch.optim</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../complex_numbers.html">Complex Numbers</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../pipeline.html">Pipeline Parallelism</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../quantization.html">Quantization</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../rpc.html">Distributed RPC Framework</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../random.html">torch.random</a></li>
