Commit 8d31bc6 — "auto-generating sphinx docs"
1 parent: 5afc0ad
File tree: 1,143 files changed (+29,631 / −300,375 lines)


docs/master/__config__.html (2 additions, 2 deletions)

@@ -159,7 +159,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.6.0a0+fd05deb &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.6.0a0+c0ff085 &#x25BC</a>
 </div>
 
 
@@ -234,7 +234,7 @@
 <li class="toctree-l1"><a class="reference internal" href="onnx.html">torch.onnx</a></li>
 <li class="toctree-l1"><a class="reference internal" href="optim.html">torch.optim</a></li>
 <li class="toctree-l1"><a class="reference internal" href="quantization.html">Quantization</a></li>
-<li class="toctree-l1"><a class="reference internal" href="rpc.html">Distributed RPC Framework</a></li>
+<li class="toctree-l1"><a class="reference internal" href="rpc/index.html">Distributed RPC Framework</a></li>
 <li class="toctree-l1"><a class="reference internal" href="random.html">torch.random</a></li>
 <li class="toctree-l1"><a class="reference internal" href="sparse.html">torch.sparse</a></li>
 <li class="toctree-l1"><a class="reference internal" href="storage.html">torch.Storage</a></li>

docs/master/_images/ReLU61.png (binary file, −24 KB; content not shown)

docs/master/_modules/index.html (2 additions, 2 deletions)

@@ -158,7 +158,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.6.0a0+fd05deb &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.6.0a0+c0ff085 &#x25BC</a>
 </div>
 
 
@@ -233,7 +233,7 @@
 <li class="toctree-l1"><a class="reference internal" href="../onnx.html">torch.onnx</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../optim.html">torch.optim</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../quantization.html">Quantization</a></li>
-<li class="toctree-l1"><a class="reference internal" href="../rpc.html">Distributed RPC Framework</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../rpc/index.html">Distributed RPC Framework</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../random.html">torch.random</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../sparse.html">torch.sparse</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../storage.html">torch.Storage</a></li>

docs/master/_modules/torch.html (7 additions, 8 deletions)

@@ -158,7 +158,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.6.0a0+fd05deb &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.6.0a0+c0ff085 &#x25BC</a>
 </div>
 
 
@@ -233,7 +233,7 @@
 <li class="toctree-l1"><a class="reference internal" href="../onnx.html">torch.onnx</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../optim.html">torch.optim</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../quantization.html">Quantization</a></li>
-<li class="toctree-l1"><a class="reference internal" href="../rpc.html">Distributed RPC Framework</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../rpc/index.html">Distributed RPC Framework</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../random.html">torch.random</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../sparse.html">torch.sparse</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../storage.html">torch.Storage</a></li>
@@ -504,7 +504,7 @@ <h1>Source code for torch</h1><div class="highlight"><pre>
 <span class="k">return</span> <span class="n">module</span> <span class="o">+</span> <span class="n">class_name</span>
 
 
-<div class="viewcode-block" id="is_tensor"><a class="viewcode-back" href="../generated/torch.is_tensor.html#torch.is_tensor">[docs]</a><span class="k">def</span> <span class="nf">is_tensor</span><span class="p">(</span><span class="n">obj</span><span class="p">):</span>
+<div class="viewcode-block" id="is_tensor"><a class="viewcode-back" href="../torch.html#torch.is_tensor">[docs]</a><span class="k">def</span> <span class="nf">is_tensor</span><span class="p">(</span><span class="n">obj</span><span class="p">):</span>
 <span class="sa">r</span><span class="sd">&quot;&quot;&quot;Returns True if `obj` is a PyTorch tensor.</span>
 
 <span class="sd"> Args:</span>
@@ -513,7 +513,7 @@ <h1>Source code for torch</h1><div class="highlight"><pre>
 <span class="k">return</span> <span class="nb">isinstance</span><span class="p">(</span><span class="n">obj</span><span class="p">,</span> <span class="n">torch</span><span class="o">.</span><span class="n">Tensor</span><span class="p">)</span></div>
 
 
-<div class="viewcode-block" id="is_storage"><a class="viewcode-back" href="../generated/torch.is_storage.html#torch.is_storage">[docs]</a><span class="k">def</span> <span class="nf">is_storage</span><span class="p">(</span><span class="n">obj</span><span class="p">):</span>
+<div class="viewcode-block" id="is_storage"><a class="viewcode-back" href="../torch.html#torch.is_storage">[docs]</a><span class="k">def</span> <span class="nf">is_storage</span><span class="p">(</span><span class="n">obj</span><span class="p">):</span>
 <span class="sa">r</span><span class="sd">&quot;&quot;&quot;Returns True if `obj` is a PyTorch storage object.</span>
 
 <span class="sd"> Args:</span>
@@ -522,7 +522,7 @@ <h1>Source code for torch</h1><div class="highlight"><pre>
 <span class="k">return</span> <span class="nb">type</span><span class="p">(</span><span class="n">obj</span><span class="p">)</span> <span class="ow">in</span> <span class="n">_storage_classes</span></div>
 
 
-<div class="viewcode-block" id="set_default_tensor_type"><a class="viewcode-back" href="../generated/torch.set_default_tensor_type.html#torch.set_default_tensor_type">[docs]</a><span class="k">def</span> <span class="nf">set_default_tensor_type</span><span class="p">(</span><span class="n">t</span><span class="p">):</span>
+<div class="viewcode-block" id="set_default_tensor_type"><a class="viewcode-back" href="../torch.html#torch.set_default_tensor_type">[docs]</a><span class="k">def</span> <span class="nf">set_default_tensor_type</span><span class="p">(</span><span class="n">t</span><span class="p">):</span>
 <span class="sa">r</span><span class="sd">&quot;&quot;&quot;Sets the default ``torch.Tensor`` type to floating point tensor type</span>
 <span class="sd"> ``t``. This type will also be used as default floating point type for</span>
 <span class="sd"> type inference in :func:`torch.tensor`.</span>
@@ -546,7 +546,7 @@ <h1>Source code for torch</h1><div class="highlight"><pre>
 <span class="n">_C</span><span class="o">.</span><span class="n">_set_default_tensor_type</span><span class="p">(</span><span class="n">t</span><span class="p">)</span></div>
 
 
-<div class="viewcode-block" id="set_default_dtype"><a class="viewcode-back" href="../generated/torch.set_default_dtype.html#torch.set_default_dtype">[docs]</a><span class="k">def</span> <span class="nf">set_default_dtype</span><span class="p">(</span><span class="n">d</span><span class="p">):</span>
+<div class="viewcode-block" id="set_default_dtype"><a class="viewcode-back" href="../torch.html#torch.set_default_dtype">[docs]</a><span class="k">def</span> <span class="nf">set_default_dtype</span><span class="p">(</span><span class="n">d</span><span class="p">):</span>
 <span class="sa">r</span><span class="sd">&quot;&quot;&quot;Sets the default floating point dtype to :attr:`d`. This type will be</span>
 <span class="sd"> used as default floating point type for type inference in</span>
 <span class="sd"> :func:`torch.tensor`.</span>
@@ -667,7 +667,6 @@ <h1>Source code for torch</h1><div class="highlight"><pre>
 <span class="k">if</span> <span class="n">name</span><span class="o">.</span><span class="n">startswith</span><span class="p">(</span><span class="s1">&#39;__&#39;</span><span class="p">):</span>
 <span class="k">continue</span>
 <span class="nb">globals</span><span class="p">()[</span><span class="n">name</span><span class="p">]</span> <span class="o">=</span> <span class="nb">getattr</span><span class="p">(</span><span class="n">_C</span><span class="o">.</span><span class="n">_VariableFunctions</span><span class="p">,</span> <span class="n">name</span><span class="p">)</span>
-<span class="n">__all__</span><span class="o">.</span><span class="n">append</span><span class="p">(</span><span class="n">name</span><span class="p">)</span>
 
 <span class="c1">################################################################################</span>
 <span class="c1"># Import interface functions defined in Python</span>
@@ -731,7 +730,7 @@ <h1>Source code for torch</h1><div class="highlight"><pre>
 <span class="k">del</span> <span class="n">_torch_docs</span><span class="p">,</span> <span class="n">_tensor_docs</span><span class="p">,</span> <span class="n">_storage_docs</span>
 
 
-<div class="viewcode-block" id="compiled_with_cxx11_abi"><a class="viewcode-back" href="../generated/torch.compiled_with_cxx11_abi.html#torch.compiled_with_cxx11_abi">[docs]</a><span class="k">def</span> <span class="nf">compiled_with_cxx11_abi</span><span class="p">():</span>
+<div class="viewcode-block" id="compiled_with_cxx11_abi"><a class="viewcode-back" href="../torch.html#torch.compiled_with_cxx11_abi">[docs]</a><span class="k">def</span> <span class="nf">compiled_with_cxx11_abi</span><span class="p">():</span>
 <span class="sa">r</span><span class="sd">&quot;&quot;&quot;Returns whether PyTorch was built with _GLIBCXX_USE_CXX11_ABI=1&quot;&quot;&quot;</span>
 <span class="k">return</span> <span class="n">_C</span><span class="o">.</span><span class="n">_GLIBCXX_USE_CXX11_ABI</span></div>
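The torch.html hunks above retarget the viewcode back-links for `torch.is_tensor`, `torch.is_storage`, `torch.set_default_tensor_type`, `torch.set_default_dtype`, and `torch.compiled_with_cxx11_abi` from per-function `generated/` pages back to the monolithic `torch.html`. As a minimal sketch of what those documented helpers do (the example is illustrative and not part of the commit):

```python
import torch

# torch.is_tensor: True only for torch.Tensor instances.
x = torch.zeros(3)
print(torch.is_tensor(x))             # True
print(torch.is_tensor([1.0, 2.0]))    # False: a plain list is not a tensor

# torch.set_default_dtype: controls the floating point dtype inferred for
# Python floats in torch.tensor(...), as the docstring in the diff describes.
torch.set_default_dtype(torch.float64)
print(torch.tensor([1.5]).dtype)      # torch.float64
torch.set_default_dtype(torch.float32)  # restore the usual default

# torch.compiled_with_cxx11_abi: reports a build-time flag of this install.
print(torch.compiled_with_cxx11_abi())
```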

docs/master/_modules/torch/__config__.html (2 additions, 2 deletions)

@@ -158,7 +158,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.6.0a0+fd05deb &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.6.0a0+c0ff085 &#x25BC</a>
 </div>
 
 
@@ -233,7 +233,7 @@
 <li class="toctree-l1"><a class="reference internal" href="../../onnx.html">torch.onnx</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../optim.html">torch.optim</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../quantization.html">Quantization</a></li>
-<li class="toctree-l1"><a class="reference internal" href="../../rpc.html">Distributed RPC Framework</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../rpc/index.html">Distributed RPC Framework</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../random.html">torch.random</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../sparse.html">torch.sparse</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../storage.html">torch.Storage</a></li>

docs/master/_modules/torch/_jit_internal.html (4 additions, 4 deletions)

@@ -158,7 +158,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.6.0a0+fd05deb &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.6.0a0+c0ff085 &#x25BC</a>
 </div>
 
 
@@ -233,7 +233,7 @@
 <li class="toctree-l1"><a class="reference internal" href="../../onnx.html">torch.onnx</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../optim.html">torch.optim</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../quantization.html">Quantization</a></li>
-<li class="toctree-l1"><a class="reference internal" href="../../rpc.html">Distributed RPC Framework</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../rpc/index.html">Distributed RPC Framework</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../random.html">torch.random</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../sparse.html">torch.sparse</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../storage.html">torch.Storage</a></li>
@@ -634,7 +634,7 @@ <h1>Source code for torch._jit_internal</h1><div class="highlight"><pre>
 <span class="k">return</span> <span class="n">fn</span></div>
 
 
-<div class="viewcode-block" id="unused"><a class="viewcode-back" href="../../generated/torch.jit.unused.html#torch.jit.unused">[docs]</a><span class="k">def</span> <span class="nf">unused</span><span class="p">(</span><span class="n">fn</span><span class="p">):</span>
+<div class="viewcode-block" id="unused"><a class="viewcode-back" href="../../jit.html#torch.jit.unused">[docs]</a><span class="k">def</span> <span class="nf">unused</span><span class="p">(</span><span class="n">fn</span><span class="p">):</span>
 <span class="sd">&quot;&quot;&quot;</span>
 <span class="sd"> This decorator indicates to the compiler that a function or method should</span>
 <span class="sd"> be ignored and replaced with the raising of an exception. This allows you</span>
@@ -674,7 +674,7 @@ <h1>Source code for torch._jit_internal</h1><div class="highlight"><pre>
 <span class="n">fn</span><span class="o">.</span><span class="n">_torchscript_modifier</span> <span class="o">=</span> <span class="n">FunctionModifiers</span><span class="o">.</span><span class="n">UNUSED</span>
 <span class="k">return</span> <span class="n">fn</span></div>
 
-<div class="viewcode-block" id="ignore"><a class="viewcode-back" href="../../generated/torch.jit.ignore.html#torch.jit.ignore">[docs]</a><span class="k">def</span> <span class="nf">ignore</span><span class="p">(</span><span class="n">drop</span><span class="o">=</span><span class="kc">False</span><span class="p">,</span> <span class="o">**</span><span class="n">kwargs</span><span class="p">):</span>
+<div class="viewcode-block" id="ignore"><a class="viewcode-back" href="../../jit.html#torch.jit.ignore">[docs]</a><span class="k">def</span> <span class="nf">ignore</span><span class="p">(</span><span class="n">drop</span><span class="o">=</span><span class="kc">False</span><span class="p">,</span> <span class="o">**</span><span class="n">kwargs</span><span class="p">):</span>
 <span class="sd">&quot;&quot;&quot;</span>
 <span class="sd"> This decorator indicates to the compiler that a function or method should</span>
 <span class="sd"> be ignored and left as a Python function. This allows you to leave code in</span>
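The `_jit_internal.html` hunks retarget the viewcode links for `torch.jit.unused` and `torch.jit.ignore` to `jit.html`. A minimal sketch of the behavior the `unused` docstring describes (compile the call as an exception raise, so uncompilable debug code can stay in the module) — the module and method names below are illustrative:

```python
import torch

class MyModule(torch.nn.Module):
    def forward(self, x):
        if self.training:
            # TorchScript compiles this call as a raise; it only fails
            # if the branch is actually taken.
            return self.debug_path(x)
        return x + 1

    @torch.jit.unused
    def debug_path(self, x):
        # Arbitrary Python that TorchScript cannot compile is fine here,
        # because the body is never compiled.
        import pdb; pdb.set_trace()
        return x

scripted = torch.jit.script(MyModule())
scripted.eval()                  # stay off the unused training branch
print(scripted(torch.ones(2)))   # tensor([2., 2.])
```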

docs/master/_modules/torch/_lobpcg.html (6 additions, 6 deletions)

@@ -158,7 +158,7 @@
 
 
 <div class="version">
-<a href='http://pytorch.org/docs/versions.html'>1.6.0a0+fd05deb &#x25BC</a>
+<a href='http://pytorch.org/docs/versions.html'>1.6.0a0+c0ff085 &#x25BC</a>
 </div>
 
 
@@ -233,7 +233,7 @@
 <li class="toctree-l1"><a class="reference internal" href="../../onnx.html">torch.onnx</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../optim.html">torch.optim</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../quantization.html">Quantization</a></li>
-<li class="toctree-l1"><a class="reference internal" href="../../rpc.html">Distributed RPC Framework</a></li>
+<li class="toctree-l1"><a class="reference internal" href="../../rpc/index.html">Distributed RPC Framework</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../random.html">torch.random</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../sparse.html">torch.sparse</a></li>
 <li class="toctree-l1"><a class="reference internal" href="../../storage.html">torch.Storage</a></li>
@@ -352,7 +352,7 @@ <h1>Source code for torch._lobpcg</h1><div class="highlight"><pre>
 <span class="n">__all__</span> <span class="o">=</span> <span class="p">[</span><span class="s1">&#39;lobpcg&#39;</span><span class="p">]</span>
 
 
-<div class="viewcode-block" id="lobpcg"><a class="viewcode-back" href="../../generated/torch.lobpcg.html#torch.lobpcg">[docs]</a><span class="k">def</span> <span class="nf">lobpcg</span><span class="p">(</span><span class="n">A</span><span class="p">,</span> <span class="c1"># type: Tensor</span>
+<div class="viewcode-block" id="lobpcg"><a class="viewcode-back" href="../../torch.html#torch.lobpcg">[docs]</a><span class="k">def</span> <span class="nf">lobpcg</span><span class="p">(</span><span class="n">A</span><span class="p">,</span> <span class="c1"># type: Tensor</span>
 <span class="n">k</span><span class="o">=</span><span class="kc">None</span><span class="p">,</span> <span class="c1"># type: Optional[int]</span>
 <span class="n">B</span><span class="o">=</span><span class="kc">None</span><span class="p">,</span> <span class="c1"># type: Optional[Tensor]</span>
 <span class="n">X</span><span class="o">=</span><span class="kc">None</span><span class="p">,</span> <span class="c1"># type: Optional[Tensor]</span>
@@ -487,18 +487,18 @@ <h1>Source code for torch._lobpcg</h1><div class="highlight"><pre>
 <span class="sd"> Preconditioned Eigensolver: Locally Optimal Block Preconditioned</span>
 <span class="sd"> Conjugate Gradient Method. SIAM J. Sci. Comput., 23(2),</span>
 <span class="sd"> 517-541. (25 pages)</span>
-<span class="sd"> https://epubs.siam.org/doi/abs/10.1137/S1064827500366124</span>
+<span class="sd"> `https://epubs.siam.org/doi/abs/10.1137/S1064827500366124`_</span>
 
 <span class="sd"> [StathopoulosEtal2002] Andreas Stathopoulos and Kesheng</span>
 <span class="sd"> Wu. (2002) A Block Orthogonalization Procedure with Constant</span>
 <span class="sd"> Synchronization Requirements. SIAM J. Sci. Comput., 23(6),</span>
 <span class="sd"> 2165-2182. (18 pages)</span>
-<span class="sd"> https://epubs.siam.org/doi/10.1137/S1064827500370883</span>
+<span class="sd"> `https://epubs.siam.org/doi/10.1137/S1064827500370883`_</span>
 
 <span class="sd"> [DuerschEtal2018] Jed A. Duersch, Meiyue Shao, Chao Yang, Ming</span>
 <span class="sd"> Gu. (2018) A Robust and Efficient Implementation of LOBPCG.</span>
 <span class="sd"> SIAM J. Sci. Comput., 40(5), C655-C676. (22 pages)</span>
-<span class="sd"> https://epubs.siam.org/doi/abs/10.1137/17M1129830</span>
+<span class="sd"> `https://epubs.siam.org/doi/abs/10.1137/17M1129830`_</span>
 
 <span class="sd"> &quot;&quot;&quot;</span>
 
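The `_lobpcg.html` hunks fix the viewcode link for `torch.lobpcg` and wrap the SIAM reference URLs in reST hyperlink markup. A small usage sketch of `torch.lobpcg` as documented — the matrix construction and tolerance here are illustrative assumptions, not from the commit:

```python
import torch

torch.manual_seed(0)

# A small symmetric positive-definite matrix, so the extreme eigenpairs
# targeted by LOBPCG are well conditioned.
n = 10
M = torch.randn(n, n, dtype=torch.float64)
A = M @ M.t() + n * torch.eye(n, dtype=torch.float64)

# By default torch.lobpcg returns the k largest eigenvalues and the
# corresponding eigenvectors (shape (n, k)).
eigvals, eigvecs = torch.lobpcg(A, k=2)

# Residual check: A v should be close to lambda * v for each pair.
residual = A @ eigvecs - eigvecs * eigvals
print(residual.abs().max() < 1e-4)
```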
