10 changes: 10 additions & 0 deletions beginner_source/examples_autograd/polynomial_custom_function.py
@@ -16,6 +16,16 @@

In this implementation we implement our own custom autograd function to perform
:math:`P_3'(x)`. By mathematics, :math:`P_3'(x)=\\frac{3}{2}\\left(5x^2-1\\right)`

.. note::
   This example is designed to demonstrate the mechanics of gradient descent and
   backpropagation, not to achieve a perfect fit. A third-degree polynomial is
   fundamentally limited in how well it can approximate :math:`\\sin(x)` over the
   range :math:`[-\\pi, \\pi]`: the Taylor series for sine needs higher-order
   terms (degree 5, 7, and beyond) for better accuracy. The resulting polynomial
   fits reasonably well near zero but diverges from :math:`\\sin(x)` as you
   approach :math:`\\pm\\pi`. This is expected, and it illustrates the importance
   of choosing an appropriate model architecture for your problem.
"""
import torch
import math
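The note's claim can be checked numerically. A minimal sketch (not part of the PR; assumes NumPy is available) that computes the least-squares cubic fit to :math:`\sin(x)`, which is the polynomial the tutorial's gradient descent converges toward, and compares the error near zero with the error near the endpoints:

```python
# Illustrative sketch: quantify how a cubic fit to sin(x) behaves on [-pi, pi].
import numpy as np

x = np.linspace(-np.pi, np.pi, 2000)
y = np.sin(x)

# Best-fit cubic in the least-squares sense; minimizing the tutorial's MSE
# loss by gradient descent approaches this same polynomial.
coeffs = np.polynomial.polynomial.polyfit(x, y, deg=3)
y_fit = np.polynomial.polynomial.polyval(x, coeffs)

err = np.abs(y - y_fit)
print(f"max |error| for |x| < 0.5: {err[np.abs(x) < 0.5].max():.3f}")
print(f"max |error| for |x| > 3.0: {err[np.abs(x) > 3.0].max():.3f}")
```

The endpoint error comes out several times larger than the near-zero error, which is exactly the divergence toward :math:`\pm\pi` that the note describes.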
10 changes: 10 additions & 0 deletions beginner_source/examples_nn/polynomial_module.py
@@ -9,6 +9,16 @@
This implementation defines the model as a custom Module subclass. Whenever you
want a model more complex than a simple sequence of existing Modules you will
need to define your model this way.

.. note::
   This example is designed to demonstrate the mechanics of gradient descent and
   backpropagation, not to achieve a perfect fit. A third-degree polynomial is
   fundamentally limited in how well it can approximate :math:`\\sin(x)` over the
   range :math:`[-\\pi, \\pi]`: the Taylor series for sine needs higher-order
   terms (degree 5, 7, and beyond) for better accuracy. The resulting polynomial
   fits reasonably well near zero but diverges from :math:`\\sin(x)` as you
   approach :math:`\\pm\\pi`. This is expected, and it illustrates the importance
   of choosing an appropriate model architecture for your problem.
"""
import torch
import math
10 changes: 10 additions & 0 deletions beginner_source/examples_nn/polynomial_nn.py
@@ -12,6 +12,16 @@
this is where the nn package can help. The nn package defines a set of Modules,
which you can think of as a neural network layer that produces output from
input and may have some trainable weights.

.. note::
   This example is designed to demonstrate the mechanics of gradient descent and
   backpropagation, not to achieve a perfect fit. A third-degree polynomial is
   fundamentally limited in how well it can approximate :math:`\\sin(x)` over the
   range :math:`[-\\pi, \\pi]`: the Taylor series for sine needs higher-order
   terms (degree 5, 7, and beyond) for better accuracy. The resulting polynomial
   fits reasonably well near zero but diverges from :math:`\\sin(x)` as you
   approach :math:`\\pm\\pi`. This is expected, and it illustrates the importance
   of choosing an appropriate model architecture for your problem.
"""
import torch
import math
10 changes: 10 additions & 0 deletions beginner_source/examples_nn/polynomial_optim.py
@@ -12,6 +12,16 @@
we use the optim package to define an Optimizer that will update the weights
for us. The optim package defines many optimization algorithms that are commonly
used for deep learning, including SGD+momentum, RMSProp, Adam, etc.

.. note::
   This example is designed to demonstrate the mechanics of gradient descent and
   backpropagation, not to achieve a perfect fit. A third-degree polynomial is
   fundamentally limited in how well it can approximate :math:`\\sin(x)` over the
   range :math:`[-\\pi, \\pi]`: the Taylor series for sine needs higher-order
   terms (degree 5, 7, and beyond) for better accuracy. The resulting polynomial
   fits reasonably well near zero but diverges from :math:`\\sin(x)` as you
   approach :math:`\\pm\\pi`. This is expected, and it illustrates the importance
   of choosing an appropriate model architecture for your problem.
"""
import torch
import math
10 changes: 10 additions & 0 deletions beginner_source/examples_tensor/polynomial_numpy.py
@@ -12,6 +12,16 @@
A numpy array is a generic n-dimensional array; it does not know anything about
deep learning or gradients or computational graphs, and is just a way to perform
generic numeric computations.

.. note::
   This example is designed to demonstrate the mechanics of gradient descent and
   backpropagation, not to achieve a perfect fit. A third-degree polynomial is
   fundamentally limited in how well it can approximate :math:`\\sin(x)` over the
   range :math:`[-\\pi, \\pi]`: the Taylor series for sine needs higher-order
   terms (degree 5, 7, and beyond) for better accuracy. The resulting polynomial
   fits reasonably well near zero but diverges from :math:`\\sin(x)` as you
   approach :math:`\\pm\\pi`. This is expected, and it illustrates the importance
   of choosing an appropriate model architecture for your problem.
"""
import numpy as np
import math
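The note's remark about higher-order Taylor terms can be made concrete with a short calculation (illustrative, not part of the PR). The series for sine is

.. math::

   \sin(x) = x - \frac{x^3}{3!} + \frac{x^5}{5!} - \frac{x^7}{7!} + \cdots

Truncating at degree 3 and evaluating at the endpoint gives :math:`\pi - \pi^3/6 \approx 3.1416 - 5.1677 = -2.0261`, while :math:`\sin(\pi) = 0`. Adding the degree-5 term (:math:`\pi^5/120 \approx 2.5502`) cuts the endpoint error to about :math:`0.52`, and the degree-7 term shrinks it to about :math:`0.075`, which is why a cubic model cannot track :math:`\sin(x)` near :math:`\pm\pi`.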
10 changes: 10 additions & 0 deletions beginner_source/examples_tensor/polynomial_tensor.py
@@ -16,6 +16,16 @@
The biggest difference between a numpy array and a PyTorch Tensor is that
a PyTorch Tensor can run on either CPU or GPU. To run operations on the GPU,
just cast the Tensor to a cuda datatype.

.. note::
   This example is designed to demonstrate the mechanics of gradient descent and
   backpropagation, not to achieve a perfect fit. A third-degree polynomial is
   fundamentally limited in how well it can approximate :math:`\\sin(x)` over the
   range :math:`[-\\pi, \\pi]`: the Taylor series for sine needs higher-order
   terms (degree 5, 7, and beyond) for better accuracy. The resulting polynomial
   fits reasonably well near zero but diverges from :math:`\\sin(x)` as you
   approach :math:`\\pm\\pi`. This is expected, and it illustrates the importance
   of choosing an appropriate model architecture for your problem.
"""

import torch