Commit f434c5f

Merge pull request #598 from pytorch/1.8-fx-code-delete
Update 2021-3-4-pytorch-1.8-released.md
2 parents: 23eb6cd + 95ce30e

1 file changed: +0 −23 lines


_posts/2021-3-4-pytorch-1.8-released.md

Lines changed: 0 additions & 23 deletions
````diff
@@ -31,29 +31,6 @@ This kind of functionality is applicable in many scenarios. For example, the FX-
 
 Because FX transforms consume and produce nn.Module instances, they can be used within many existing PyTorch workflows. This includes workflows that, for example, train in Python then deploy via TorchScript.
 
-Below is an FX transform example:
-
-```python
-import torch
-import torch.fx
-
-def transform(m: nn.Module,
-              tracer_class : type = torch.fx.Tracer) -> torch.nn.Module:
-    # Step 1: Acquire a Graph representing the code in `m`
-
-    # NOTE: torch.fx.symbolic_trace is a wrapper around a call to
-    # fx.Tracer.trace and constructing a GraphModule. We'll
-    # split that out in our transform to allow the caller to
-    # customize tracing behavior.
-    graph : torch.fx.Graph = tracer_class().trace(m)
-
-    # Step 2: Modify this Graph or create a new one
-    graph = ...
-
-    # Step 3: Construct a Module to return
-    return torch.fx.GraphModule(m, graph)
-```
 You can read more about FX in the official [documentation](https://pytorch.org/docs/master/fx.html). You can also find several examples of program transformations implemented using ```torch.fx``` [here](https://github.com/pytorch/examples/tree/master/fx). We are constantly improving FX and invite you to share any feedback you have about the toolkit on the [forums](https://discuss.pytorch.org/) or [issue tracker](https://github.com/pytorch/pytorch/issues).
 
 # Distributed Training
````

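The deleted snippet leaves Step 2 abstract (`graph = ...`). As a rough, runnable sketch of what a complete pass could look like (not taken from the post or this commit), the example below fills in Step 2 with a simple rewrite that swaps `torch.add` for `torch.mul`, then scripts the resulting `GraphModule` with TorchScript to illustrate the claim above that FX transforms consume and produce ordinary `nn.Module` instances. The toy module `M` and the add-to-mul rewrite are illustrative assumptions.

```python
# Illustrative sketch only: the toy module `M` and the add-to-mul rewrite are
# assumptions for demonstration, not part of the deleted blog-post example.
import torch
import torch.fx


class M(torch.nn.Module):
    def forward(self, x, y):
        return torch.add(x, y)


def transform(m: torch.nn.Module,
              tracer_class: type = torch.fx.Tracer) -> torch.nn.Module:
    # Step 1: acquire a Graph representing the code in `m`
    graph: torch.fx.Graph = tracer_class().trace(m)

    # Step 2: modify the Graph: replace every call to torch.add with torch.mul
    for node in graph.nodes:
        if node.op == "call_function" and node.target == torch.add:
            node.target = torch.mul
    graph.lint()  # sanity-check the rewritten graph

    # Step 3: construct and return a new Module built from the modified Graph
    return torch.fx.GraphModule(m, graph)


transformed = transform(M())
x, y = torch.randn(3), torch.randn(3)
assert torch.equal(transformed(x, y), x * y)

# Because the result is an ordinary nn.Module, it fits the usual
# train-in-Python / deploy-via-TorchScript workflow mentioned above.
scripted = torch.jit.script(transformed)
assert torch.equal(scripted(x, y), x * y)
```

Under the same assumptions, `torch.fx.symbolic_trace(M())` would produce an equivalent starting `GraphModule` in a single call, which is the relationship the NOTE in the deleted snippet describes.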