diff --git a/_posts/2021-3-4-pytorch-1.8-released.md b/_posts/2021-3-4-pytorch-1.8-released.md
index c2eaea405218..84d17c37d4bc 100644
--- a/_posts/2021-3-4-pytorch-1.8-released.md
+++ b/_posts/2021-3-4-pytorch-1.8-released.md
@@ -31,29 +31,6 @@ This kind of functionality is applicable in many scenarios. For example, the FX-
 
 Because FX transforms consume and produce nn.Module instances, they can be used within many existing PyTorch workflows. This includes workflows that, for example, train in Python then deploy via TorchScript.
 
-Below is an FX transform example:
-
-```python
-import torch
-import torch.fx
-
-def transform(m: nn.Module,
-              tracer_class : type = torch.fx.Tracer) -> torch.nn.Module:
-    # Step 1: Acquire a Graph representing the code in `m`
-
-    # NOTE: torch.fx.symbolic_trace is a wrapper around a call to
-    # fx.Tracer.trace and constructing a GraphModule. We'll
-    # split that out in our transform to allow the caller to
-    # customize tracing behavior.
-    graph : torch.fx.Graph = tracer_class().trace(m)
-
-    # Step 2: Modify this Graph or create a new one
-    graph = ...
-
-    # Step 3: Construct a Module to return
-    return torch.fx.GraphModule(m, graph)
-
-```
 You can read more about FX in the official [documentation](https://pytorch.org/docs/master/fx.html). You can also find several examples of program transformations implemented using ```torch.fx``` [here](https://github.com/pytorch/examples/tree/master/fx). We are constantly improving FX and invite you to share any feedback you have about the toolkit on the [forums](https://discuss.pytorch.org/) or [issue tracker](https://github.com/pytorch/pytorch/issues).
 
 # Distributed Training
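
A note for reviewers: the hunk above drops the FX transform walkthrough from the post. For anyone who wants a working copy of that pattern, below is a minimal runnable sketch based on the removed snippet. Two things are adjusted relative to the original: `m` is annotated as `torch.nn.Module` (the removed code wrote `nn.Module` without ever importing `nn`), and the `graph = ...` stub is filled in with a purely illustrative rewrite that swaps `torch.add` calls for `torch.mul`; a real transform would substitute its own Graph manipulation at that step.

```python
import torch
import torch.fx

def transform(m: torch.nn.Module,
              tracer_class: type = torch.fx.Tracer) -> torch.nn.Module:
    # Step 1: Acquire a Graph representing the code in `m`.
    # NOTE: torch.fx.symbolic_trace is a wrapper around a call to
    # fx.Tracer.trace plus GraphModule construction; splitting that
    # out lets the caller customize tracing via `tracer_class`.
    graph: torch.fx.Graph = tracer_class().trace(m)

    # Step 2: Modify this Graph or create a new one. The add -> mul
    # swap below is only a placeholder to make the sketch runnable.
    for node in graph.nodes:
        if node.op == "call_function" and node.target is torch.add:
            node.target = torch.mul
    graph.lint()  # check the rewritten Graph for well-formedness

    # Step 3: Construct a GraphModule to return.
    return torch.fx.GraphModule(m, graph)

# Usage: a toy module whose single torch.add becomes torch.mul.
class M(torch.nn.Module):
    def forward(self, x, y):
        return torch.add(x, y)

out = transform(M())(torch.ones(2), torch.full((2,), 3.0))
print(out)  # tensor([3., 3.]) -- multiplication, not addition
```

Tracing with `tracer_class().trace(m)` rather than `torch.fx.symbolic_trace(m)` preserves the removed snippet's intent of letting callers pass a custom `Tracer` subclass.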