  # In this tutorial, we train a ``nn.TransformerEncoder`` model on a
- # language modeling task. The language modeling task is to assign a
+ # language modeling task. Please note that this tutorial does not cover
+ # the training of `nn.TransformerDecoder <https://pytorch.org/docs/stable/generated/torch.nn.TransformerDecoder.html#torch.nn.TransformerDecoder>`__, as depicted in
+ # the right half of the diagram above. The language modeling task is to assign a
  # probability for the likelihood of a given word (or a sequence of words)
  # to follow a sequence of words. A sequence of tokens are passed to the embedding
  # layer first, followed by a positional encoding layer to account for the order
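The comments above describe tokens flowing through an embedding layer, then a positional encoding, then the encoder. A minimal sketch of that pipeline, with hypothetical sizes and an inline sinusoidal encoding standing in for the tutorial's own ``PositionalEncoding`` class:

```python
import math
import torch
import torch.nn as nn

# Hypothetical sizes chosen for illustration only.
vocab_size, d_model, seq_len, batch = 100, 16, 10, 2

embedding = nn.Embedding(vocab_size, d_model)

# Inline sinusoidal positional encoding, shape (seq_len, 1, d_model)
# so it broadcasts over the batch dimension.
position = torch.arange(seq_len).unsqueeze(1)
div_term = torch.exp(torch.arange(0, d_model, 2) * (-math.log(10000.0) / d_model))
pe = torch.zeros(seq_len, 1, d_model)
pe[:, 0, 0::2] = torch.sin(position * div_term)
pe[:, 0, 1::2] = torch.cos(position * div_term)

encoder_layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4)
encoder = nn.TransformerEncoder(encoder_layer, num_layers=2)

tokens = torch.randint(0, vocab_size, (seq_len, batch))  # (seq, batch)
x = embedding(tokens) * math.sqrt(d_model) + pe          # embed, then add positions
out = encoder(x)                                         # (seq, batch, d_model)
```

The default layout is ``(seq, batch, feature)``; pass ``batch_first=True`` to the encoder layer if batch-major tensors are preferred.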