1 parent daa7f99 · commit 30a877b
torch/optim/lr_scheduler.py
@@ -1390,7 +1390,7 @@ def load_state_dict(self, state_dict):
 class CosineAnnealingWarmRestarts(LRScheduler):
     r"""Set the learning rate of each parameter group using a cosine annealing schedule.

-    Where :math:`\eta_{max}` is set to the initial lr, :math:`T_{cur}`
+    :math:`\eta_{max}` is set to the initial lr, :math:`T_{cur}`
     is the number of epochs since the last restart and :math:`T_{i}` is the number
     of epochs between two warm restarts in SGDR: