pytorch/torch/optim
Pavel Belevich 7b229342ca Renamed CosineAnnealingLr to CosineAnnealingLR (#23242)
Summary:
Fixes https://github.com/pytorch/pytorch/issues/23160
Pull Request resolved: https://github.com/pytorch/pytorch/pull/23242

Differential Revision: D16443348

Pulled By: pbelevich

fbshipit-source-id: af0edf4e841e04a8016c98bfee72696581f3f070
2019-07-23 14:54:15 -07:00
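The renamed class implements the cosine annealing schedule from SGDR (Loshchilov & Hutter). As a minimal pure-Python sketch of the formula `torch.optim.lr_scheduler.CosineAnnealingLR` follows (the function name here is hypothetical, not part of PyTorch):

```python
import math

def cosine_annealing_lr(base_lr, t, t_max, eta_min=0.0):
    """Learning rate at step t of a cosine annealing schedule:

        eta_t = eta_min + (base_lr - eta_min) * (1 + cos(pi * t / t_max)) / 2

    Decays from base_lr at t=0 down to eta_min at t=t_max.
    """
    return eta_min + (base_lr - eta_min) * (1 + math.cos(math.pi * t / t_max)) / 2

print(cosine_annealing_lr(0.1, 0, 100))    # 0.1 (start of schedule)
print(cosine_annealing_lr(0.1, 50, 100))   # 0.05 (halfway point)
```

The actual scheduler additionally tracks state per parameter group and chains with the optimizer; this only shows the curve being annealed along.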
__init__.py Implement AdamW optimizer (#21250) 2019-07-02 09:09:10 -07:00
__init__.pyi
adadelta.py
adagrad.py Revert D14577575: [pytorch][PR] Fix lack of state init for adagrad and add share_memory flag 2019-04-26 15:43:04 -07:00
adam.py
adam.pyi
adamax.py
adamw.py Implement AdamW optimizer (#21250) 2019-07-02 09:09:10 -07:00
asgd.py
lbfgs.py Use lower case for strong wolfe option. (#22092) 2019-06-26 08:20:25 -07:00
lr_scheduler.py Fix momentum bug in CyclicLR (#20401) 2019-06-11 15:10:28 -07:00
lr_scheduler.pyi Renamed CosineAnnealingLr to CosineAnnealingLR (#23242) 2019-07-23 14:54:15 -07:00
optimizer.py Lightweight at-most-once logging for API usage (#20745) 2019-05-23 23:17:59 -07:00
optimizer.pyi Fix optimizer type hint (#20648) 2019-05-22 11:27:40 -07:00
rmsprop.py
rprop.py
sgd.py
sgd.pyi
sparse_adam.py
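The AdamW entries above (`adamw.py`, #21250) add Adam with decoupled weight decay: the decay is applied directly to the parameter instead of being folded into the gradient as L2 regularization. A single-scalar sketch of that update rule, assuming the standard Loshchilov & Hutter formulation (helper name and signature are illustrative, not PyTorch's API):

```python
import math

def adamw_step(param, grad, m, v, t, lr=1e-3, betas=(0.9, 0.999),
               eps=1e-8, weight_decay=1e-2):
    """One AdamW update on a scalar parameter; returns (param, m, v).

    m, v are the running first/second gradient moments; t is the 1-based
    step count used for bias correction.
    """
    beta1, beta2 = betas
    m = beta1 * m + (1 - beta1) * grad          # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad * grad   # second-moment estimate
    m_hat = m / (1 - beta1 ** t)                # bias correction
    v_hat = v / (1 - beta2 ** t)
    param -= lr * weight_decay * param          # decoupled weight decay
    param -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return param, m, v

# First step from param=1.0 with grad=0.5:
p, m, v = adamw_step(1.0, 0.5, 0.0, 0.0, t=1)
```

Note the decay line operates on `param` with `lr * weight_decay`, independent of the adaptive denominator; plain Adam with L2 would instead add `weight_decay * param` to `grad` before the moment updates.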