pytorch/torch/optim
Latest commit e4fba752cb ("fix type annotation") by Vincent Quenneville-Belair, 2019-09-27 13:39:36 -07:00

Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/26930

Test Plan: Imported from OSS

Differential Revision: D17614745

Pulled By: vincentqb

fbshipit-source-id: 1c29543f74d9cf307e9665aa890b4830b886fe63
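The commit above adjusts an annotation in the lr_scheduler.pyi stub file. As a hypothetical illustration only (not the actual diff from PR #26930), a .pyi stub declares signatures without implementations, and a typical annotation fix replaces a loose type with the precise accepted forms:

    # Hypothetical illustration only -- not the actual change from PR #26930.
    # A .pyi stub declares signatures without bodies; an annotation fix
    # typically tightens a parameter type to the forms actually accepted.
    from typing import Callable, Iterable, Union

    from torch.optim.optimizer import Optimizer

    class LambdaLR:
        def __init__(
            self,
            optimizer: Optimizer,
            # e.g. spell out "one function of the epoch, or one per param group"
            lr_lambda: Union[Callable[[int], float],
                             Iterable[Callable[[int], float]]],
            last_epoch: int = ...,
        ) -> None: ...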
__init__.py
__init__.pyi
adadelta.py
adagrad.py        Add epsilon argument to Adagrad optimizer (#24980)                      2019-08-21 16:36:51 -07:00
adam.py           Adam/AdamW implementation minor fix (#22628)                            2019-08-01 11:42:04 -07:00
adam.pyi
adamax.py
adamw.py          Adam/AdamW implementation minor fix (#22628)                            2019-08-01 11:42:04 -07:00
asgd.py
lbfgs.py          change LBFGS's default tolerance_grad to 1e-7 (#25240)                  2019-08-28 16:46:04 -07:00
lr_scheduler.py   Resolve #25605 cyclic reference in _LRScheduler (#25776)                2019-09-18 06:08:35 -07:00
lr_scheduler.pyi  fix type annotation                                                     2019-09-27 13:39:36 -07:00
optimizer.py
optimizer.pyi
rmsprop.py        Highlighting in the doc that square root comes before adding epsilon    2019-09-25 15:52:28 -07:00
rprop.py
sgd.py            Updated SGD docs with subscripts (#23985)                               2019-08-09 10:32:40 -07:00
sgd.pyi
sparse_adam.py
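Several annotated rows in the listing above describe user-visible optimizer behavior: the Adagrad eps argument (#24980), the LBFGS tolerance_grad default of 1e-7 (#25240), and the RMSprop documentation note that eps is added after the square root. A minimal sketch exercising those points (the tensor values are illustrative, not taken from any of the commits):

    import torch
    from torch import optim

    params = [torch.nn.Parameter(torch.randn(2, 2))]

    # adagrad.py: the eps term in the denominator is configurable (#24980).
    adagrad = optim.Adagrad(params, lr=0.01, eps=1e-10)

    # lbfgs.py: tolerance_grad, the gradient-norm stopping threshold,
    # defaults to 1e-7 (#25240) but can still be passed explicitly.
    lbfgs = optim.LBFGS(params, tolerance_grad=1e-7)

    # rmsprop.py: the docs highlight that the square root is taken before
    # eps is added -- the update divides by sqrt(v) + eps, not sqrt(v + eps).
    v, eps = torch.tensor([4.0]), 1e-8
    denom = v.sqrt() + eps   # documented ordering: sqrt first, then add eps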