pytorch/torch/optim
Latest commit: 1fd4757fdc by Michael Lazos, "Support tensor betas in Adam and AdamW" (#134171)
Adds support for beta1 and beta2 to be wrapped in a tensor for Adam and AdamW.

Fixes https://github.com/pytorch/pytorch/issues/133898

Pull Request resolved: https://github.com/pytorch/pytorch/pull/134171
Approved by: https://github.com/janeyx99
2024-11-15 21:55:55 +00:00
_multi_tensor
__init__.py
_adafactor.py Add ScalarList overload to _foreach_lerp (#134482) 2024-11-12 19:03:41 +00:00
_functional.py
adadelta.py
adagrad.py
adam.py Support tensor betas in Adam and AdamW (#134171) 2024-11-15 21:55:55 +00:00
adamax.py
adamw.py Support tensor betas in Adam and AdamW (#134171) 2024-11-15 21:55:55 +00:00
asgd.py
lbfgs.py
lr_scheduler.py
nadam.py
optimizer.py Support tensor betas in Adam and AdamW (#134171) 2024-11-15 21:55:55 +00:00
radam.py
rmsprop.py
rprop.py
sgd.py
sparse_adam.py
swa_utils.py
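The commit above lets `betas` be passed to `Adam`/`AdamW` as 0-dim tensors rather than Python floats. A minimal usage sketch, assuming a PyTorch build that includes PR #134171 (roughly torch >= 2.6); the parameter and loss here are illustrative, not from the PR:

```python
import torch

# A single parameter to optimize; the loss is sum(p**2),
# so an Adam step should move the parameter toward zero.
param = torch.nn.Parameter(torch.ones(3))

# betas wrapped in 0-dim tensors instead of floats (the feature
# described by PR #134171); older PyTorch builds may reject this.
opt = torch.optim.Adam(
    [param],
    lr=1e-2,
    betas=(torch.tensor(0.9), torch.tensor(0.999)),
)

loss = (param ** 2).sum()
loss.backward()
opt.step()  # values should dip slightly below 1.0
```

A plausible motivation for tensor-valued hyperparameters is avoiding recompilation or graph re-capture when the value changes at runtime; the linked issue (#133898) describes the actual use case.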