pytorch/torch/optim
Masaki Kozuki 22ea21da3d Change 1D Tensor of 1 element to 0D Tensor (#96994)
Add a 0D tensor case to the graph Adam/AdamW tests

Affected:
- `torch.cuda.amp.GradScaler`'s `found_inf`, `_scale`, and `_growth_tracker`
- the `step` state of Adam & AdamW when `capturable=True`
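
For context, the change stores scalar state as true 0D tensors rather than shape-`(1,)` tensors. A minimal sketch of the difference (the value 65536.0 is illustrative, matching `GradScaler`'s default `init_scale`):

```python
import torch

# Before #96994: scalar state such as _scale was a 1-element 1D tensor.
one_d = torch.full((1,), 65536.0)   # shape (1,), dim 1

# After #96994: the same state is a 0D (scalar) tensor.
zero_d = torch.full((), 65536.0)    # shape (), dim 0

assert one_d.dim() == 1 and one_d.shape == (1,)
assert zero_d.dim() == 0 and zero_d.shape == ()

# Both forms hold the same value; .item() extracts the Python scalar.
assert one_d.item() == zero_d.item() == 65536.0
```

Both forms print and compute the same value, but a 0D tensor matches the shape produced by ops like `torch._amp_update_scale_`, avoiding implicit broadcasting of the 1-element form.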

Fixes #96776 🤞

Pull Request resolved: https://github.com/pytorch/pytorch/pull/96994
Approved by: https://github.com/janeyx99
2023-03-21 18:24:19 +00:00
_multi_tensor
__init__.py
__init__.pyi
_functional.py
adadelta.py
adadelta.pyi
adagrad.py
adagrad.pyi
adam.py Change 1D Tensor of 1 element to 0D Tensor (#96994) 2023-03-21 18:24:19 +00:00
adam.pyi
adamax.py
adamax.pyi
adamw.py Change 1D Tensor of 1 element to 0D Tensor (#96994) 2023-03-21 18:24:19 +00:00
adamw.pyi
asgd.py
asgd.pyi
lbfgs.py
lbfgs.pyi
lr_scheduler.py
lr_scheduler.pyi
nadam.py
nadam.pyi
optimizer.py Allow fused optimizers to call _foreach_zero_ in zero_grad (#97159) 2023-03-20 19:03:26 +00:00
optimizer.pyi
radam.py
radam.pyi
rmsprop.py
rmsprop.pyi
rprop.py
rprop.pyi
sgd.py
sgd.pyi
sparse_adam.py
sparse_adam.pyi
swa_utils.py
swa_utils.pyi