pytorch/torch/optim
commit fdf1451bfa by Matthew Hoffman: Add `__all__` to torch.optim to define public interface (#131959)
A regression in the public interface of `torch.optim` was introduced in #125452 when `torch/optim/__init__.pyi` was merged into `torch/optim/__init__.py`. [The import aliases were not preserved, so `pyright` now considers these classes not publicly exported from `torch/optim/__init__.py`.](https://github.com/pytorch/pytorch/pull/125452/files#diff-941595c1e1aa06bec94578499dd3654532a5183d0bc1bcd94d1f33b47e0d0adfL1-L15)

```
error: "SGD" is not exported from module "torch.optim"
```

Adding these classes/modules to `__all__` fixes this.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/131959
Approved by: https://github.com/ezyang
2024-07-27 01:03:25 +00:00
| File | Latest commit | Date |
|---|---|---|
| `_multi_tensor/` | | |
| `__init__.py` | Add `__all__` to torch.optim to define public interface (#131959) | 2024-07-27 01:03:25 +00:00 |
| `_adafactor.py` | Adafactor forloop basic impl (#129905) | 2024-07-25 13:17:19 +00:00 |
| `_functional.py` | | |
| `adadelta.py` | Revert "[BE] typing for decorators - optim/optimizer (#131583)" | 2024-07-26 13:41:22 +00:00 |
| `adagrad.py` | [bug] Add is_compiling check for optimizers to avoid untracked tensor during graph tracing (#130909) | 2024-07-24 08:29:27 +00:00 |
| `adam.py` | Revert "[BE] remove unnecessary _dispatch_sqrt by using ** 0.5 (#131358)" | 2024-07-26 17:35:27 +00:00 |
| `adamax.py` | Revert "[BE] typing for decorators - optim/optimizer (#131583)" | 2024-07-26 13:41:22 +00:00 |
| `adamw.py` | Revert "[BE] remove unnecessary _dispatch_sqrt by using ** 0.5 (#131358)" | 2024-07-26 17:35:27 +00:00 |
| `asgd.py` | Revert "[BE] typing for decorators - optim/optimizer (#131583)" | 2024-07-26 13:41:22 +00:00 |
| `lbfgs.py` | [Optim] Support tensor lr for all optimizers and check it is 1-element (#131065) | 2024-07-23 04:27:05 +00:00 |
| `lr_scheduler.py` | | |
| `nadam.py` | Revert "[BE] remove unnecessary _dispatch_sqrt by using ** 0.5 (#131358)" | 2024-07-26 17:35:27 +00:00 |
| `optimizer.py` | Revert "[BE] remove unnecessary _dispatch_sqrt by using ** 0.5 (#131358)" | 2024-07-26 17:35:27 +00:00 |
| `radam.py` | Revert "[BE] remove unnecessary _dispatch_sqrt by using ** 0.5 (#131358)" | 2024-07-26 17:35:27 +00:00 |
| `rmsprop.py` | Revert "[BE] typing for decorators - optim/optimizer (#131583)" | 2024-07-26 13:41:22 +00:00 |
| `rprop.py` | Revert "[BE] typing for decorators - optim/optimizer (#131583)" | 2024-07-26 13:41:22 +00:00 |
| `sgd.py` | [Optim] Support tensor lr for all optimizers and check it is 1-element (#131065) | 2024-07-23 04:27:05 +00:00 |
| `sparse_adam.py` | [Optim] Support tensor lr for all optimizers and check it is 1-element (#131065) | 2024-07-23 04:27:05 +00:00 |
| `swa_utils.py` | | |