pytorch/torch/nn
Yi Wang 07653b7fe0 [SPMD] Remove ddp_gpu_size field from SyncBatchNorm (#55946)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/55946

As the `ddp_gpu_size` field of `SyncBatchNorm` will always be 1 for GPU modules, remove this field and the relevant code.
ghstack-source-id: 126883498

Test Plan: waitforbuildbot

Reviewed By: zhaojuanmao

Differential Revision: D27746021

fbshipit-source-id: b4518c07e6f0c6943fbd7a7548500a7d4337126c
2021-04-19 21:41:29 -07:00
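A minimal sketch of the usage this commit simplifies: `torch.nn.SyncBatchNorm` and its `convert_sync_batchnorm` helper are real PyTorch APIs, while the small model below is a hypothetical example. After this change no per-process GPU count (`ddp_gpu_size`) is tracked, since each DistributedDataParallel process is assumed to own a single GPU.

```python
# Sketch: converting ordinary BatchNorm layers to SyncBatchNorm before
# wrapping a model in DistributedDataParallel. The model itself is a
# hypothetical example, not taken from the commit.
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3),
    nn.BatchNorm2d(8),
    nn.ReLU(),
)

# Replace every BatchNorm*d layer with SyncBatchNorm. No GPU-count
# argument is needed; the single-GPU-per-process assumption makes the
# old ddp_gpu_size bookkeeping redundant.
sync_model = nn.SyncBatchNorm.convert_sync_batchnorm(model)
print(type(sync_model[1]).__name__)
```

In a real multi-process setup the converted model would then be wrapped in `torch.nn.parallel.DistributedDataParallel` with one device per process.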
backends
intrinsic Un-ignore F403 in .flake8 (#55838) 2021-04-13 09:24:07 -07:00
modules [SPMD] Remove ddp_gpu_size field from SyncBatchNorm (#55946) 2021-04-19 21:41:29 -07:00
parallel [SPMD] Remove ddp_gpu_size field from SyncBatchNorm (#55946) 2021-04-19 21:41:29 -07:00
qat Revert D27855386: [pytorch][PR] Support factory kwargs in torch.nn modules 2021-04-19 20:07:20 -07:00
quantizable Revert D27855386: [pytorch][PR] Support factory kwargs in torch.nn modules 2021-04-19 20:07:20 -07:00
quantized Revert D27855386: [pytorch][PR] Support factory kwargs in torch.nn modules 2021-04-19 20:07:20 -07:00
utils Add lint for unqualified noqa (#56272) 2021-04-19 13:16:18 -07:00
__init__.py Revert D27855386: [pytorch][PR] Support factory kwargs in torch.nn modules 2021-04-19 20:07:20 -07:00
_reduction.py
common_types.py
cpp.py
functional.py Add lint for unqualified noqa (#56272) 2021-04-19 13:16:18 -07:00
functional.pyi.in Add padding_idx argument to EmbeddingBag (#49237) 2021-04-14 09:38:01 -07:00
grad.py
init.py Add doc warnings for default SELU gain (#54057) 2021-03-25 11:21:02 -07:00
parameter.py Revert D27855386: [pytorch][PR] Support factory kwargs in torch.nn modules 2021-04-19 20:07:20 -07:00
parameter.pyi Forbid trailing whitespace (#53406) 2021-03-05 17:22:55 -08:00