pytorch/torch/distributed/algorithms
Vishwa Raj Singh 9d7a0869f0 Make DDP Quantization hooks backend Agnostic (#138816)
The current DDP quantization hook code uses the `.cuda()` API to move tensors and parameters onto the backend device. This limits the DDP quantization hooks to the CUDA backend only.
This change makes the code backend agnostic by moving tensors/parameters based on **tensor.device**.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/138816
Approved by: https://github.com/kwen2501
2024-10-29 15:02:45 +00:00
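A minimal sketch of the pattern this change applies. The function name and arguments below are illustrative, not the actual hook internals: the point is replacing a hard-coded `.cuda()` move with a move to the device of a tensor that already lives on the right backend.

```python
import torch

def move_scratch_to_bucket_device(bucket_tensor: torch.Tensor,
                                  scratch: torch.Tensor) -> torch.Tensor:
    # Before the change (CUDA-only): scratch.cuda()
    # After the change (backend agnostic): move scratch onto whatever
    # device the bucket tensor already occupies (CPU, CUDA, or any
    # other accelerator backend).
    return scratch.to(bucket_tensor.device)
```

Because `Tensor.to(device)` dispatches on the target device, the same hook code now runs unchanged on any backend that the gradient bucket tensors live on.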
_checkpoint
_comm_hooks
_optimizer_overlap
_quantization
ddp_comm_hooks
model_averaging
__init__.py
join.py