pytorch/torch/distributed/tensor/parallel
zeshengzong cb71bcc542 Replace clone.detach with detach.clone (#140264)
Fixes #64532

As stated in the issue, replace `clone().detach()` with `detach().clone()`

Pull Request resolved: https://github.com/pytorch/pytorch/pull/140264
Approved by: https://github.com/soulitzer
2024-11-13 07:01:02 +00:00
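A minimal sketch of why the ordering matters (plain PyTorch, not code taken from this PR): `clone()` on a tensor with `requires_grad=True` is recorded by autograd and then `detach()` throws that record away, whereas calling `detach()` first makes the subsequent `clone()` run outside autograd entirely. Both orderings produce the same detached values.

```python
import torch

x = torch.ones(3, requires_grad=True)

# Old pattern: clone() is recorded by autograd, then detach()
# discards the recorded operation, so the bookkeeping is wasted.
old = x.clone().detach()

# Preferred pattern: detach() first, so clone() runs outside
# autograd and records nothing.
new = x.detach().clone()

assert not old.requires_grad and not new.requires_grad
assert torch.equal(old, new)
```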
__init__.py
_data_parallel_utils.py
_utils.py Revert "Deprecate torch._utils.is_compiling() and torch._dynamo.external_utils.is_compiling() (#127690)" 2024-11-05 23:10:38 +00:00
api.py
ddp.py
fsdp.py Replace clone.detach with detach.clone (#140264) 2024-11-13 07:01:02 +00:00
input_reshard.py
loss.py
style.py