pytorch/torch/distributed
Mingzhe Li 59083d6176 [NCCL] Support NCCL Send/Recv (#44921)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/44921

This diff adds support for Process Group point-to-point operations on the NCCL backend, built on ncclSend/ncclRecv. See https://github.com/pytorch/pytorch/issues/43995 for more context.
ghstack-source-id: 113592785

Test Plan: unittest

Reviewed By: jiayisuse

Differential Revision: D23709848

fbshipit-source-id: cdf38050379ecbb10450f3394631317b41163258
2020-10-05 18:27:57 -07:00
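The point-to-point API this change enables can be sketched as follows. This is a minimal illustration, not code from the PR: the worker function name, the file-store rendezvous path, and the gloo CPU fallback are assumptions for the sake of a self-contained example. NCCL send/recv requires CUDA tensors with one GPU per rank; gloo exposes the same `dist.send`/`dist.recv` calls on CPU.

```python
import os
import tempfile

import torch
import torch.distributed as dist
import torch.multiprocessing as mp


def point_to_point(rank: int, world_size: int, init_file: str, backend: str):
    # Illustrative worker: rank 0 sends a tensor to rank 1.
    # NCCL (this PR) needs CUDA tensors; "gloo" is a CPU stand-in
    # with the same point-to-point API.
    dist.init_process_group(
        backend,
        init_method=f"file://{init_file}",
        rank=rank,
        world_size=world_size,
    )
    tensor = torch.zeros(1)
    if backend == "nccl":
        tensor = tensor.cuda(rank)  # one GPU per rank
    if rank == 0:
        tensor += 42
        dist.send(tensor, dst=1)  # blocking point-to-point send
    else:
        dist.recv(tensor, src=0)  # blocking point-to-point receive
        assert tensor.item() == 42
    dist.destroy_process_group()


if __name__ == "__main__":
    # File-store rendezvous; the file must not exist yet.
    with tempfile.NamedTemporaryFile(delete=False) as f:
        init_file = f.name
    os.unlink(init_file)
    backend = "nccl" if torch.cuda.device_count() >= 2 else "gloo"
    mp.spawn(point_to_point, args=(2, init_file, backend), nprocs=2)
    print("send/recv ok")
```

`dist.isend`/`dist.irecv` offer the non-blocking counterparts, returning a work handle to `wait()` on.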
algorithms/ddp_comm_hooks Make ddp_comm_hook_wrapper a private method. (#44643) 2020-09-24 13:29:48 -07:00
autograd Remove py2 compatible future imports (#44735) 2020-09-16 12:55:57 -07:00
nn Add a device parameter to RemoteModule (#44254) 2020-09-18 10:31:03 -07:00
optim [dist_optim] introduce distributed functional optimizer (#45221) 2020-09-25 17:13:10 -07:00
rpc Revert "Remove device maps from TensorPipe for v1.7 release (#45353)" (#45762) 2020-10-02 15:14:05 -07:00
__init__.py [NCCL] Support NCCL Send/Recv (#44921) 2020-10-05 18:27:57 -07:00
constants.py
CONTRIBUTING.md Fixing a few links in distributed CONTRIBUTING.md (#44753) 2020-09-16 10:14:19 -07:00
distributed_c10d.py [NCCL] Support NCCL Send/Recv (#44921) 2020-10-05 18:27:57 -07:00
launch.py Remove py2 compatible future imports (#44735) 2020-09-16 12:55:57 -07:00
rendezvous.py Fix Windows build failure after DDP PR merged (#45335) 2020-09-25 12:37:50 -07:00