pytorch/torch/distributed
Teng Li 2d3cf98b49 Making dist.get_default_group private for PT1 release (#14767)
Summary:
When I wrote the frontend API, I designed it so that users never pass the default group directly to any function. It should really be private.

All collectives are supposed to use either group.WORLD or a group returned by new_group. That was the initial design.

We should add a TODO to remove group.WORLD one day; it exists only for backward compatibility and adds a lot of complexity.
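
A minimal sketch of the intended usage pattern (the gloo backend, env:// rendezvous on localhost, and the two-process spawn here are illustrative assumptions, not part of this PR):

    import os
    import torch
    import torch.distributed as dist
    import torch.multiprocessing as mp

    def run(rank, world_size):
        # Every process joins the default group; callers never need a
        # handle to it, which is why get_default_group can be private.
        dist.init_process_group(backend="gloo", rank=rank, world_size=world_size)

        # Collectives default to group.WORLD, so the default group stays internal.
        t = torch.ones(1) * rank
        dist.all_reduce(t, group=dist.group.WORLD)  # t now holds 0 + 1 + ... + (n-1)

        # Subsets of ranks come from new_group(); note every rank must call it,
        # even ranks that are not members of the new group.
        evens = dist.new_group(ranks=list(range(0, world_size, 2)))
        if rank % 2 == 0:
            dist.all_reduce(t, group=evens)

        dist.destroy_process_group()

    if __name__ == "__main__":
        os.environ["MASTER_ADDR"] = "127.0.0.1"  # assumption: single-host run
        os.environ["MASTER_PORT"] = "29500"
        mp.spawn(run, args=(2,), nprocs=2)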
Pull Request resolved: https://github.com/pytorch/pytorch/pull/14767

Reviewed By: pietern

Differential Revision: D13330655

Pulled By: teng-li

fbshipit-source-id: ace107e1c3a9b3910a300b22815a9e8096fafb1c
2018-12-04 19:22:24 -08:00
Name                 Last commit                                                                 Date
deprecated           C10d release to torch.distributed for PT1 (#11405)                          2018-09-10 23:27:22 -07:00
__init__.py          Add distributed get_backend (#11715)                                        2018-09-18 10:56:24 -07:00
distributed_c10d.py  Making dist.get_default_group private for PT1 release (#14767)              2018-12-04 19:22:24 -08:00
launch.py            Warn about local_rank not being globally unique. (#12370)                   2018-10-05 17:38:41 -07:00
rendezvous.py        Make env init_method support both env and args for rank and size (#14494)   2018-11-29 18:48:20 -08:00