pytorch/torch/csrc/distributed
Rohan Varma c3f2f3294e [RPC] Add option to make rref.get_type not block. (#50977)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/50977

Adds a `blocking` flag that can be set to False to make this API return a `Future` to the type. This makes the function non-blocking, mostly in preparation for a future change that will allow `rref.rpc_async()` to be completely non-blocking (it currently calls this function, which issues an RPC, and waits for it in-line).
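The blocking/non-blocking pattern described above can be sketched as follows. This is an illustrative stand-in, not the actual `torch.distributed.rpc` implementation; `get_type` and `_fetch_type_over_rpc` are hypothetical names, and a thread pool stands in for the real RPC machinery.

```python
# Sketch of a `blocking` flag that switches between returning a value
# directly and returning a Future. Names are illustrative only.
from concurrent.futures import Future, ThreadPoolExecutor

_executor = ThreadPoolExecutor(max_workers=1)

def _fetch_type_over_rpc():
    # Stand-in for the RPC that retrieves the RRef's type from its owner.
    return "Tensor"

def get_type(blocking=True):
    # Issue the "RPC" asynchronously; fut resolves to the type string.
    fut = _executor.submit(_fetch_type_over_rpc)
    if blocking:
        # Default behavior: wait in-line and return the type itself.
        return fut.result()
    # Non-blocking: hand the Future back so the caller waits when convenient.
    return fut

# Blocking call returns the type directly.
t = get_type()

# Non-blocking call returns a Future; the caller resolves it later.
fut = get_type(blocking=False)
t2 = fut.result()
```

A caller such as `rref.rpc_async()` can then chain work onto the returned Future instead of stalling on the type lookup.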
ghstack-source-id: 121021433

Test Plan: Modified UT

Reviewed By: mrshenli

Differential Revision: D25944582

fbshipit-source-id: e3b48a52af2d4578551a30ba6838927b489b1c03
2021-02-04 20:18:50 -08:00
autograd  Support device map for distributed autograd while using TensorPipe. (#44859)  2021-01-27 13:01:44 -08:00
c10d      Revert D26237328: Add compare_set operation and test to TCPStore               2021-02-04 13:43:05 -08:00
rpc       [RPC] Add option to make rref.get_type not block. (#50977)                     2021-02-04 20:18:50 -08:00