Support for BFloat16 scalars was missing in the gloo backend. When using `torch.distributed.init_process_group(backend='gloo')` and running `torch.nn.parallel.DistributedDataParallel(model)` where _model_ has BFloat16 parameters, the following error is raised: `RuntimeError: Invalid scalar type`. This change fixes the issue: `c10::BFloat16` defines conversions to and from `float`, so the reductions for BFloat16 are performed in `float`.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/113557
Approved by: https://github.com/XilunWu, https://github.com/jgong5
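A minimal sketch of a reproducer, assuming a single-process gloo group for illustration (a real run would launch multiple ranks, e.g. via `torchrun`). Before this fix, wrapping a BFloat16 model in DDP under gloo raised `RuntimeError: Invalid scalar type`:

```python
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # Single-process group for illustration only; rank/world_size and the
    # rendezvous address are assumptions for this sketch.
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
    os.environ.setdefault("MASTER_PORT", "29500")
    dist.init_process_group(backend="gloo", rank=0, world_size=1)

    # A CPU model whose parameters are BFloat16. This is the configuration
    # that previously failed with `RuntimeError: Invalid scalar type`.
    model = torch.nn.Linear(8, 4).to(torch.bfloat16)
    ddp_model = DDP(model)

    # A forward/backward pass triggers the gradient allreduce over gloo;
    # with the fix, BFloat16 values are converted to float for the math.
    out = ddp_model(torch.randn(2, 8, dtype=torch.bfloat16))
    out.sum().backward()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```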