Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/57409
Full design: https://github.com/pytorch/pytorch/issues/55207

In https://github.com/pytorch/pytorch/issues/55207, we proposed `MeshShardingSpec` as a generic sharding mechanism. However, that proposal assumes even partitioning and cannot express shards of uneven sizes, which one of our internal use cases requires. We therefore introduce `GenericShardingSpec` instead, which can describe any arbitrary partitioning of a multi-dimensional tensor: each shard is specified by its start offsets and its length along each dimension, allowing much greater flexibility (see the sketch below).

ghstack-source-id: 129604155

Test Plan:
1) unit tests
2) waitforbuildbot

Reviewed By: SciPioneer

Differential Revision: D28137616

fbshipit-source-id: 61255762485fb8fa3ec3a43c27bbb222ca25abff
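For illustration, here is a minimal self-contained sketch of what an offsets-plus-lengths sharding description enables. The class and field names (`ShardMetadata`, `shard_offsets`, `shard_lengths`, `placement`) are assumptions based on the description above, not necessarily the merged API:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ShardMetadata:
    """Hypothetical per-shard description: where the shard starts and how big it is."""
    shard_offsets: List[int]  # start offset of the shard in each tensor dimension
    shard_lengths: List[int]  # length of the shard in each tensor dimension
    placement: str            # device/rank that holds this shard

# Uneven row-wise partitioning of a 10 x 8 tensor across two ranks:
# rank 0 holds the first 3 rows, rank 1 holds the remaining 7.
spec = [
    ShardMetadata(shard_offsets=[0, 0], shard_lengths=[3, 8], placement="rank:0/cuda:0"),
    ShardMetadata(shard_offsets=[3, 0], shard_lengths=[7, 8], placement="rank:1/cuda:1"),
]

# Because each shard carries explicit offsets and lengths, shards need not be
# equal-sized, unlike a mesh-based spec that assumes even partitioning.
for shard in spec:
    print(shard)
```

This is why arbitrary partitioning falls out for free: nothing in the spec ties one shard's size to another's, so uneven splits are expressed the same way as even ones.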
Changed directories:

- `_sharding_spec/`
- `algorithms/ddp_comm_hooks/`
- `bin/`
- `elastic/`
- `launcher/`
- `nn/jit/`
- `optim/`
- `pipeline/sync/`
- `rpc/`

Changed files:

- `argparse_util_test.py`
- `test_c10d_common.py`
- `test_c10d_gloo.py`
- `test_c10d_nccl.py`
- `test_c10d_spawn.py`
- `test_c10d_spawn_gloo.py`
- `test_c10d_spawn_nccl.py`
- `test_data_parallel.py`
- `test_distributed_fork.py`
- `test_distributed_spawn.py`
- `test_jit_c10d.py`
- `test_launcher.py`
- `test_nccl.py`