pytorch/torch/distributed/_composable
Andrew Gu bd867d691b [FSDP2] Fix backward-compatible imports (#142419)
Internal only: previously, `from torch.distributed._composable.fsdp import fully_shard` resolved to the submodule `fully_shard.py`, not the function `fully_shard`. For some reason, the resolution order differs from open source.

To fix this, we match the old import as closely as possible. Namely, we import the contents of `fully_shard.py` via `from .fully_shard import ...`, which rebinds the name `fully_shard` in the package namespace to the function, so that binding takes precedence over the submodule.
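The shadowing behavior the fix relies on can be sketched with a minimal, hypothetical package (the names `pkg`/`fully_shard` here are illustrative, not the actual FSDP2 sources): when `__init__.py` does `from .fully_shard import fully_shard`, the function rebinds the package attribute that the import machinery initially set to the submodule, so `from pkg import fully_shard` yields the function.

```python
import os
import sys
import tempfile

# Build a throwaway package on disk to demonstrate the name resolution.
tmp = tempfile.mkdtemp()
pkg_dir = os.path.join(tmp, "pkg")
os.makedirs(pkg_dir)

# Submodule pkg/fully_shard.py defines a function with the same name
# as the module file.
with open(os.path.join(pkg_dir, "fully_shard.py"), "w") as f:
    f.write("def fully_shard():\n    return 'function'\n")

# pkg/__init__.py re-exports the function; this assignment overwrites
# the submodule binding on the package object.
with open(os.path.join(pkg_dir, "__init__.py"), "w") as f:
    f.write("from .fully_shard import fully_shard\n")

sys.path.insert(0, tmp)
from pkg import fully_shard

# The imported name is the callable, not the module object.
print(callable(fully_shard))  # True
print(fully_shard())          # function
```

Without the re-export in `__init__.py`, importing any submodule of `pkg` would leave `pkg.fully_shard` bound to the module object, and `from pkg import fully_shard` would hand callers the module instead of the function.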

@diff-train-skip-merge

Differential Revision: [D66990327](https://our.internmc.facebook.com/intern/diff/D66990327)
Pull Request resolved: https://github.com/pytorch/pytorch/pull/142419
Approved by: https://github.com/weifengpy
2024-12-09 23:56:32 +00:00
fsdp [FSDP2] Fix backward-compatible imports (#142419) 2024-12-09 23:56:32 +00:00
__init__.py Remove old FSDP1 fully_shard (#141875) 2024-12-03 17:00:47 +00:00
checkpoint_activation.py
contract.py Fix type-safety of torch.nn.Module instances (#141240) 2024-11-22 00:05:05 +00:00
replicate.py Remove unused Python variables in torch/[b-z]* (#136963) 2024-10-19 16:45:22 +00:00