pytorch/test/distributed/tensor/parallel
Iris Zhang 72ce5dd13e [2D] Remove enable_2d_with_fsdp() API and make remove_enable_2d_with_fsdp private (#112473)
With the new 2D flow released, we remove `enable_2d_with_fsdp()`.
In addition, we make `pre_dp_module_transform` private, as its UX may change later on.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/112473
Approved by: https://github.com/fegin, https://github.com/wanchaol
2023-11-16 01:14:00 +00:00
__init__.py
test_ddp_2d_parallel.py [2D] Remove enable_2d_with_fsdp() API and make remove_enable_2d_with_fsdp private (#112473) 2023-11-16 01:14:00 +00:00
test_fsdp_2d_parallel.py [2D] Bind _fsdp_extension to FSDP instances (#113237) 2023-11-09 03:31:03 +00:00
test_parallelize_api.py [TP] Enable embedding sharding in TP API (#111177) 2023-10-15 11:49:56 +00:00
test_tp_examples.py [TP] Add prepareInput and output for input/output DTensor layout annotation in the parent module in TP API (#111166) 2023-10-14 15:37:52 +00:00
test_tp_random_state.py Fix typo under test directory (#112346) 2023-11-03 07:53:33 +00:00
test_tp_style.py [tp] fix PrepareModuleInput for multiple inputs (#112204) 2023-10-27 05:08:05 +00:00
test_view_sharding_dim_change.py