pytorch/test/distributed/checkpoint
Chien-Chin Huang b0871f9b33 [DSD] Add a test to verify FSDP lazy initialization case (#127069)
Summary:
Distributed state_dict should not error out, because calling `model.state_dict()` will trigger FSDP initialization.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/127069
Approved by: https://github.com/wz337
2024-05-24 21:09:11 +00:00
e2e/                                    [DCP] overwrites existing checkpoint by default (#125877)                   2024-05-15 20:12:52 +00:00
fsdp/
test_checkpoint.py
test_compatibility.py                   [DCP] Adds storage metadata, and passes it during the save path (#124772)   2024-05-07 23:53:53 +00:00
test_dedup_tensors.py
test_dtensor_checkpoint.py
test_dtensor_resharding.py
test_file_system_checkpoint.py
test_file_system_checkpoint_cpu.py
test_format_utils.py
test_fsdp_model_state.py
test_fsdp_optim_state.py
test_fsdp_tp_checkpoint_conversion.py
test_fsspec.py                          [DCP] overwrites existing checkpoint by default (#125877)                   2024-05-15 20:12:52 +00:00
test_hsdp_checkpoint.py
test_nested_dict.py                     [DCP] Always flatten mapping even if no tensors present (#125335)           2024-05-07 17:08:49 +00:00
test_planner.py
test_save_load_api.py
test_state_dict.py                      [DSD] Add a test to verify FSDP lazy initialization case (#127069)          2024-05-24 21:09:11 +00:00
test_state_dict_utils.py
test_tp_checkpoint.py
test_traverse.py                        [DCP] Always flatten mapping even if no tensors present (#125335)           2024-05-07 17:08:49 +00:00
test_utils.py