transformers/tests/trainer
Latest commit e076953079 by Clara Pohland
Trainer._load_from_checkpoint - support loading multiple Peft adapters (#30505)
* Trainer: load checkpoint model with multiple adapters

* Trainer._load_from_checkpoint support multiple active adapters

* PeftModel.set_adapter does not support multiple adapters yet

* Trainer._load_from_checkpoint test multiple adapters

---------

Co-authored-by: Clara Luise Pohland <clara-luise.pohland@telekom.de>
Committed: 2024-05-06 08:22:52 -04:00
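The commit message above says the change lets `Trainer._load_from_checkpoint` pick up every saved PEFT adapter in a checkpoint folder rather than a single one. A minimal sketch of the discovery step, assuming each adapter lives in its own subfolder containing PEFT's `adapter_config.json`; `find_adapter_dirs` is a hypothetical helper for illustration, not the actual Trainer code:

```python
import json
import os
import tempfile

# PEFT writes one adapter_config.json per saved adapter subfolder.
ADAPTER_CONFIG_NAME = "adapter_config.json"

def find_adapter_dirs(checkpoint_dir):
    """Return names of subfolders that look like saved PEFT adapters.

    Hypothetical helper: a subfolder counts as an adapter if it contains
    an adapter_config.json file; anything else (optimizer state, etc.)
    is ignored.
    """
    adapters = []
    for entry in sorted(os.listdir(checkpoint_dir)):
        subdir = os.path.join(checkpoint_dir, entry)
        if os.path.isdir(subdir) and os.path.isfile(
            os.path.join(subdir, ADAPTER_CONFIG_NAME)
        ):
            adapters.append(entry)
    return adapters

# Demo: a fake checkpoint with two adapter subfolders and one unrelated folder.
with tempfile.TemporaryDirectory() as ckpt:
    for name in ("default", "task_b"):
        os.makedirs(os.path.join(ckpt, name))
        with open(os.path.join(ckpt, name, ADAPTER_CONFIG_NAME), "w") as f:
            json.dump({"peft_type": "LORA"}, f)
    os.makedirs(os.path.join(ckpt, "optimizer_state"))  # not an adapter
    print(find_adapter_dirs(ckpt))  # -> ['default', 'task_b']
```

In the real code path, each discovered subfolder would then be passed to something like `model.load_adapter(subdir, adapter_name)`; as the third bullet notes, `PeftModel.set_adapter` did not yet accept multiple adapter names at the time of this PR.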
File                           Last commit                                                                        Date
__init__.py                    [Test refactor 1/5] Per-folder tests reorganization (#15725)                       2022-02-23 15:46:28 -05:00
test_data_collator.py          Fix seq2seq collator padding (#30556)                                              2024-04-30 18:32:30 +01:00
test_trainer.py                Trainer._load_from_checkpoint - support loading multiple Peft adapters (#30505)   2024-05-06 08:22:52 -04:00
test_trainer_callback.py       Introduce Stateful Callbacks (#29666)                                              2024-04-25 11:00:09 -04:00
test_trainer_distributed.py    🚨 Fully revert atomic checkpointing 🚨 (#29370)                                    2024-03-04 06:17:42 -05:00
test_trainer_seq2seq.py        🚨🚨🚨Deprecate evaluation_strategy to eval_strategy🚨🚨🚨 (#30190)                      2024-04-18 12:49:43 -04:00
test_trainer_tpu.py            [Test refactor 1/5] Per-folder tests reorganization (#15725)                       2022-02-23 15:46:28 -05:00
test_trainer_utils.py          Add strategy to store results in evaluation loop (#30267)                          2024-04-17 12:42:27 +01:00