Mirror of https://github.com/saymrwulf/transformers.git (synced 2026-05-14 20:58:08 +00:00)
Latest commit (squashed; individual commit subjects):

* bookmark
* Bookmark
* Bookmark
* Actually implement
* Pass in kwarg explicitly
* Adjust for if we do or don't have labels
* Bookmark fix for od
* bookmark
* Fin
* closer
* Negate accelerate grad accum div
* Fixup not training long enough
* Add in compute_loss to take full model output
* Document
* compute_loss -> compute_loss_fn
* Add a test
* Refactor
* Refactor
* Uncomment tests
* Update tests/trainer/test_trainer.py

Co-authored-by: Daniel Han <danielhanchen@gmail.com>
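The commit subjects above describe letting a user-supplied loss function receive the full model output and divide by the true item count so gradient accumulation stays unbiased. A minimal sketch of such a callable, assuming the hook signature `(outputs, labels, num_items_in_batch)` (the exact name and signature are assumptions inferred from the commit log, which only shows the rename `compute_loss -> compute_loss_fn`):

```python
import types

import torch
import torch.nn.functional as F


def my_compute_loss(outputs, labels, num_items_in_batch=None):
    """Hypothetical custom loss hook: receives the full model output,
    not just the logits ("Add in compute_loss to take full model output")."""
    logits = outputs.logits
    # Sum rather than mean, then divide by the true token count ourselves,
    # which keeps gradient accumulation unbiased ("Negate accelerate grad
    # accum div" in the log above).
    loss = F.cross_entropy(
        logits.view(-1, logits.size(-1)), labels.view(-1), reduction="sum"
    )
    if num_items_in_batch is not None:
        loss = loss / num_items_in_batch
    return loss


# Quick check on dummy data: uniform logits over 5 classes give loss log(5).
dummy = types.SimpleNamespace(logits=torch.zeros(2, 3, 5))
labels = torch.zeros(2, 3, dtype=torch.long)
print(my_compute_loss(dummy, labels, num_items_in_batch=6))  # ≈ log 5 ≈ 1.609
```

In a recent `transformers` release such a callable would be handed to the trainer at construction time (`Trainer(..., compute_loss_func=my_compute_loss)` in released versions); treat that keyword name as an assumption, since this mirror's log ends at `compute_loss_fn`.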
Files in `tests/trainer/`:

- __init__.py
- test_data_collator.py
- test_trainer.py
- test_trainer_callback.py
- test_trainer_distributed.py
- test_trainer_fsdp.py
- test_trainer_seq2seq.py
- test_trainer_tpu.py
- test_trainer_utils.py