Mirror of https://github.com/saymrwulf/transformers.git, synced 2026-05-15 21:01:19 +00:00
Latest commit:

* fix: handle padding in contrastive search for decoder-only models
* fix: handle padding in contrastive search for encoder-decoder models
* tests: move padding contrastive test to test_util, add t5 test
* fix: handle the case where model_kwargs["decoder_attention_mask"] is None
* refactor: improve padding input contrastive search generation tests
* chore: _ranking_fast to use LongTensor for cosine_matrix_mask
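The commit above masks padded context positions when contrastive search computes its degeneration penalty, so padding tokens cannot be picked as the most-similar context token. A minimal sketch of that idea follows; the function name `ranking_fast` mirrors the `_ranking_fast` helper mentioned in the commit, but the exact signature and internals here are assumptions, not the library's actual implementation.

```python
import torch

def ranking_fast(context_hidden, next_hidden, next_top_k_probs,
                 cosine_matrix_mask, alpha, beam_width):
    """Score top-k candidates as model confidence minus a degeneration
    penalty, ignoring padded context positions.

    Sketch only; shapes assumed:
      context_hidden:     [B*K, S, D]  hidden states of the context
      next_hidden:        [B*K, 1, D]  hidden state of each candidate
      next_top_k_probs:   [B, K]       model probability of each candidate
      cosine_matrix_mask: [B*K, S]     LongTensor, 1 = real token, 0 = padding
    """
    norm_context = context_hidden / context_hidden.norm(dim=2, keepdim=True)
    norm_next = next_hidden / next_hidden.norm(dim=2, keepdim=True)
    # [B*K, S]: cosine similarity of each candidate to every context token
    cosine_matrix = torch.matmul(norm_context, norm_next.transpose(1, 2)).squeeze(-1)
    # padded positions get -inf so max() below can never select them
    padding = (1 - cosine_matrix_mask).bool()
    cosine_matrix = cosine_matrix.masked_fill(padding, -float("inf"))
    degeneration_penalty, _ = torch.max(cosine_matrix, dim=-1)   # [B*K]
    contrastive_score = ((1.0 - alpha) * next_top_k_probs.view(-1)
                         - alpha * degeneration_penalty)
    contrastive_score = contrastive_score.view(-1, beam_width)   # [B, K]
    _, selected_idx = contrastive_score.max(dim=-1)
    return selected_idx  # [B]: winning candidate index per batch item
```

Using a 0/1 LongTensor for the mask (the last bullet of the commit) keeps it in the same dtype family as attention masks, so the same tensor can be sliced from `attention_mask` or `decoder_attention_mask` without casting at the call site.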
| Name |
|---|
| __init__.py |
| test_beam_constraints.py |
| test_beam_search.py |
| test_configuration_utils.py |
| test_flax_logits_process.py |
| test_flax_utils.py |
| test_framework_agnostic.py |
| test_logits_process.py |
| test_stopping_criteria.py |
| test_streamers.py |
| test_tf_logits_process.py |
| test_tf_utils.py |
| test_utils.py |