mirror of
https://github.com/saymrwulf/transformers.git
synced 2026-05-14 20:58:08 +00:00
* [WIP] Add FLAVA model

  This PR aims to add the [FLAVA](https://arxiv.org/abs/2112.04482) model to the transformers repo. The following checklist delineates what must be done for this PR to be complete:

  - [x] Flava init
  - [x] Flava base models
  - [x] Flava layers
  - [x] Flava configs
  - [x] Flava encoders
  - [x] Flava pretraining models
  - [ ] Flava classification/retrieval models (to be added in a separate PR)
  - [x] Documentation updates
  - [x] Imports updates
  - [x] Argstring updates
  - [x] Flava pretrained checkpoints
  - [x] Flava tests
  - [x] Flava processors
  - [x] Sanity check
  - [x] Lint
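Once the configs and base models from the checklist above are in place, the model should be constructible through the usual transformers config/model pattern. A minimal sketch (assuming the `FlavaConfig` and `FlavaModel` classes this PR introduces, and a hypothetical checkpoint name for illustration):

```python
from transformers import FlavaConfig, FlavaModel

# FlavaConfig composes the sub-configs for FLAVA's three encoders
# (image, text, multimodal), using library defaults here.
config = FlavaConfig()
print(type(config.text_config).__name__)

# A randomly initialised model can be built directly from the config:
# model = FlavaModel(config)

# Once pretrained checkpoints land, weights would load in the usual way
# (checkpoint name is an assumption, not confirmed by this PR):
# model = FlavaModel.from_pretrained("facebook/flava-full")
```

This mirrors how other multimodal models in the library (e.g. CLIP) expose a top-level config that nests per-modality configs.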
Directory listing:

- benchmark/
- deepspeed/
- extended/
- fixtures/
- generation/
- models/
- onnx/
- optimization/
- pipelines/
- sagemaker/
- tokenization/
- trainer/
- utils/
- __init__.py
- test_configuration_common.py
- test_feature_extraction_common.py
- test_modeling_common.py
- test_modeling_flax_common.py
- test_modeling_tf_common.py
- test_sequence_feature_extraction_common.py
- test_tokenization_common.py