# Research projects

This folder contains various research projects using 🤗 Transformers. They are not maintained and require a specific version of 🤗 Transformers that is indicated in the requirements file of each folder. Updating them to the most recent version of the library will require some work.

The projects currently included are:

- adversarial
- bert-loses-patience
- bertabs
- bertology
- deebert
- distillation
- longform-qa
- lxmert
- mlm_wwm
- mm-imdb
- movement-pruning
- performer
- pplm
- rag
- seq2seq-distillation
- zero-shot-distillation

To use any of them, just run the command

```bash
pip install -r requirements.txt
```

inside the folder of your choice.
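The setup flow above can be sketched end to end. The snippet below is an illustration, not part of the projects themselves: the stand-in folder, the empty `requirements.txt`, and the use of a virtual environment are assumptions made so the sketch runs anywhere; in the real repository you would `cd` into one of the project folders instead.

```shell
set -e
# Sketch of the per-project install flow. To keep this runnable anywhere we
# fabricate a stand-in project folder; in the real repository you would do
# `cd examples/research_projects/<project>` instead.
project="$(mktemp -d)/pplm"
mkdir -p "$project" && cd "$project"
printf '' > requirements.txt         # real projects pin e.g. a transformers version
python3 -m venv .venv                # isolated environment for the pinned versions
. .venv/bin/activate
pip install -r requirements.txt      # a no-op here; installs the pins in a real project
test -x .venv/bin/python             # sanity check: the environment exists
```

A separate virtual environment per project is a sensible default: each project pins its own 🤗 Transformers version, so installing its requirements into a shared environment would downgrade the copy used elsewhere.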

If you need help with any of these projects, contact the author(s) indicated at the top of each folder's README.