mirror of
https://github.com/saymrwulf/transformers.git
synced 2026-05-14 20:58:08 +00:00
Fix typo
This commit is contained in:
parent
6e261d3a22
commit
61a2b7dc9d
1 changed file with 1 addition and 1 deletion
@@ -7,7 +7,7 @@ thumbnail: https://i.imgur.com/jgBdimh.png
 
 This model is a fine-tuned on [SQuAD-es-v2.0](https://github.com/ccasimiro88/TranslateAlignRetrieve) and **distilled** version of [BETO](https://github.com/dccuchile/beto) for **Q&A**.
 
-Distillation makes the model smaller, fasert, cheaper and lighter than [bert-base-spanish-wwm-cased-finetuned-spa-squad2-es](https://github.com/huggingface/transformers/blob/master/model_cards/mrm8488/bert-base-spanish-wwm-cased-finetuned-spa-squad2-es/README.md)
+Distillation makes the model **smaller, faster, cheaper and lighter** than [bert-base-spanish-wwm-cased-finetuned-spa-squad2-es](https://github.com/huggingface/transformers/blob/master/model_cards/mrm8488/bert-base-spanish-wwm-cased-finetuned-spa-squad2-es/README.md)
 
 This model was fine-tuned on the same dataset but using **distillation** during the process as mentioned above (and one more train epoch).
 
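As a usage note for the model card being edited above: a Q&A model like this is typically loaded through the `transformers` question-answering pipeline. A minimal sketch follows; the model id used here is an assumption inferred from the file path in the diff, so check the Hugging Face hub for the exact name before relying on it.

```python
# Hypothetical usage sketch for the distilled Spanish Q&A model described in the diff.
# NOTE: the model id below is an assumption based on the README path; verify it on the hub.
from transformers import pipeline

# Build a question-answering pipeline backed by the distilled BETO model.
qa = pipeline(
    "question-answering",
    model="mrm8488/distill-bert-base-spanish-wwm-cased-finetuned-spa-squad2-es",
)

# Ask a question against a Spanish context passage, SQuAD-style.
result = qa(
    question="¿Quién escribió El Quijote?",
    context="Miguel de Cervantes escribió El Quijote a principios del siglo XVII.",
)
print(result["answer"])
```

The pipeline returns a dict with `answer`, `score`, `start`, and `end` keys, where `start`/`end` are character offsets into the context.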