mirror of
https://github.com/saymrwulf/transformers.git
synced 2026-05-14 20:58:08 +00:00
Update README.md (#8406)
This commit is contained in:
parent
7247d0b4ea
commit
507dfb40c3
1 changed file with 2 additions and 2 deletions
@@ -3,9 +3,9 @@ language:
 - ru
 - en
 ---
 
-## RuDR-BERT
+## EnRuDR-BERT
 
-EnRuDR-BERT - Multilingual, Cased, which pretrained on the raw part of the RuDReC corpus (1.4M reviews) and collecting of consumer comments on drug administration from [2]. Pre-training was based on the [original BERT code](https://github.com/google-research/bert) provided by Google. In particular, Multi-BERT was for used for initialization; vocabulary of Russian subtokens and parameters are the same as in Multi-BERT. Training details are described in our paper. \
+EnRuDR-BERT - Multilingual, Cased, which pretrained on the raw part of the RuDReC corpus (1.4M reviews) and english collection of consumer comments on drug administration from [2]. Pre-training was based on the [original BERT code](https://github.com/google-research/bert) provided by Google. In particular, Multi-BERT was for used for initialization; vocabulary of Russian subtokens and parameters are the same as in Multi-BERT. Training details are described in our paper. \
 
 link: https://yadi.sk/d/-PTn0xhk1PqvgQ
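The model card being edited describes a BERT-style encoder pretrained with the original Google BERT code. A minimal sketch of how such a checkpoint is typically loaded with the Hugging Face transformers Auto* API follows; the hub identifier is a placeholder assumption, since the diff only gives a Yandex-Disk download link, not a hub model name.

```python
# Hedged sketch: loading a BERT-style checkpoint such as EnRuDR-BERT via the
# transformers AutoModel / AutoTokenizer API. The hub id passed to this helper
# is a placeholder assumption, not taken from the commit above.
def load_encoder(name: str):
    """Fetch tokenizer and encoder weights for `name` (requires network access)."""
    from transformers import AutoModel, AutoTokenizer  # deferred: only needed at load time
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = AutoModel.from_pretrained(name)
    return tokenizer, model

if __name__ == "__main__":
    # "<enrudr-bert-hub-id>" is a placeholder; substitute the real identifier.
    tokenizer, model = load_encoder("<enrudr-bert-hub-id>")
    inputs = tokenizer("Аспирин помог от головной боли.", return_tensors="pt")
    hidden = model(**inputs).last_hidden_state  # shape: (1, seq_len, hidden_size)
```

The deferred import keeps the helper importable even where transformers is not installed; the actual download happens only when the function is called.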