Mirror of https://github.com/saymrwulf/transformers.git, synced 2026-05-14 20:58:08 +00:00
add instructions to run the examples
parent 0cdfcca24b
commit 6f70bb8c69
1 changed file with 17 additions and 0 deletions

README.md: 17 additions
@@ -86,6 +86,18 @@ When TensorFlow 2.0 and/or PyTorch has been installed, you can install from source
pip install [--editable] .
```

### Run the examples

Examples are included in the repository but are not shipped with the library.
Therefore, in order to run the examples you will first need to clone the
repository and install the bleeding edge version of the library. To do so, create a new virtual environment and follow these steps:

```bash
git clone git@github.com:huggingface/transformers
cd transformers
pip install .
```
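The steps above assume the new virtual environment mentioned in the preceding paragraph is already active; a minimal sketch of creating one (the environment name `.env` is illustrative, not part of the original instructions):

```bash
# Create and activate a new virtual environment (name is illustrative)
python -m venv .env
source .env/bin/activate
```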

### Tests

A series of tests are included for the library and the example scripts. Library tests can be found in the [tests folder](https://github.com/huggingface/transformers/tree/master/transformers/tests) and examples tests in the [examples folder](https://github.com/huggingface/transformers/tree/master/examples).
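As a rough sketch of how those tests could be run from a cloned checkout (assuming `pytest` is installed separately; it is not covered by the steps above, and the exact invocation may differ between versions):

```bash
# Install the test runner (assumption: not installed by the steps above)
pip install pytest

# Run the library tests
python -m pytest -sv ./transformers/tests/

# Run the examples tests
python -m pytest -sv ./examples/
```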
@@ -253,6 +265,11 @@ print("sentence_2 is", "a paraphrase" if pred_2 else "not a paraphrase", "of sentence_1")

## Quick tour of the fine-tuning/usage scripts

**Important**
Before running the fine-tuning scripts, please read the
[instructions](#run-the-examples) on how to
set up your environment to run the examples.

The library comprises several example scripts with SOTA performance for NLU and NLG tasks:

- `run_glue.py`: an example fine-tuning Bert, XLNet and XLM on nine different GLUE tasks (*sequence-level classification*)
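For orientation only, a hypothetical sketch of launching `run_glue.py` on one GLUE task from a cloned checkout; the `$GLUE_DIR` location and the exact flag names are assumptions and should be checked against the script's `--help` output:

```bash
# Hypothetical invocation; verify flag names with: python ./examples/run_glue.py --help
export GLUE_DIR=/path/to/glue        # assumption: GLUE data downloaded here
python ./examples/run_glue.py \
    --model_type bert \
    --model_name_or_path bert-base-uncased \
    --task_name MRPC \
    --do_train \
    --do_eval \
    --data_dir $GLUE_DIR/MRPC \
    --max_seq_length 128 \
    --output_dir /tmp/mrpc_output/
```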