From 6f70bb8c69cdbe1207e4909ccface32c8f51297b Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?R=C3=A9mi=20Louf?=
Date: Wed, 20 Nov 2019 18:01:03 +0100
Subject: [PATCH] add instructions to run the examples

---
 README.md | 17 +++++++++++++++++
 1 file changed, 17 insertions(+)

diff --git a/README.md b/README.md
index a49e8086d..9cc783fd7 100644
--- a/README.md
+++ b/README.md
@@ -86,6 +86,18 @@ When TensorFlow 2.0 and/or PyTorch has been installed, you can install from sour
 pip install [--editable] .
 ```
 
+### Run the examples
+
+Examples are included in the repository but are not shipped with the library.
+To run them, first clone the repository and install the bleeding-edge version
+of the library. To do so, create a new virtual environment and follow these steps:
+
+```bash
+git clone git@github.com:huggingface/transformers
+cd transformers
+pip install .
+```
+
 ### Tests
 
 A series of tests are included for the library and the example scripts. Library tests can be found in the [tests folder](https://github.com/huggingface/transformers/tree/master/transformers/tests) and examples tests in the [examples folder](https://github.com/huggingface/transformers/tree/master/examples).
@@ -253,6 +265,11 @@ print("sentence_2 is", "a paraphrase" if pred_2 else "not a paraphrase", "of sen
 
 ## Quick tour of the fine-tuning/usage scripts
 
+**Important**
+Before running the fine-tuning scripts, please read the
+[instructions](#run-the-examples) on how to
+set up your environment to run the examples.
+
 The library comprises several example scripts with SOTA performances for NLU and NLG tasks:
 
 - `run_glue.py`: an example fine-tuning Bert, XLNet and XLM on nine different GLUE tasks (*sequence-level classification*)
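The added instructions tell readers to "create a new virtual environment" before installing, without showing how. A minimal sketch of that step (the environment name `.venv` is illustrative, not mandated by the patch) could be:

```shell
# Create an isolated virtual environment so the source install does not
# modify the system Python; the name ".venv" is arbitrary.
python3 -m venv .venv

# Activate it; subsequent "pip install" commands target this environment only.
. .venv/bin/activate

# Confirm that "python" now resolves inside the environment.
python -c "import sys; print(sys.prefix)"
```

After activation, the `git clone` / `pip install .` sequence from the patch installs the library into this environment rather than system-wide.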