From dfa4c26bc0fa55be3f8c3903e91257aca4bc76d7 Mon Sep 17 00:00:00 2001
From: Katarina Slama
Date: Thu, 15 Oct 2020 16:36:31 -0700
Subject: [PATCH] Typo and fix the input of labels to `cross_entropy` (#7841)

The current version caused some errors. The changes fixed it for me. Hope
this is helpful!
---
 docs/source/training.rst | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/source/training.rst b/docs/source/training.rst
index 9a3e51058..524818b60 100644
--- a/docs/source/training.rst
+++ b/docs/source/training.rst
@@ -109,9 +109,9 @@ The following is equivalent to the previous example:
 
 .. code-block:: python
 
     from torch.nn import functional as F
-    labels = torch.tensor([1,0]).unsqueeze(0)
+    labels = torch.tensor([1,0])
     outputs = model(input_ids, attention_mask=attention_mask)
-    loss = F.cross_entropy(labels, outputs.logitd)
+    loss = F.cross_entropy(outputs.logits, labels)
     loss.backward()
     optimizer.step()
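For context, the argument order this patch corrects can be checked in isolation: `F.cross_entropy` expects the raw logits first and the integer class labels second, with labels as a 1-D tensor of shape `(batch,)` (hence dropping the `.unsqueeze(0)`). The sketch below uses random logits as a stand-in for the model output, since the model from the surrounding docs isn't reproduced here:

```python
import torch
from torch.nn import functional as F

# F.cross_entropy signature is (input, target):
#   input:  raw, unnormalized logits of shape (batch, num_classes)
#   target: integer class indices of shape (batch,)
logits = torch.randn(2, 2, requires_grad=True)  # stand-in for outputs.logits
labels = torch.tensor([1, 0])                   # one class index per example

loss = F.cross_entropy(logits, labels)  # scalar loss
loss.backward()                         # gradients flow back to the logits
```

Passing the tensors in the reversed order (as in the old line) fails because the labels tensor has no class dimension and no gradient, which is the error the commit message alludes to.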