mirror of
https://github.com/saymrwulf/transformers.git
synced 2026-05-14 20:58:08 +00:00
add docstring example for compute_loss_func (#35020)
This commit is contained in:
parent
31299670cd
commit
f0dec874f0
1 changed file with 1 addition and 2 deletions
@@ -360,8 +360,7 @@ class Trainer:
             inner layers, dropout probabilities etc).
         compute_loss_func (`Callable`, *optional*):
             A function that accepts the raw model outputs, labels, and the number of items in the entire accumulated
-            batch (batch_size * gradient_accumulation_steps) and returns the loss. For example, here is one using
-            the loss function from `transformers`
+            batch (batch_size * gradient_accumulation_steps) and returns the loss. For example, see the default [loss function](https://github.com/huggingface/transformers/blob/052e652d6d53c2b26ffde87e039b723949a53493/src/transformers/trainer.py#L3618) used by [`Trainer`].
         compute_metrics (`Callable[[EvalPrediction], Dict]`, *optional*):
             The function that will be used to compute metrics at evaluation. Must take a [`EvalPrediction`] and return
             a dictionary string to metric values. *Note* When passing TrainingArgs with `batch_eval_metrics` set to
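The docstring above describes `compute_loss_func` as taking the raw model outputs, the labels, and the number of items in the entire accumulated batch. Here is a minimal sketch of what such a function might look like for a causal-LM setup; the name and signature follow the docstring, but the token shifting, the `ignore_index` value, and the normalization details are illustrative assumptions, not the exact default loss used by `Trainer`:

```python
import torch
import torch.nn.functional as F

def compute_loss_func(outputs, labels, num_items_in_batch=None):
    """Illustrative custom loss for Trainer's `compute_loss_func` hook.

    Computes next-token cross-entropy and, when `num_items_in_batch` is
    given, normalizes by the item count across the entire accumulated
    batch (batch_size * gradient_accumulation_steps). Details beyond the
    docstring's description are assumptions made for this sketch.
    """
    # Model outputs may be a dict/ModelOutput or a tuple; take the logits.
    logits = outputs["logits"] if isinstance(outputs, dict) else outputs[0]
    # Shift so that position i predicts token i+1 (standard causal-LM loss).
    shift_logits = logits[..., :-1, :].contiguous()
    shift_labels = labels[..., 1:].contiguous()
    loss = F.cross_entropy(
        shift_logits.view(-1, shift_logits.size(-1)),
        shift_labels.view(-1),
        ignore_index=-100,  # conventionally used to mask padding labels
        reduction="sum" if num_items_in_batch is not None else "mean",
    )
    if num_items_in_batch is not None:
        # Summing then dividing by the full accumulated-batch item count
        # keeps the loss scale consistent across gradient-accumulation steps.
        loss = loss / num_items_in_batch
    return loss
```

A function with this shape could then be passed to `Trainer(..., compute_loss_func=compute_loss_func)`.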