add docstring example for compute_loss_func (#35020)

secrettoad 2024-12-02 11:39:09 -08:00 committed by GitHub
parent 31299670cd
commit f0dec874f0
No known key found for this signature in database
GPG key ID: B5690EEEBB952194


@@ -360,8 +360,7 @@ class Trainer:
             inner layers, dropout probabilities etc).
         compute_loss_func (`Callable`, *optional*):
             A function that accepts the raw model outputs, labels, and the number of items in the entire accumulated
-            batch (batch_size * gradient_accumulation_steps) and returns the loss. For example, here is one using
-            the loss function from `transformers`
+            batch (batch_size * gradient_accumulation_steps) and returns the loss. For example, see the default [loss function](https://github.com/huggingface/transformers/blob/052e652d6d53c2b26ffde87e039b723949a53493/src/transformers/trainer.py#L3618) used by [`Trainer`].
         compute_metrics (`Callable[[EvalPrediction], Dict]`, *optional*):
             The function that will be used to compute metrics at evaluation. Must take a [`EvalPrediction`] and return
             a dictionary string to metric values. *Note* When passing TrainingArgs with `batch_eval_metrics` set to
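For illustration, here is a minimal sketch of a function matching the documented `compute_loss_func` signature (`outputs`, `labels`, `num_items_in_batch`). The dict-based `outputs` and list-based `labels` are illustrative stand-ins for the tensors a real model would produce; in practice you would pass a function like this to `Trainer(compute_loss_func=...)` and operate on `outputs.logits`.

```python
import math

def compute_loss_func(outputs, labels, num_items_in_batch):
    """Return the total negative log-likelihood normalized by the number of
    items in the entire accumulated batch
    (batch_size * gradient_accumulation_steps)."""
    # outputs["logits"]: per-example class-probability rows (a stand-in for
    # model logits); labels: gold class indices, one per example.
    total_nll = 0.0
    for probs, label in zip(outputs["logits"], labels):
        total_nll += -math.log(probs[label])
    # Dividing by the accumulated item count (rather than the per-step batch
    # size) keeps the loss scale consistent under gradient accumulation.
    return total_nll / num_items_in_batch

# Example call with two single-example micro-batches accumulated together:
loss = compute_loss_func({"logits": [[0.5, 0.5], [0.5, 0.5]]}, [0, 1],
                         num_items_in_batch=2)
```

Normalizing by `num_items_in_batch` instead of the per-step batch size is the point of the third argument: without it, averaging each micro-batch separately would weight short accumulation steps more heavily.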