parent 94056b57be
commit 188a8bfccc
2 changed files with 2 additions and 2 deletions
@@ -678,7 +678,7 @@ model.save_pretrained("/path/to/converted/checkpoint/folder")
 
 **7. Implement the forward pass**
 
 Having managed to correctly load the pretrained weights into the 🤗 Transformers implementation, you should now make
-sure that the forward pass is correctly implemented. In [Get familiar with the original repository](#run-a-pretrained-checkpoint-using-the-original-repository), you have already created a script that runs a forward
+sure that the forward pass is correctly implemented. In [Get familiar with the original repository](#34-run-a-pretrained-checkpoint-using-the-original-repository), you have already created a script that runs a forward
 pass of the model using the original repository. Now you should write an analogous script using the 🤗 Transformers
 implementation instead of the original one. It should look as follows:
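The hunk cuts off just before the script itself. For reference, a minimal sketch of such a comparison script might look like the following; `BrandNewBertModel` is the guide's placeholder class name (substitute your model's actual class), and `original_output` is assumed to be the tensor produced by the forward-pass script you already wrote against the original repository:

```python
import torch

from transformers import BrandNewBertModel  # placeholder class name from the guide

# Load the checkpoint that was converted and saved with save_pretrained() above.
model = BrandNewBertModel.from_pretrained("/path/to/converted/checkpoint/folder")
model.eval()  # disable dropout etc. so the output is deterministic

# Use exactly the same input IDs as the script that runs the original
# repository, so the two forward passes are directly comparable.
input_ids = torch.tensor([[0, 4, 4, 3, 2, 4, 1, 7, 19]])

with torch.no_grad():
    output = model(input_ids).last_hidden_state

# `original_output` would be the tensor produced by the original repository's
# forward pass on the same input; the guide's customary bar is agreement
# within a tolerance of about 1e-3.
# assert torch.allclose(output, original_output, atol=1e-3)
```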
@@ -848,7 +848,7 @@ model.save_pretrained("/path/to/converted/checkpoint/folder")
 Having managed to correctly load the pretrained weights into the 🤗
 Transformers implementation, you should now make sure that the forward
 pass is correctly implemented. In [Get familiar with the original
-repository](#run-a-pretrained-checkpoint-using-the-original-repository),
+repository](#34-run-a-pretrained-checkpoint-using-the-original-repository),
 you have already created a script that runs a forward pass of the model
 using the original repository. Now you should write an analogous script
 using the 🤗 Transformers implementation instead of the original one. It