Encoder Decoder Models
-----------------------------------------------------------------------------------------------------------------------

The :class:`~transformers.EncoderDecoderModel` can be used to initialize a sequence-to-sequence model with any
pretrained autoencoding model as the encoder and any pretrained autoregressive model as the decoder.

The effectiveness of initializing sequence-to-sequence models with pretrained checkpoints for sequence generation tasks
was shown in `Leveraging Pre-trained Checkpoints for Sequence Generation Tasks <https://arxiv.org/abs/1907.12461>`__ by
Sascha Rothe, Shashi Narayan and Aliaksei Severyn.

After such an :class:`~transformers.EncoderDecoderModel` has been trained/fine-tuned, it can be saved/loaded just like
any other model (see the examples for more information).

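As a minimal sketch of that save/load round trip (assuming the ``bert-base-uncased`` checkpoint is reachable; any autoencoding/autoregressive checkpoint pair would do, and the output directory name is illustrative):

```python
# Hedged sketch: compose, save, and reload an EncoderDecoderModel.
from transformers import EncoderDecoderModel

# The encoder is loaded as an autoencoding model and the decoder as an
# autoregressive one; the decoder's cross-attention layers are added and
# randomly initialized, so the composed model still needs fine-tuning.
model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "bert-base-uncased", "bert-base-uncased"
)

# Once trained, it saves/loads like any other transformers model.
model.save_pretrained("my-seq2seq-checkpoint")
model = EncoderDecoderModel.from_pretrained("my-seq2seq-checkpoint")
```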
One application of this architecture is to leverage two pretrained :class:`~transformers.BertModel` instances as the
encoder and decoder of a summarization model, as was shown in `Text Summarization with Pretrained Encoders
<https://arxiv.org/abs/1908.08345>`__ by Yang Liu and Mirella Lapata.

EncoderDecoderConfig
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.EncoderDecoderConfig
    :members:

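A minimal sketch of composing this configuration from two sub-model configurations (both sides use default BERT configs purely for illustration):

```python
# Hedged sketch: build an EncoderDecoderConfig from two sub-configs.
from transformers import BertConfig, EncoderDecoderConfig

encoder_config = BertConfig()
decoder_config = BertConfig()

config = EncoderDecoderConfig.from_encoder_decoder_configs(
    encoder_config, decoder_config
)

# The decoder config is switched to decoder mode so that the decoder
# attends causally and cross-attends to the encoder outputs.
print(config.decoder.is_decoder)  # True
```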
EncoderDecoderModel
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: transformers.EncoderDecoderModel
    :members: forward, from_encoder_decoder_pretrained