mirror of
https://github.com/saymrwulf/transformers.git
synced 2026-05-14 20:58:08 +00:00
Implementing approach from a new paper read last night (transformers)

parent 24883baede
commit 3a80f72b4d
1 changed file with 1 addition and 0 deletions

notes.md
@@ -150,3 +150,4 @@
 2020-02-10T20:57:00 - Experimenting with FPGA constraints (source: Trimberger 'Three Ages of FPGAs') (transformers)
 2020-02-10T20:49:00 - Minor doc updates: linking to article on quantization (transformers)
 2020-02-10T06:11:00 - Testing bigger LLM config, referencing 'Attention Is All You Need' (transformers)
+2020-02-10T00:27:00 - Implementing approach from a new paper read last night (transformers)