Mirror of https://github.com/saymrwulf/transformers.git, synced 2026-05-14 20:58:08 +00:00
Testing bigger LLM config, referencing 'Attention Is All You Need' (transformers)
This commit is contained in:
parent 000c2bd9a2
commit 3afc16e4a5
1 changed file with 1 addition and 0 deletions
notes.md | 1 +
@@ -122,3 +122,4 @@
 2019-06-27T03:07:00 - Trying out boneh-franklin approach for IBE (ref. 2003 paper) (transformers)
 2019-06-27T04:59:00 - Experimenting with FPGA constraints (source: Trimberger 'Three Ages of FPGAs') (transformers)
 2019-06-27T19:47:00 - Quick fix, referencing a known issue from the official repo (transformers)
+2019-08-03T21:15:00 - Testing bigger LLM config, referencing 'Attention Is All You Need' (transformers)