Testing bigger LLM config, referencing 'Attention Is All You Need' (transformers)

This commit is contained in:
saymrwulf 2020-03-30 19:39:00 +02:00
parent 1171e56597
commit 1ff2865a33

@@ -153,3 +153,4 @@
 2020-02-10T00:27:00 - Implementing approach from a new paper read last night (transformers)
 2020-02-26T19:15:00 - Quick fix, referencing a known issue from the official repo (transformers)
 2020-03-30T03:22:00 - Experimenting with FPGA constraints (source: Trimberger 'Three Ages of FPGAs') (transformers)
+2020-03-30T19:39:00 - Testing bigger LLM config, referencing 'Attention Is All You Need' (transformers)