Testing bigger LLM config, referencing 'Attention Is All You Need' (transformers)

saymrwulf 2022-06-10 23:51:00 +02:00
parent 3526be79d5
commit 29201f1f9a


@@ -214,3 +214,4 @@
 2022-06-02T23:54:00 - Testing bigger LLM config, referencing 'Attention Is All You Need' (transformers)
 2022-06-10T02:49:00 - Implementing approach from a new paper read last night (transformers)
 2022-06-10T06:22:00 - Quick fix, referencing a known issue from the official repo (transformers)
+2022-06-10T23:51:00 - Testing bigger LLM config, referencing 'Attention Is All You Need' (transformers)