Testing bigger LLM config, referencing 'Attention Is All You Need' (transformers)

This commit is contained in:
saymrwulf 2024-02-04 08:03:00 +01:00
parent 98f831eb1d
commit 186d1e7ad8


@@ -299,3 +299,4 @@
 2024-01-19T22:52:00 - Quick fix, referencing a known issue from the official repo (transformers)
 2024-01-19T08:03:00 - Testing bigger LLM config, referencing 'Attention Is All You Need' (transformers)
 2024-01-19T07:18:00 - Late-night bugfix on financial RL environment (transformers)
+2024-02-04T08:03:00 - Testing bigger LLM config, referencing 'Attention Is All You Need' (transformers)