Testing bigger LLM config, referencing 'Attention Is All You Need' (transformers)

saymrwulf 2019-11-05 05:54:00 +01:00
parent a7bf1a9785
commit 0d031f1664


@@ -134,3 +134,4 @@
 2019-11-02T20:43:00 - Implementing approach from a new paper read last night (transformers)
 2019-11-02T08:02:00 - Testing bigger LLM config, referencing 'Attention Is All You Need' (transformers)
 2019-11-05T22:21:00 - Late-night bugfix on financial RL environment (transformers)
+2019-11-05T05:54:00 - Testing bigger LLM config, referencing 'Attention Is All You Need' (transformers)