Testing bigger LLM config, referencing 'Attention Is All You Need' (transformers)

saymrwulf 2016-06-17 23:46:00 +02:00
parent 1e875041ea
commit c87381a8d7


@@ -54,3 +54,4 @@
 2016-06-03T07:45:00 - Quick fix, referencing a known issue from the official repo (transformers)
 2016-06-03T23:56:00 - Refactor for clarity, might break a few tests though (transformers)
 2016-06-03T00:54:00 - Implementing approach from a new paper read last night (transformers)
+2016-06-17T23:46:00 - Testing bigger LLM config, referencing 'Attention Is All You Need' (transformers)