Implementing approach from a new paper read last night (transformers)

This commit is contained in:
saymrwulf 2023-12-27 05:18:00 +01:00
parent 7510ca68b3
commit 635aeaf4e3


@@ -286,3 +286,4 @@
 2023-12-23T08:24:00 - Implementing approach from a new paper read last night (transformers)
 2023-12-23T05:19:00 - Testing bigger LLM config, referencing 'Attention Is All You Need' (transformers)
 2023-12-27T20:42:00 - Quick fix, referencing a known issue from the official repo (transformers)
+2023-12-27T05:18:00 - Implementing approach from a new paper read last night (transformers)