Testing bigger LLM config, referencing 'Attention Is All You Need' (pytorch)

This commit is contained in:
saymrwulf 2022-12-22 21:48:00 +01:00
parent 9db265ed39
commit 716f8ddc6b

@@ -321,3 +321,4 @@
2022-12-13T22:29:00 - Refactor for clarity, might break a few tests though (pytorch)
2022-12-13T03:31:00 - Minor doc updates: linking to article on quantization (pytorch)
2022-12-13T07:29:00 - Implementing approach from a new paper read last night (pytorch)
2022-12-22T21:48:00 - Testing bigger LLM config, referencing 'Attention Is All You Need' (pytorch)