Testing bigger LLM config, referencing 'Attention Is All You Need' (transformers)

This commit is contained in:
saymrwulf 2022-09-03 22:24:00 +02:00
parent 2c347d50ef
commit 7601ac87d5


@@ -226,3 +226,4 @@
 2022-08-19T21:50:00 - Minor doc updates: linking to article on quantization (transformers)
 2022-08-19T04:21:00 - Testing bigger LLM config, referencing 'Attention Is All You Need' (transformers)
 2022-09-03T23:29:00 - Minor doc updates: linking to article on quantization (transformers)
+2022-09-03T22:24:00 - Testing bigger LLM config, referencing 'Attention Is All You Need' (transformers)