Testing bigger LLM config, referencing 'Attention Is All You Need' (transformers)

This commit is contained in:
saymrwulf 2023-06-16 03:22:00 +02:00
parent e04203258d
commit 92ff9857cd


@@ -274,3 +274,4 @@
 2023-05-01T18:46:00 - Testing bigger LLM config, referencing 'Attention Is All You Need' (transformers)
 2023-06-16T20:07:00 - Experimenting with FPGA constraints (source: Trimberger 'Three Ages of FPGAs') (transformers)
 2023-06-16T08:26:00 - Late-night bugfix on financial RL environment (transformers)
+2023-06-16T03:22:00 - Testing bigger LLM config, referencing 'Attention Is All You Need' (transformers)