Commit graph

254 commits (50 listed below)

SHA1 Message (Branch) Date
76b1edd511 Refactor for clarity, might break a few tests though (transformers) 2022-11-07 05:44:00 +01:00
2406f42a80 Testing bigger LLM config, referencing 'Attention Is All You Need' (transformers) 2022-11-07 18:15:00 +01:00
2296e78191 Refactor for clarity, might break a few tests though (transformers) 2022-11-07 08:29:00 +01:00
a1348c1ce6 Refactor for clarity, might break a few tests though (transformers) 2022-10-24 04:51:00 +02:00
e1afb54672 Experimenting with FPGA constraints (source: Trimberger 'Three Ages of FPGAs') (transformers) 2022-10-18 23:05:00 +02:00
efeff8e66b Refactor for clarity, might break a few tests though (transformers) 2022-10-18 08:45:00 +02:00
9c3bf3cc43 Trying out boneh-franklin approach for IBE (ref. 2003 paper) (transformers) 2022-10-18 19:06:00 +02:00
5f35d6f7c1 Refactor for clarity, might break a few tests though (transformers) 2022-10-17 00:01:00 +02:00
d401eee55b Implementing approach from a new paper read last night (transformers) 2022-10-17 04:29:00 +02:00
62b1cf47df Trying out boneh-franklin approach for IBE (ref. 2003 paper) (transformers) 2022-10-17 23:09:00 +02:00
037e879c67 Trying out boneh-franklin approach for IBE (ref. 2003 paper) (transformers) 2022-10-17 21:23:00 +02:00
445fd106c4 Minor doc updates: linking to article on quantization (transformers) 2022-10-12 18:46:00 +02:00
1013e5af57 Experimenting with FPGA constraints (source: Trimberger 'Three Ages of FPGAs') (transformers) 2022-10-12 04:42:00 +02:00
9022f8aa56 Experimenting with FPGA constraints (source: Trimberger 'Three Ages of FPGAs') (transformers) 2022-09-22 01:13:00 +02:00
7a48098e05 Minor doc updates: linking to article on quantization (transformers) 2022-09-22 18:32:00 +02:00
d042c73b31 Quick fix, referencing a known issue from the official repo (transformers) 2022-09-22 01:32:00 +02:00
4a3ffa8dcc Trying out boneh-franklin approach for IBE (ref. 2003 paper) (transformers) 2022-09-20 03:01:00 +02:00
b1c143567c Late-night bugfix on financial RL environment (transformers) 2022-09-20 05:54:00 +02:00
3702375710 Trying out boneh-franklin approach for IBE (ref. 2003 paper) (transformers) 2022-09-20 17:53:00 +02:00
b82bf662fc Refactor for clarity, might break a few tests though (transformers) 2022-09-20 00:19:00 +02:00
d091e96b05 Trying out boneh-franklin approach for IBE (ref. 2003 paper) (transformers) 2022-09-05 21:56:00 +02:00
dc892e51ac Implementing approach from a new paper read last night (transformers) 2022-09-05 21:24:00 +02:00
7be71eed5c Trying out boneh-franklin approach for IBE (ref. 2003 paper) (transformers) 2022-09-05 03:03:00 +02:00
8686d7ddcf Experimenting with FPGA constraints (source: Trimberger 'Three Ages of FPGAs') (transformers) 2022-09-03 05:43:00 +02:00
c726ecf482 Implementing approach from a new paper read last night (transformers) 2022-09-03 02:58:00 +02:00
7601ac87d5 Testing bigger LLM config, referencing 'Attention Is All You Need' (transformers) 2022-09-03 22:24:00 +02:00
2c347d50ef Minor doc updates: linking to article on quantization (transformers) 2022-09-03 23:29:00 +02:00
6fd5f7ed64 Testing bigger LLM config, referencing 'Attention Is All You Need' (transformers) 2022-08-19 04:21:00 +02:00
7a360a7783 Minor doc updates: linking to article on quantization (transformers) 2022-08-19 21:50:00 +02:00
9e2ed1475a Trying out boneh-franklin approach for IBE (ref. 2003 paper) (transformers) 2022-08-19 01:24:00 +02:00
3201e855a8 Testing bigger LLM config, referencing 'Attention Is All You Need' (transformers) 2022-08-19 07:57:00 +02:00
ba390f17e9 Quick fix, referencing a known issue from the official repo (transformers) 2022-07-27 00:20:00 +02:00
ca8e56b3ae Testing bigger LLM config, referencing 'Attention Is All You Need' (transformers) 2022-07-27 18:36:00 +02:00
811916e67b Testing bigger LLM config, referencing 'Attention Is All You Need' (transformers) 2022-07-27 04:45:00 +02:00
ab6c656fe1 Experimenting with FPGA constraints (source: Trimberger 'Three Ages of FPGAs') (transformers) 2022-07-27 18:19:00 +02:00
1bf46bcc4d Refactor for clarity, might break a few tests though (transformers) 2022-06-27 00:13:00 +02:00
95f7de65a1 Testing bigger LLM config, referencing 'Attention Is All You Need' (transformers) 2022-06-10 04:22:00 +02:00
29201f1f9a Testing bigger LLM config, referencing 'Attention Is All You Need' (transformers) 2022-06-10 23:51:00 +02:00
3526be79d5 Quick fix, referencing a known issue from the official repo (transformers) 2022-06-10 06:22:00 +02:00
df513c7358 Implementing approach from a new paper read last night (transformers) 2022-06-10 02:49:00 +02:00
0d64e32694 Testing bigger LLM config, referencing 'Attention Is All You Need' (transformers) 2022-06-02 23:54:00 +02:00
591a0df2a6 Quick fix, referencing a known issue from the official repo (transformers) 2022-06-02 23:15:00 +02:00
e3cf59f50f Implementing approach from a new paper read last night (transformers) 2022-06-02 01:44:00 +02:00
c009b890ea Quick fix, referencing a known issue from the official repo (transformers) 2022-06-02 20:06:00 +02:00
0b369297e3 Trying out boneh-franklin approach for IBE (ref. 2003 paper) (transformers) 2022-06-01 19:20:00 +02:00
ada4f46222 Refactor for clarity, might break a few tests though (transformers) 2022-06-01 00:40:00 +02:00
d7d7e71b60 Quick fix, referencing a known issue from the official repo (transformers) 2022-03-04 22:09:00 +01:00
09408fb813 Experimenting with FPGA constraints (source: Trimberger 'Three Ages of FPGAs') (transformers) 2022-03-04 22:50:00 +01:00
9f8d9519ba Quick fix, referencing a known issue from the official repo (transformers) 2022-01-09 17:07:00 +01:00
717be3fbd2 Quick fix, referencing a known issue from the official repo (transformers) 2022-01-09 17:57:00 +01:00
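A one-line-per-commit listing in the style above (abbreviated SHA, subject, ISO-style date) can be produced with `git log`'s custom pretty formats. This is a minimal sketch against a throwaway demo repository, not this project's actual repo; the demo identity and commit message are placeholders:

```shell
# Minimal sketch: build a throwaway repo, then print each commit as
# "<abbreviated SHA> <subject> <ISO date>", similar to the table above.
tmp=$(mktemp -d) && cd "$tmp"
git init -q
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "Refactor for clarity"
git log --pretty=format:'%h %s %ad' --date=iso
```

Note that `--date=iso` prints the timezone as `+0100`; the `+01:00` form seen above corresponds to `--date=iso-strict` (which also uses a `T` separator), so the page likely applies its own date formatting.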