a4967600d0  More general seq forward functions for RNNs. (#1050)  [2023-10-07 15:08:01 +01:00]

c798184c2b  Configurable layer idx for the lstm layer. (#962)  [2023-09-25 21:31:14 +01:00]

7b1ddcff47  Add clone to various nn layers. (#910)  [2023-09-20 11:33:51 +01:00]

af552a5274  Fix the rnn tests for accelerate. (#704)  [2023-09-01 13:21:38 +01:00]

db59816087  Add a GRU layer. (#688)  [2023-08-31 08:43:10 +01:00]
  * Add a GRU layer.
  * Fix the n gate computation.

21e1c73892  Add a LSTM test. (#681)  [2023-08-30 20:05:42 +02:00]
  * Add a LSTM test.
  * Clippy.

ad8a62dbf5  Add tanh. (#675)  [2023-08-30 13:54:50 +01:00]
  * Add tanh.
  * Use tanh in the lstm block.
  * Add a test for tanh forward and backward passes.

f35b9f6baa  Add some recurrent neural networks (#674)  [2023-08-30 13:27:09 +01:00]
  * Add the rnn module.
  * More LSTM.
  * Implement the RNN forward pass.
  * More forward pass for LSTM.