c78ce76501
Add a simple Module trait and implement it for the various nn layers (#500)
* Start adding the module trait.
* Use the module trait.
* Implement module for qmatmul.
2023-08-18 09:38:22 +01:00

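The idea behind a shared Module trait is that every nn layer exposes the same forward-pass entry point, so layers can be stored and chained behind one interface. A minimal sketch, assuming a toy `Vec<f32>` stand-in for the crate's real tensor type (the names `Module`, `forward`, and `Scale` here are illustrative, not the crate's exact API):

```rust
// A Module trait unifying the forward pass of nn layers, sketched with
// Vec<f32> as a stand-in tensor type.
trait Module {
    fn forward(&self, xs: &[f32]) -> Vec<f32>;
}

// Hypothetical layer that scales every element, to show a trait impl.
struct Scale(f32);

impl Module for Scale {
    fn forward(&self, xs: &[f32]) -> Vec<f32> {
        xs.iter().map(|x| x * self.0).collect()
    }
}

fn main() {
    // Layers behind a common trait object can be chained generically.
    let layers: Vec<Box<dyn Module>> = vec![Box::new(Scale(2.0)), Box::new(Scale(0.5))];
    let mut xs = vec![1.0f32, 2.0, 3.0];
    for layer in &layers {
        xs = layer.forward(&xs);
    }
    assert_eq!(xs, vec![1.0, 2.0, 3.0]); // doubled then halved
    println!("{:?}", xs);
}
```

The payoff is that model code can iterate over heterogeneous layers (linear, norm, qmatmul) without knowing their concrete types.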
13401df4d1
Add an abstract type for RmsNorm. (#499)
2023-08-18 08:52:14 +01:00

d32e8199cd
Layer norm tweaks (#482)
* Add some options to make layer-norm more configurable.
* Add the rms-norm variant.
* Replace the RmsNorm with the shared bits.
2023-08-17 10:07:13 +01:00

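The rms-norm variant mentioned above differs from standard layer-norm in that it skips mean subtraction and normalizes by the root-mean-square of the input alone. A minimal sketch, assuming plain slices rather than the crate's tensor type (the function name `rms_norm` and its signature are illustrative):

```rust
// Rms-norm sketch: divide by sqrt(mean(x^2) + eps), then apply a learned
// per-element scale. No mean subtraction, unlike layer-norm.
fn rms_norm(xs: &[f32], weight: &[f32], eps: f32) -> Vec<f32> {
    let mean_sq = xs.iter().map(|x| x * x).sum::<f32>() / xs.len() as f32;
    let inv_rms = 1.0 / (mean_sq + eps).sqrt();
    xs.iter().zip(weight).map(|(x, w)| x * inv_rms * w).collect()
}

fn main() {
    // RMS of [1, 2, 3] is sqrt(14/3) ≈ 2.16, so outputs are x / 2.16.
    let out = rms_norm(&[1.0, 2.0, 3.0], &[1.0, 1.0, 1.0], 1e-5);
    println!("{:?}", out);
}
```

Sharing the implementation between the two norms is natural since rms-norm is layer-norm with the centering step and bias removed.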
186c308d51
Wasm llama2 tweaks (#309)
* Clean up the llama2.c wasm example.
* Use a proper tokenizer.
2023-08-02 15:49:43 +01:00

4bf2ebf836
Use u8 tensors for masks. (#273)
2023-07-29 11:32:58 +01:00

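A u8 mask tensor typically marks positions to exclude (e.g. future tokens in causal attention), and is applied by overwriting masked scores with negative infinity so they vanish after softmax. A hedged sketch on plain slices; the helper name `masked_fill` and the 1-means-masked convention are assumptions for illustration:

```rust
// Apply a u8 mask (1 = masked) to attention scores: masked positions become
// -inf so they contribute zero probability after softmax.
fn masked_fill(scores: &[f32], mask: &[u8]) -> Vec<f32> {
    scores
        .iter()
        .zip(mask)
        .map(|(&s, &m)| if m == 1 { f32::NEG_INFINITY } else { s })
        .collect()
}

fn main() {
    let out = masked_fill(&[0.5, 1.5, 2.5], &[0, 1, 0]);
    assert_eq!(out, vec![0.5, f32::NEG_INFINITY, 2.5]);
    println!("{:?}", out);
}
```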
3eb2bc6d07
Softmax numerical stability. (#267)
* Softmax numerical stability.
* Fix the flash-attn test.
2023-07-28 13:13:01 +01:00

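The standard fix for softmax numerical stability is to subtract the row maximum before exponentiating: since softmax is shift-invariant, the result is unchanged, but `exp` can no longer overflow for large logits. A minimal sketch of the technique (not the crate's actual kernel):

```rust
// Numerically stable softmax: subtract the max before exp so the largest
// exponent is exp(0) = 1, avoiding f32 overflow on large inputs.
fn softmax(xs: &[f32]) -> Vec<f32> {
    let max = xs.iter().cloned().fold(f32::NEG_INFINITY, f32::max);
    let exps: Vec<f32> = xs.iter().map(|x| (x - max).exp()).collect();
    let sum: f32 = exps.iter().sum();
    exps.iter().map(|e| e / sum).collect()
}

fn main() {
    // A naive exp(1000.0) would overflow f32 to infinity; this does not.
    let probs = softmax(&[1000.0, 1000.0]);
    assert!((probs[0] - 0.5).abs() < 1e-6);
    let probs2 = softmax(&[1.0, 2.0, 3.0]);
    assert!((probs2.iter().sum::<f32>() - 1.0).abs() < 1e-6);
    println!("{:?}", probs2);
}
```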
160ba09d30
Polish the llama2 wasm ui. (#232)
* Polish the llama2 wasm ui.
* Readme update.
2023-07-24 15:28:27 +01:00

5a26cba733
Re-organize the wasm examples (#231)
* Move the whisper example.
* More renaming.
* Add llama2 as a new wasm example.
* Live generation.
* More of the llama wasm example.
* Formatting.
2023-07-24 12:36:02 +01:00