1e86717bf2
Fix a couple typos (#1451)
* Mixtral quantized instruct.
* Fix a couple typos.
2023-12-17 05:20:05 -06:00

805bf9ffa7
Implement top_p / nucleus sampling (#819)
* Implement top_p / nucleus sampling
* Update changelog
* rustfmt
* Add tests
* Fix clippy warning
* Fix another clippy error
2023-09-12 18:10:16 +02:00

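The commit above adds top_p (nucleus) sampling: keep only the smallest set of highest-probability tokens whose cumulative mass exceeds `top_p`, then renormalize before sampling. A minimal sketch of the technique follows; the function name and shape are illustrative assumptions, not candle's actual implementation.

```rust
// Illustrative top-p (nucleus) filter: zero out the low-probability tail and
// renormalize the surviving probabilities. Not the candle implementation.
fn top_p_filter(probs: &[f32], top_p: f32) -> Vec<f32> {
    // Sort token indices by descending probability.
    let mut idx: Vec<usize> = (0..probs.len()).collect();
    idx.sort_by(|&a, &b| probs[b].partial_cmp(&probs[a]).unwrap());

    // Keep the smallest prefix whose cumulative mass exceeds top_p.
    let mut keep = vec![false; probs.len()];
    let mut total = 0.0f32;
    for &i in &idx {
        keep[i] = true;
        total += probs[i];
        if total > top_p {
            break;
        }
    }

    // Zero out the tail and renormalize the kept probabilities.
    probs
        .iter()
        .enumerate()
        .map(|(i, &p)| if keep[i] { p / total } else { 0.0 })
        .collect()
}

fn main() {
    // With top_p = 0.75, only the two most likely tokens survive.
    let filtered = top_p_filter(&[0.5, 0.3, 0.15, 0.05], 0.75);
    println!("{filtered:?}");
}
```

Sampling then proceeds from the renormalized distribution, which is why the filtered probabilities still sum to 1.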
8395152d20
Llama2c WASM UI improvements (#732)
* pass seed, expose model seq_len
* wip new llama2.c ui
* final new UI example
* small copy
* copy
2023-09-04 15:59:22 +01:00

2fef14cb14
Add a repeat penalty to the llama2.c wasm example. (#709)
2023-09-01 19:32:28 +01:00

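The commit above adds a repeat penalty, a common trick for discouraging the model from re-emitting tokens it has already generated: the logits of previously seen tokens are scaled by a penalty factor greater than 1 before sampling. A hedged sketch, with an assumed function name and signature rather than the example's actual code:

```rust
// Illustrative repeat penalty: shrink positive logits and push negative
// logits further down for tokens that already appeared. Assumed shape, not
// candle's actual code. Note that a token listed twice in `prev_tokens` is
// penalized twice here; real implementations often deduplicate first.
fn apply_repeat_penalty(logits: &[f32], penalty: f32, prev_tokens: &[usize]) -> Vec<f32> {
    let mut out = logits.to_vec();
    for &t in prev_tokens {
        if let Some(l) = out.get_mut(t) {
            if *l >= 0.0 {
                *l /= penalty; // make a likely repeat less likely
            } else {
                *l *= penalty; // make an unlikely repeat even less likely
            }
        }
    }
    out
}

fn main() {
    // Tokens 0 and 1 were already generated; token 2 is untouched.
    let penalized = apply_repeat_penalty(&[2.0, -1.0, 0.5], 1.1, &[0, 1]);
    println!("{penalized:?}");
}
```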
8e84d8a59b
Llama2.c wasm module. (#686)
2023-08-31 07:44:32 +01:00

52414ba5c8
Bugfix for the llama2 wasm example. (#310)
* Clean-up the llama2.c wasm example.
* Use a proper tokenizer.
* Add a prompt.
* Bugfix for the llama2 wasm example.
2023-08-02 17:32:36 +01:00

186c308d51
Wasm llama2 tweaks (#309)
* Clean-up the llama2.c wasm example.
* Use a proper tokenizer.
2023-08-02 15:49:43 +01:00

3eb2bc6d07
Softmax numerical stability. (#267)
* Softmax numerical stability.
* Fix the flash-attn test.
2023-07-28 13:13:01 +01:00

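The commit above addresses softmax numerical stability. The standard fix is to subtract the maximum input before exponentiating: since softmax is invariant under a constant shift, this changes nothing mathematically but keeps `exp()` from overflowing on large logits. A minimal sketch of that technique (not the commit's actual code):

```rust
// Numerically stable softmax: shifting by the max keeps every exponent <= 0,
// so exp() cannot overflow even for very large inputs.
fn softmax(xs: &[f32]) -> Vec<f32> {
    let max = xs.iter().copied().fold(f32::NEG_INFINITY, f32::max);
    let exps: Vec<f32> = xs.iter().map(|&x| (x - max).exp()).collect();
    let sum: f32 = exps.iter().sum();
    exps.iter().map(|&e| e / sum).collect()
}

fn main() {
    // A naive exp(1000.0) overflows f32 to infinity; the shifted version
    // produces a well-behaved distribution.
    let probs = softmax(&[1000.0, 999.0, 998.0]);
    println!("{probs:?}");
}
```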
160ba09d30
Polish the llama2 wasm ui. (#232)
* Polish the llama2 wasm ui.
* readme update.
2023-07-24 15:28:27 +01:00

5a26cba733
Re-organize the wasm examples (#231)
* Move the whisper example.
* More renaming.
* Add llama2 as a new wasm example.
* Live generation.
* More of the llama wasm example.
* Formatting.
2023-07-24 12:36:02 +01:00