Commit | Message | Date
fd682a94f8 | Merge pull request #62 from LaurentMazare/safetensors_integration: Adding saving capabilities. | 2023-07-03 15:40:00 +02:00
0b3cc215f1 | Address comments. | 2023-07-03 13:52:27 +02:00
5bc66c68fa | Adding saving capabilities. | 2023-07-03 13:39:24 +02:00
fdb1acd2ff | Move llama in a cargo-examples directory. | 2023-07-03 11:30:58 +01:00
81cec86e75 | Adding a bit more docs around safety. | 2023-07-03 11:55:54 +02:00
639270b796 | Use the patched gemm for the time being. | 2023-07-03 10:29:15 +01:00
899c76de75 | Handle more types in safetensors. | 2023-07-03 10:09:46 +01:00
783b7054ee | Move more safetensors bits to the shared module. | 2023-07-03 09:34:08 +01:00
fe2c07e368 | Add the ST error. | 2023-07-03 08:44:00 +01:00
cf2789fb81 | Move some safetensors bits in the candle-core crate. | 2023-07-03 08:37:46 +01:00
78871ffe38 | Add dtype support. | 2023-07-02 20:12:26 +01:00
7c65e2d187 | Add a flag for custom prompt. | 2023-07-01 06:36:22 +01:00
bbe0c5fbaa | Do not use rayon for a single thread (bis). | 2023-06-30 18:47:22 +01:00
6b67d25d9f | Do not use rayon for a single thread. | 2023-06-30 18:46:32 +01:00
679b6987b6 | Early conversion for the llama weights. | 2023-06-30 16:42:53 +01:00
ed4d0959d3 | Add a const to easily tweak the dtype used for llama internal computations. | 2023-06-30 15:01:39 +01:00
fbc329ed85 | Add the verbose cpu cast operations. | 2023-06-30 10:33:29 +01:00
8ad47907f3 | Add the kernels. | 2023-06-30 10:26:56 +01:00
19cbbc5212 | Improve how we check that the dims are in bounds. | 2023-06-30 09:11:00 +01:00
f6152e74b6 | Tweak the kv-cache flag. | 2023-06-29 22:16:40 +01:00
ae3f202f3b | Add a flag. | 2023-06-29 22:12:15 +01:00
23389b1bd7 | Enable the KV cache after fixing the caching length and the rope bits. | 2023-06-29 22:00:57 +01:00
b50bd880ce | Only narrow when needed + deactivate the kv cache. | 2023-06-29 19:07:52 +01:00
3232df9458 | Add some KV cache to llama. | 2023-06-29 15:29:40 +01:00
889f7e0971 | Merge pull request #39 from LaurentMazare/anyhow-backtrace: Add backtraces. | 2023-06-29 13:17:53 +01:00
e27ee98d3f | Add backtraces. | 2023-06-29 13:17:20 +01:00
78ec40b077 | Typo. | 2023-06-29 12:09:53 +00:00
de48e6fd59 | Putting back main. | 2023-06-29 12:08:35 +00:00
0958c588f6 | Putting back seed. | 2023-06-29 12:07:21 +00:00
c5e8f788be | Revert some changes. | 2023-06-29 12:05:53 +00:00
e63ed6aaa3 | Remove unwrap. | 2023-06-29 12:04:25 +00:00
2fe1d3e36d | Moving llama to f16. | 2023-06-29 12:00:16 +00:00
b4dc9f6108 | Add a seed parameter to llama. | 2023-06-29 12:47:19 +01:00
53628db3a9 | Merge pull request #36 from LaurentMazare/fix_example: Simple example fix. | 2023-06-29 13:36:05 +02:00
1913512f42 | Simple example fix. | 2023-06-29 11:10:57 +00:00
2741b39ad3 | Use broadcasted scalars for const tensors. | 2023-06-29 11:56:40 +01:00
3872dc4751 | Merge pull request #19 from LaurentMazare/llama_safetensors: Llama safetensors. | 2023-06-29 12:49:26 +02:00
b4aab7b95f | Put more requirements on the withdtype trait. | 2023-06-29 11:37:42 +01:00
c9c468e1aa | Use Map2 for binary ops. | 2023-06-29 10:09:15 +01:00
83c7d660ca | Add Map2. | 2023-06-29 10:05:06 +01:00
367170da45 | Also use Map1 for embedding. | 2023-06-29 09:45:27 +01:00
8ad03a5fb6 | Use Map1 on unary ops. | 2023-06-29 09:37:38 +01:00
fff13dbb4e | Factorize the kernel naming scheme. | 2023-06-29 09:29:59 +01:00
d3c7b0d168 | Use Map1 for sum. | 2023-06-29 09:27:07 +01:00
122e334d0c | Simplify the pattern matching logic in the cuda backend. | 2023-06-29 09:21:11 +01:00
eaa3ce359e | Cosmetic change. | 2023-06-28 22:02:23 +01:00
1328b5cb20 | Factor some code out. | 2023-06-28 21:56:44 +01:00
c583ee0f2c | Add map2. | 2023-06-28 21:38:01 +01:00
46c07b924c | Tweak some comment. | 2023-06-28 21:10:54 +01:00
2ae368e98e | Switch from a macro to a trait to make things more generic. | 2023-06-28 21:06:56 +01:00