huggingface / candle
Mirror of https://github.com/huggingface/candle.git, synced 2025-06-16 18:48:51 +00:00
Path: candle-examples/examples/llama
Commit: 89fd988836d2841c5c1eb197501a8e7db03675f1
Latest commit: f052ba76cb by Laurent Mazare, 2023-07-26 15:11:45 +01:00
Lining up the flash attn version with the non-flash one. (#248)
* Move the flash-attn function in the proper crate.
* Causality tweak.
Files:

convert_checkpoint.py
    Resurrect the llama npy support. (#140), 2023-07-11 19:32:10 +01:00
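
The npy support restored in #140 lets the Python script export LLaMA weights as NumPy archives that candle can read back without a PyTorch dependency on the Rust side. A minimal sketch of that Rust side, assuming candle-core's `Tensor::read_npz` and a hypothetical `llama.npz` output file (not a name the script is known to use):

```rust
// Sketch: reading tensors written out by a checkpoint-conversion script as a
// NumPy archive. "llama.npz" is a hypothetical file name used for illustration.
use candle_core::Tensor;

fn main() -> candle_core::Result<()> {
    // read_npz returns every array in the archive as a (name, tensor) pair.
    let tensors = Tensor::read_npz("llama.npz")?;
    for (name, tensor) in tensors.iter() {
        println!("{name}: shape {:?}, dtype {:?}", tensor.shape(), tensor.dtype());
    }
    Ok(())
}
```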
main.rs
    Better handling of dtypes in llama. (#243), 2023-07-26 08:28:33 +01:00
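
A hedged sketch of the kind of dtype handling #243 refers to: f16/bf16 models commonly upcast to f32 around numerically sensitive ops such as softmax, then cast back to the working dtype. The pattern below is illustrative, not the actual code in main.rs:

```rust
// Sketch of the usual dtype discipline for f16 models: upcast to f32 around
// softmax for numerical stability, then cast back. Illustrative pattern only.
use candle_core::{D, DType, Device, Tensor};

fn main() -> candle_core::Result<()> {
    let device = Device::Cpu;
    // Pretend these are f16 attention scores coming out of a matmul.
    let scores = Tensor::randn(0f32, 1.0, (4, 4), &device)?.to_dtype(DType::F16)?;
    let in_dtype = scores.dtype();
    // Softmax in f32, then back to the model's working dtype.
    let probs = candle_nn::ops::softmax(&scores.to_dtype(DType::F32)?, D::Minus1)?
        .to_dtype(in_dtype)?;
    println!("{probs}");
    Ok(())
}
```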
model.rs
    Lining up the flash attn version with the non-flash one. (#248), 2023-07-26 15:11:45 +01:00
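
A sketch of what lining up the two attention paths in #248 involves: both must produce the same scaled, causally masked result, with flash-attn handling scale and causality internally. It assumes `candle_flash_attn::flash_attn(q, k, v, softmax_scale, causal)` (CUDA only), f32 activations, and (batch, heads, seq, head_dim) inputs; it is not the actual model.rs code:

```rust
// Sketch: the flash and non-flash paths should compute the same scaled,
// causally masked attention, so they can be swapped behind a flag.
use candle_core::{D, Tensor};

fn attention(q: &Tensor, k: &Tensor, v: &Tensor, use_flash_attn: bool) -> candle_core::Result<Tensor> {
    let (_b, _heads, seq_len, head_dim) = q.dims4()?;
    let scale = 1.0 / (head_dim as f64).sqrt();
    if use_flash_attn {
        // flash-attn expects (batch, seq, heads, dim) and applies the scaling
        // and the causal mask internally, so transpose in and out.
        let (q, k, v) = (q.transpose(1, 2)?, k.transpose(1, 2)?, v.transpose(1, 2)?);
        return candle_flash_attn::flash_attn(&q, &k, &v, scale as f32, true)?.transpose(1, 2);
    }
    // Non-flash path: explicit scaled dot-product with a causal mask.
    let att = (q.matmul(&k.t()?)? * scale)?;
    // Upper-triangular mask: position i must not attend to any j > i.
    let mask: Vec<u8> = (0..seq_len)
        .flat_map(|i| (0..seq_len).map(move |j| u8::from(j > i)))
        .collect();
    let mask = Tensor::from_slice(&mask, (seq_len, seq_len), q.device())?.broadcast_as(att.dims())?;
    let neg_inf = Tensor::new(f32::NEG_INFINITY, q.device())?.broadcast_as(att.dims())?;
    let att = mask.where_cond(&neg_inf, &att)?;
    candle_nn::ops::softmax(&att, D::Minus1)?.matmul(v)
}
```

Passing `causal: true` to the flash kernel mirrors the explicit upper-triangular mask in the manual path, which is presumably what the commit's "causality tweak" aligns.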