huggingface/candle
Mirror of https://github.com/huggingface/candle.git, synced 2025-06-21 04:10:46 +00:00
candle / candle-transformers / src / models / wuerstchen (at 92a05b51cf8c88e40f03f6431b81f68a456f633f)
Latest commit: fb1c2ac535 by Laurent Mazare, 2023-09-20 14:07:55 +01:00
Add flash-attn support. (#912)
* Add flash-attn support. * Add the use-flash-attn flag. * Re-enable flash-attn.
File                      Last commit                                                      Date
attention_processor.rs    Add flash-attn support. (#912)                                   2023-09-20 14:07:55 +01:00
common.rs                 Add flash-attn support. (#912)                                   2023-09-20 14:07:55 +01:00
ddpm.rs                   Line-up the wuerstchen model with the python implementation. (#901)  2023-09-19 21:59:44 +01:00
diffnext.rs               Add flash-attn support. (#912)                                   2023-09-20 14:07:55 +01:00
mod.rs                    Specialized attention module for Wuerstchen. (#890)              2023-09-18 21:16:09 +01:00
paella_vq.rs              Add some missing biases. (#908)                                  2023-09-20 10:14:51 +01:00
prior.rs                  Add flash-attn support. (#912)                                   2023-09-20 14:07:55 +01:00
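Given the files in this directory, `mod.rs` would plausibly just re-export the sibling modules so the model is reachable as `candle_transformers::models::wuerstchen`. The following is a minimal sketch inferred from the file listing above, not the verbatim contents of the actual `mod.rs`; the `pub` visibility of each module is an assumption.

```rust
// Sketch of a mod.rs tying the wuerstchen submodules together.
// Module names are taken from the directory listing; visibility is assumed.
pub mod attention_processor; // specialized attention for Wuerstchen (#890)
pub mod common;              // shared building blocks
pub mod ddpm;                // DDPM scheduler
pub mod diffnext;            // DiffNeXt decoder
pub mod paella_vq;           // Paella VQ model
pub mod prior;               // prior network
```

With such declarations in place, downstream code could refer to items as, e.g., `wuerstchen::prior::...`, matching how candle's other model families are organized.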