# candle-mamba: Mamba implementation

Candle implementation of *Mamba* [1], inference only. Mamba is an alternative
to the transformer architecture. It leverages State Space Models (SSMs) with
the goal of being computationally efficient on long sequences. The
implementation is based on
[mamba.rs](https://github.com/LaurentMazare/mamba.rs).

- [1]. [Mamba: Linear-Time Sequence Modeling with Selective State Spaces](https://arxiv.org/abs/2312.00752).
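The core of an SSM is a linear recurrence over the sequence: the hidden state is updated as `h_t = a * h_{t-1} + b * x_t` and the output is `y_t = c * h_t`, so a sequence can be processed in a single left-to-right scan. The toy Rust sketch below illustrates this scalar recurrence; it is purely illustrative (the `ssm_scan` function and its parameters are hypothetical) and is not the candle implementation, which uses learned, input-dependent parameters over vector-valued states.

```rust
// Toy scalar state-space scan: h_t = a * h_{t-1} + b * x_t, y_t = c * h_t.
// Runs in O(sequence length), which is what makes SSMs attractive for long
// sequences compared to quadratic attention.
fn ssm_scan(xs: &[f32], a: f32, b: f32, c: f32) -> Vec<f32> {
    let mut h = 0.0f32;
    xs.iter()
        .map(|&x| {
            h = a * h + b * x;
            c * h
        })
        .collect()
}

fn main() {
    // With a = 0 the state has no memory, so y_t = c * b * x_t.
    let ys = ssm_scan(&[1.0, 2.0, 3.0], 0.0, 1.0, 2.0);
    println!("{:?}", ys); // prints [2.0, 4.0, 6.0]
}
```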

Compared to the mamba-minimal example, this version is far more efficient but
only supports inference.

## Running the example

```bash
$ cargo run --example mamba --release -- --prompt "Mamba is the"
```