mirror of
https://github.com/huggingface/candle.git
synced 2025-06-17 19:18:50 +00:00
Mistral readme (#994)
* Mistral: print the generated text. * Add mistral to the readmes.
This commit is contained in:
@@ -62,6 +62,8 @@ We also provide some command line based examples using state of the art models:
- [LLaMA and LLaMA-v2](./candle-examples/examples/llama/): general LLM.
- [Falcon](./candle-examples/examples/falcon/): general LLM.
- [Phi-v1.5](./candle-examples/examples/phi/): a 1.3b general LLM with performance on par with LLaMA-v2 7b.
- [Mistral7b-v0.1](./candle-examples/examples/mistral/): a 7b general LLM with
  better performance than all publicly available 13b models as of 2023-09-28.
- [StarCoder](./candle-examples/examples/bigcode/): LLM specialized to code generation.
- [Quantized LLaMA](./candle-examples/examples/quantized/): quantized version of
  the LLaMA model using the same quantization techniques as
@@ -149,6 +151,7 @@ If you have an addition to this list, please submit a pull request.
- Falcon.
- StarCoder.
- Phi v1.5.
- Mistral 7b v0.1.
- T5.
- Bert.
- Whisper (multi-lingual support).
candle-examples/examples/mistral/README.md (new file, 37 lines)
@@ -0,0 +1,37 @@
# candle-mistral: 7b LLM with Apache 2.0 license weights

- [Announcement Blog Post](https://mistral.ai/news/announcing-mistral-7b/).
- [Model card](https://huggingface.co/mistralai/Mistral-7B-v0.1) on the
  HuggingFace Hub.

```bash
$ cargo run --example mistral --release --features cuda -- --prompt 'Write helloworld code in Rust' --sample-len 150

Generated text:
Write helloworld code in Rust
=============================

This is a simple example of how to write "Hello, world!" program in Rust.

## Compile and run

``bash
$ cargo build --release
Compiling hello-world v0.1.0 (/home/user/rust/hello-world)
Finished release [optimized] target(s) in 0.26s
$ ./target/release/hello-world
Hello, world!
``

## Source code

``rust
fn main() {
    println!("Hello, world!");
}
``

## License

This example is released under the terms
```
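The generated sample quoted above ends with a plain hello-world program. For reference, here is that program as a self-contained, runnable sketch; the `greeting` helper is an addition for illustration (so the string can be reused), not part of the model output:

```rust
// Minimal "Hello, world!" program, matching the sample generation above.
// `greeting` is a hypothetical helper added here for illustration only.
fn greeting() -> &'static str {
    "Hello, world!"
}

fn main() {
    // Prints the same line shown in the sample's `cargo run` output.
    println!("{}", greeting());
}
```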