Add a repeat penalty to the llama2-c command line example. (#713)

* Add a repeat penalty to the llama2-c command line example.

* Another fix attempt.
Laurent Mazare
2023-09-01 21:38:58 +02:00
committed by GitHub
parent 4d56cef583
commit 2c1df6bba1
2 changed files with 19 additions and 1 deletion


@@ -25,7 +25,7 @@ impl Model {
         candle_transformers::utils::apply_repeat_penalty(
             &logits,
             self.repeat_penalty,
-            &tokens[start_at..],
+            &self.tokens[start_at..],
         )?
     };
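The fix above points the penalty at `self.tokens` (the tokens generated so far) rather than an unrelated local. The effect of a repeat penalty can be sketched with a minimal standalone version operating on a plain logit slice; this is an illustrative approximation of the llama.cpp-style penalty (divide positive logits, multiply negative ones), not candle's actual implementation, which works on `Tensor` values:

```rust
// Hypothetical standalone sketch of a repeat penalty; the real
// candle_transformers::utils::apply_repeat_penalty takes a Tensor.
fn apply_repeat_penalty(logits: &mut [f32], penalty: f32, tokens: &[u32]) {
    for &t in tokens {
        let t = t as usize;
        if t < logits.len() {
            let l = logits[t];
            // Make already-generated tokens less likely: shrink positive
            // logits, push negative logits further down.
            logits[t] = if l >= 0.0 { l / penalty } else { l * penalty };
        }
    }
}

fn main() {
    let mut logits = vec![2.0f32, -1.0, 0.5];
    // Penalize tokens 0 and 1, which were generated earlier.
    apply_repeat_penalty(&mut logits, 2.0, &[0, 1]);
    println!("{:?}", logits);
}
```

With `penalty = 2.0`, token 0's logit drops from 2.0 to 1.0 and token 1's from -1.0 to -2.0, while token 2 is untouched, which is why only the window of previously generated tokens (`&self.tokens[start_at..]`) is passed in.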