output issue

#14
by mobo68 - opened

Hi,

I deployed the model using the Docker image of vLLM 0.9.1 with the following parameters:

--tokenizer-mode mistral --config-format mistral --load-format mistral --tool-call-parser mistral --enable-auto-tool-choice
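For context, the full launch command looks roughly like this (the model name, GPU flags, and port are placeholders for my actual setup):

```shell
# Sketch of the deployment, using the official vLLM OpenAI-compatible image.
# Replace <model-id> with the actual model repository being served.
docker run --gpus all -p 8000:8000 \
  vllm/vllm-openai:v0.9.1 \
  --model <model-id> \
  --tokenizer-mode mistral \
  --config-format mistral \
  --load-format mistral \
  --tool-call-parser mistral \
  --enable-auto-tool-choice
```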

However, I'm getting some strange characters in the output, such as Chinese characters or gibberish words.

Am I missing something?
Thanks!

It looks like this happens in all languages, regardless of the UI or script.
If you use the model in a language other than English, random English and Chinese words get mixed into the response from time to time...

I'm not sure if this is related, but maybe try building vLLM from source:

https://github.com/vllm-project/vllm/pull/19533

This PR is not included in the 0.9.1 release.
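If you want to try it, a rough sketch of building vLLM from source is below (assumes a working CUDA toolchain and Python environment; build steps may differ for your setup):

```shell
# Clone the repository and install from source so the unreleased fix is included.
git clone https://github.com/vllm-project/vllm.git
cd vllm
# Editable install builds vLLM from the current checkout.
pip install -e .
```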
