Is the reasoning parser going to be available in vllm soon?

#29
by dingobingobango - opened

When deploying the model with vLLM 0.11 and setting the reasoning parser to "kimi_k2", I got the following vLLM error:

usage: vllm serve [model_tag] [options]
vllm serve: error: argument --reasoning-parser: invalid choice: 'kimi_k2' (choose from 'deepseek_r1', 'glm45', 'openai_gptoss', 'granite', 'hunyuan_a13b', 'mistral', 'qwen3', 'seed_oss', 'step3')

Moonshot AI org

You need to use the vLLM nightly build, see https://blog.vllm.ai/2025/01/10/dev-experience.html .
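A minimal sketch of what that looks like, assuming the nightly wheel index described in the linked blog post and a hypothetical Kimi K2 model tag (substitute the actual model you are deploying):

```shell
# Install the vLLM nightly wheel (pre-release), per the dev-experience blog post
pip install -U vllm --pre --extra-index-url https://wheels.vllm.ai/nightly

# Confirm 'kimi_k2' now appears in the list of valid --reasoning-parser choices
vllm serve --help | grep -i reasoning

# Serve with the kimi_k2 reasoning parser (model tag here is illustrative)
vllm serve moonshotai/Kimi-K2-Thinking --reasoning-parser kimi_k2
```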

It should be available in the next release, i.e. versions > 0.11.0.
