[Bug]: DeepSeek-V3.2 As of transformers v4.44, default chat template is no longer allowed, so you must provide a chat template if the tokenizer does not define one #29849

@deepblacksky

Description

Your current environment

vLLM: v0.11.2

🐛 Describe the bug

I used vLLM to deploy the latest DeepSeek-V3.2 model. The deployment was successful:

vllm serve /workspace/LLM_Weights/deepseek-ai/DeepSeek-V3.2 --port 40001 --host 0.0.0.0 --trust-remote-code --served-model-name deepseek-v3.2-new --max-model-len 131072 --gpu-memory-utilization 0.95 -dp 8 --enable-expert-parallel --max-num-seqs 256


However, an error occurred when calling the server:

ValueError: As of transformers v4.44, default chat template is no longer allowed, so you must provide a chat template if the tokenizer does not define one.
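This error typically means the tokenizer files shipped with the checkpoint define no chat template (no chat_template entry in tokenizer_config.json and no separate chat_template.jinja), and since transformers v4.44 there is no built-in fallback template for chat-style requests. A quick way to confirm, assuming the same model path as in the serve command above, is to load the tokenizer with transformers and inspect its chat_template attribute:

python -c "from transformers import AutoTokenizer; tok = AutoTokenizer.from_pretrained('/workspace/LLM_Weights/deepseek-ai/DeepSeek-V3.2', trust_remote_code=True); print(tok.chat_template is not None)"

If this prints False, the tokenizer carries no template, so vLLM cannot render /v1/chat/completions requests for this checkpoint.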

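As a possible workaround, the template can be supplied explicitly via vLLM's --chat-template option at server start. The sketch below assumes you have a Jinja chat template file for DeepSeek-V3.2 (for example, one extracted from the official release); the template path is a placeholder, not a file that ships with vLLM:

vllm serve /workspace/LLM_Weights/deepseek-ai/DeepSeek-V3.2 --chat-template /path/to/deepseek_v32_chat_template.jinja --port 40001 --host 0.0.0.0 --trust-remote-code --served-model-name deepseek-v3.2-new --max-model-len 131072 --gpu-memory-utilization 0.95 -dp 8 --enable-expert-parallel --max-num-seqs 256

vLLM's OpenAI-compatible server also accepts a chat_template field in the request body, but passing the template once at startup is simpler than attaching it to every request.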

Before submitting a new issue...

  • Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.
