Missing chat_template when using VLLM to serve the model
#5 by leihhami - opened
vLLM can serve the model, but any query returns the following error:
ValueError: As of transformers v4.44, default chat template is no longer allowed, so you must provide a chat template if the tokenizer does not define one.
I tried some of the chat templates from the vLLM examples (https://github.com/vllm-project/vllm/tree/main/examples), but they do not work either.
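One workaround I'd expect to help is attaching a chat template to the tokenizer yourself and saving it, so vLLM finds it in tokenizer_config.json. A minimal sketch, assuming a ChatML-style format (the model ID and the template below are placeholders, not necessarily what this model was trained with):

```python
# Sketch: attach a chat template to the tokenizer so vLLM can find one.
# The ChatML-style template here is illustrative; replace it with the
# format the model was actually fine-tuned on.
from transformers import AutoTokenizer

MODEL_ID = "your-org/your-model"  # hypothetical placeholder

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)

# Minimal ChatML-style Jinja template (assumption, not the official one).
tokenizer.chat_template = (
    "{% for message in messages %}"
    "{{ '<|im_start|>' + message['role'] + '\n' + message['content'] + '<|im_end|>\n' }}"
    "{% endfor %}"
    "{% if add_generation_prompt %}{{ '<|im_start|>assistant\n' }}{% endif %}"
)

# Quick check that the template renders without errors.
messages = [{"role": "user", "content": "Hello"}]
print(tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True))

# Save locally and point vLLM at this directory; the template is written
# into tokenizer_config.json.
tokenizer.save_pretrained("./model-with-template")
```

Alternatively, if I remember correctly, the vLLM OpenAI-compatible server also accepts a --chat-template argument pointing at a Jinja template file, which avoids modifying the tokenizer at all. Either way, the template has to match the model's actual training format to get sensible outputs.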