Add chat_template to tokenizer_config

#58
by alexmarques - opened

This fix allows certain LLM evaluation harnesses (e.g., lm-evaluation-harness) to use the tokenizer instead of the processor to tokenize the inputs.
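
For context, a minimal sketch of what this enables (the repo id below is a placeholder, not the actual model repository): with `chat_template` present in `tokenizer_config.json`, a harness can render the prompt from the tokenizer alone, without loading the image processor.

```python
from transformers import AutoTokenizer

# Placeholder repo id for illustration; substitute the actual model repo.
tokenizer = AutoTokenizer.from_pretrained("org/model-3.1-instruct")

messages = [
    {"role": "user", "content": "What is the capital of France?"},
]

# Succeeds only if tokenizer_config.json defines chat_template;
# otherwise apply_chat_template raises an error.
prompt = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
)
print(prompt)
```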

The 3.1 model supports image inputs, which needs to be accounted for in the chat template: messages may carry structured (list-style) content containing image entries rather than a plain string, as sketched below.
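
A rough sketch of how a template can branch on image content. The role markers, the `<image>` placeholder token, and the overall layout here are assumptions for illustration, not the exact template added by this PR.

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("org/model-3.1-instruct")  # placeholder id

# Illustrative Jinja template: string content is emitted as-is; list-style
# content is walked item by item, inserting an image placeholder for images.
chat_template = (
    "{% for message in messages %}"
    "{{ '<|' + message['role'] + '|>' }}"
    "{% if message['content'] is string %}"
    "{{ message['content'] }}"
    "{% else %}"
    "{% for item in message['content'] %}"
    "{% if item['type'] == 'image' %}{{ '<image>' }}"
    "{% elif item['type'] == 'text' %}{{ item['text'] }}"
    "{% endif %}{% endfor %}"
    "{% endif %}{{ '<|end|>' }}"
    "{% endfor %}"
    "{% if add_generation_prompt %}{{ '<|assistant|>' }}{% endif %}"
)

tokenizer.chat_template = chat_template
# save_pretrained persists the template into tokenizer_config.json
tokenizer.save_pretrained("path/to/checkpoint")
```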
