Inference endpoint not working

#17
by ArunSharma93 - opened

When deploying an inference endpoint for this model, I get the following error:

ValueError: The checkpoint you are trying to load has model type smolvlm but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.

Hugging Face TB Research org

> or because your version of Transformers is out of date.

You can install the required version with:

pip install git+https://github.com/huggingface/transformers@v4.49.0-SmolVLM-2

See the release notes for more information: https://github.com/huggingface/transformers/releases/tag/v4.49.0-SmolVLM-2
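
A quick way to confirm the updated build is active (the repository id below is a placeholder for this model's actual id):

```python
# Sanity check: verify the installed Transformers build and confirm it
# recognizes the "smolvlm" model type. The repo id is a placeholder.
import transformers
from transformers import AutoConfig

print(transformers.__version__)

# On an outdated install this raises the same ValueError as above;
# on the v4.49.0-SmolVLM-2 build the config loads successfully.
config = AutoConfig.from_pretrained("HuggingFaceTB/SmolVLM2-2.2B-Instruct")
print(config.model_type)  # expected: "smolvlm"
```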

How do I do this for an Inference Endpoint? From what I can see, there is no option to manually install packages (e.g., upgrade Transformers); I can only deploy the endpoint and use it as-is.
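
For anyone else hitting this: one possible route (a sketch based on the custom-handler workflow for Inference Endpoints, not something confirmed in this thread) is to add a handler.py plus a requirements.txt that pins the Transformers build to the model repository, so the endpoint installs that dependency at startup:

```python
# handler.py -- rough sketch of a custom Inference Endpoints handler.
# Assumes the model repository also contains a requirements.txt with:
#   git+https://github.com/huggingface/transformers@v4.49.0-SmolVLM-2
# so the endpoint installs a SmolVLM-aware Transformers build on startup.
from typing import Any, Dict

from transformers import AutoProcessor, AutoModelForImageTextToText


class EndpointHandler:
    def __init__(self, path: str = ""):
        # 'path' is the local directory of the deployed model repository.
        self.processor = AutoProcessor.from_pretrained(path)
        self.model = AutoModelForImageTextToText.from_pretrained(path)

    def __call__(self, data: Dict[str, Any]) -> Dict[str, Any]:
        # Minimal text-only path; real SmolVLM requests would also pass
        # image inputs through the processor.
        prompt = data.get("inputs", "")
        inputs = self.processor(text=prompt, return_tensors="pt")
        output_ids = self.model.generate(**inputs, max_new_tokens=128)
        text = self.processor.batch_decode(output_ids, skip_special_tokens=True)[0]
        return {"generated_text": text}
```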
