Does LM Studio support the 'mllama' architecture?
LM Studio 0.3.5
I tried to load this model in LM Studio, but it kept failing with a "Failed to load model" error saying:
"llama.cpp error: 'error loading model architecture: unknown model architecture: 'mllama''"
I'm wondering whether LM Studio supports 'mllama' at all?
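That error comes from the `general.architecture` string stored in the GGUF file's metadata, so you can confirm what architecture the file declares without any inference runtime. Below is a minimal sketch that parses just the start of the GGUF header by hand; it assumes `general.architecture` is the first key-value pair, which is the convention of llama.cpp's GGUF writers but not guaranteed by the format itself:

```python
import struct

def gguf_architecture(path):
    """Read the general.architecture string from a GGUF file.

    Minimal sketch: only inspects the FIRST metadata key-value pair,
    assuming the writer put general.architecture first (llama.cpp's
    convention). All GGUF integers are little-endian.
    """
    with open(path, "rb") as f:
        if f.read(4) != b"GGUF":
            raise ValueError("not a GGUF file")
        (version,) = struct.unpack("<I", f.read(4))
        tensor_count, kv_count = struct.unpack("<QQ", f.read(16))
        if kv_count < 1:
            raise ValueError("no metadata key-value pairs")
        # Key: uint64 length prefix followed by UTF-8 bytes
        (key_len,) = struct.unpack("<Q", f.read(8))
        key = f.read(key_len).decode("utf-8")
        (vtype,) = struct.unpack("<I", f.read(4))
        if key != "general.architecture" or vtype != 8:  # 8 = GGUF string type
            raise ValueError("first metadata key is not general.architecture")
        (val_len,) = struct.unpack("<Q", f.read(8))
        return f.read(val_len).decode("utf-8")
```

If this prints `mllama` for your file, the model is declared as an architecture that stock llama.cpp (and therefore LM Studio's GGUF engine) doesn't recognize, which matches the error above.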
it supports it with MLX, but not GGUF
ollama supports the GGUF version, though. Isn't ollama just wrapping llama.cpp here?
nah, ollama diverged for vision a couple of months back; they added their own way to use vision adapters and didn't upstream it, unfortunately :( nice for them to use, but yeah, llama.cpp remains without support for this
Well that’s just sad. 😔
@sdalemorrey if you're interested in the progress for this one, it might be a good idea to follow this PR: https://github.com/ggerganov/llama.cpp/pull/11292