How to update the prompt template in llama.cpp
I'm a newbie with llama.cpp.
This was made to be used with ik_llama.cpp, not llama.cpp.
I did try changing the template with gguf_new_metadata.py, but then running the CLI with -cnv produced gibberish instead of just not stopping.
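For reference, this is roughly what I ran (file names are placeholders, and the exact flags can vary between versions; gguf_new_metadata.py lives under gguf-py in the llama.cpp repo):

```sh
# Rewrite the tokenizer.chat_template metadata into a new GGUF
# (model.gguf, model-fixed.gguf and chat_template.jinja are placeholders):
python gguf_new_metadata.py model.gguf model-fixed.gguf \
    --chat-template "$(cat chat_template.jinja)"

# The test that then gave me gibberish (binary name may differ by build):
./llama-cli -m model-fixed.gguf -cnv
```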
Either way, it works via llama-server using the right template in your front end.
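For example, something like this (an untested sketch; --chat-template support and the set of built-in template names vary between llama.cpp and ik_llama.cpp builds):

```sh
# Serve the model as-is; the embedded template doesn't matter if your
# front end formats prompts itself:
./llama-server -m model.gguf --port 8080

# Some builds can also override the template server-side with a
# built-in template name, e.g. chatml:
./llama-server -m model.gguf --port 8080 --chat-template chatml
```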
Yeah, I can confirm it works (really) well on ik's fork (28 t/s text generation, 87 t/s prompt processing on my SBC).
Can one use the GGUF editor for this?
> Yeah, I can confirm it works (really) well on ik's fork (28 t/s text generation, 87 t/s prompt processing on my SBC).
Thanks for the confirmation.
> Can one use the GGUF editor for this?
You can try. Like I said, every attempt I made at changing the prompt template made it behave worse in the CLI (I only use the CLI for testing; I use llama-server for actual use), but it works fine with the right template via llama-server.
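As a concrete sketch of what "the right template in your front end" means (ChatML is used purely as an example here; substitute whatever this model actually expects):

```sh
# The front end applies the template itself and sends the already
# formatted prompt to llama-server's raw /completion endpoint:
curl http://localhost:8080/completion -d '{
  "prompt": "<|im_start|>user\nHello!<|im_end|>\n<|im_start|>assistant\n",
  "n_predict": 128
}'
```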