how to update the prompt template in llama.cpp

#1
by PlatonicSkeptic - opened

I'm a newbie with llama.cpp.

This was made to be used with ik_llama.cpp, not llama.cpp.

I did try changing the template with gguf_new_metadata.py, but then running the CLI with -cnv produced gibberish instead of just failing to stop.
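
For anyone who wants to try the same thing, this is roughly the invocation I mean. Treat it as a sketch: the script's location under gguf-py and its exact flags can differ between llama.cpp checkouts, and input.gguf / output.gguf / template.jinja are placeholder names.

```sh
# Bake a new chat template into a copy of the model.
# template.jinja holds the Jinja chat template to embed.
python llama.cpp/gguf-py/scripts/gguf_new_metadata.py \
    input.gguf output.gguf \
    --chat-template "$(cat template.jinja)"

# Then test interactively; this is where I got the gibberish:
./llama-cli -m output.gguf -cnv
```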

Either way, it works via llama-server if you use the right template in your front end.
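
To be concrete about "the right template in your front end": start llama-server and let the client format the prompt itself, for example by hitting the raw /completion endpoint, so the template embedded in the GGUF is never used. A sketch only; the ChatML markers below are just an example, substitute whatever template this model actually expects, and model.gguf is a placeholder.

```sh
# Start the server (both llama.cpp and ik_llama.cpp ship llama-server).
./llama-server -m model.gguf --port 8080

# Apply the chat template client-side and send the already-formatted
# prompt to the raw completion endpoint.
curl http://localhost:8080/completion -d '{
  "prompt": "<|im_start|>user\nHello!<|im_end|>\n<|im_start|>assistant\n",
  "n_predict": 128
}'
```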

Yeah, I can confirm it works (really) well on ik's fork (28 tg / 87 pp on my SBC).
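
(tg/pp in the llama-bench sense: tg is text-generation throughput and pp is prompt-processing throughput, in tokens/s. If you want to reproduce the measurement, something like the following should do it, assuming ik's fork takes the same flags as upstream llama-bench and with model.gguf as a placeholder:)

```sh
# pp = prompt processing speed, tg = text generation speed, both in t/s.
./llama-bench -m model.gguf -p 512 -n 128
```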

Can one use the GGUF editor for this?

https://huggingface.co/spaces/CISCai/gguf-editor

Thanks for the confirmation.

You can try. Like I said, any attempt I made at changing the prompt template made it behave worse in the CLI (I only use the CLI to test this; I use llama-server for actual work), but it works fine with the right template via llama-server.
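
If it helps, you can at least verify which template is actually baked into the file before and after editing by dumping its metadata. Again just a sketch; the script's path under gguf-py varies between checkouts, and model.gguf is a placeholder.

```sh
# Print GGUF metadata and look for the tokenizer.chat_template key.
python llama.cpp/gguf-py/scripts/gguf_dump.py --no-tensors model.gguf
```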
