Tags: Text Generation · Transformers · Safetensors · gpt2 · text-generation-inference · 4-bit precision · gptq

TGI: gptq quantization is not supported for AutoModel

#1 opened by 4639-94d6
This comment has been hidden
4639-94d6 changed discussion status to closed
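
Since the original comment was hidden before the discussion was closed, only the title remains. In TGI, an error of this form typically appears when the model architecture falls back to the generic AutoModel code path, which does not support GPTQ quantization. As a point of reference only, a GPTQ-quantized gpt2 checkpoint can usually be loaded directly with Transformers instead; the repo id below is a placeholder, and the sketch assumes `optimum` and `auto-gptq` are installed alongside Transformers.

```python
# Minimal sketch: load a GPTQ-quantized gpt2 checkpoint with Transformers.
# "user/gpt2-gptq-4bit" is a hypothetical repo id, not this repository's name.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "user/gpt2-gptq-4bit"  # placeholder repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
# GPTQ weights are dequantized on the fly; requires optimum + auto-gptq.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

inputs = tokenizer("Hello, my name is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```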
