How to load model in docker offline
#43 · by YTGao · opened
Hi, I want to load starcoder offline on my server because it's too large to download there. When I run

docker run -p 8080:80 -v $PWD/data:/data -e HUGGING_FACE_HUB_TOKEN=<YOUR BIGCODE ENABLED TOKEN> -it ghcr.io/huggingface/text-generation-inference:latest --model-id bigcode/starcoder --max-total-tokens 8192

it always downloads starcoder. There doesn't seem to be a parameter that lets me load the model offline. How can I fix this? Thanks.
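One possible approach is a two-step sketch: cache the weights once on a machine with internet access, then reuse that cache on the offline server. This assumes the text-generation-inference image keeps its Hub cache under the mounted /data volume and that its download path honors the standard huggingface_hub HF_HUB_OFFLINE environment variable; both are assumptions worth verifying against the TGI docs for your image version.

```bash
# Step 1 (online machine): run the container once so the weights land in the
# mounted ./data directory, which the image uses as its Hub cache.
docker run -p 8080:80 -v $PWD/data:/data \
  -e HUGGING_FACE_HUB_TOKEN=<YOUR BIGCODE ENABLED TOKEN> \
  -it ghcr.io/huggingface/text-generation-inference:latest \
  --model-id bigcode/starcoder --max-total-tokens 8192

# Step 2 (offline server): copy the populated ./data directory over, then start
# the container with HF_HUB_OFFLINE=1 so huggingface_hub reads only the local
# cache and skips any call to the Hub. (Assumption: the image forwards this
# env var to its download step.)
docker run -p 8080:80 -v $PWD/data:/data \
  -e HF_HUB_OFFLINE=1 \
  -it ghcr.io/huggingface/text-generation-inference:latest \
  --model-id bigcode/starcoder --max-total-tokens 8192
```

If the image version in use does not respect HF_HUB_OFFLINE, pre-populating /data with the full model snapshot and keeping the same --model-id should still avoid a re-download, since the cached files are found before any network fetch is attempted.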
YTGao changed discussion status to closed