runtime error

Exit code: 1. Reason: both checkpoint shards downloaded successfully (pytorch_model-00001-of-00002.bin, 4.97G; pytorch_model-00002-of-00002.bin, 1.46G), then model loading failed:

Traceback (most recent call last):
  File "/home/user/app/app.py", line 11, in <module>
    model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=dtype).to(device)  # Ensure model is on the correct device
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 571, in from_pretrained
    return model_class.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 279, in _wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 4333, in from_pretrained
    model_init_context = cls.get_init_context(is_quantized, _is_ds_init_called)
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3736, in get_init_context
    init_contexts = [no_init_weights(), init_empty_weights()]
NameError: name 'init_empty_weights' is not defined
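The likely cause: `init_empty_weights` is defined in the `accelerate` package, and `transformers` only imports it when `accelerate` is importable. If `accelerate` is missing (or too old) in the container, the name is never bound, which surfaces later as this `NameError` instead of a clear `ImportError`. A minimal pre-flight check, assuming the fix is to add `accelerate` to the environment (e.g. to `requirements.txt` on a Space):

```python
import importlib.util

def has_accelerate() -> bool:
    # init_empty_weights lives in the `accelerate` package; transformers
    # imports it conditionally, so its absence shows up as a NameError
    # deep inside from_pretrained rather than at import time.
    return importlib.util.find_spec("accelerate") is not None

if not has_accelerate():
    print("`accelerate` is not installed: add `accelerate` to "
          "requirements.txt or run `pip install accelerate`")
```

Running this at app startup turns the confusing late failure into an actionable message before any multi-gigabyte checkpoint download begins.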
