AttributeError: type object 'Qwen2_5OmniConfig' has no attribute 'thinker_config'

#46
by mstachow - opened

I'm using the sample code provided and, as far as I know, I have installed all of the required libraries, but I keep getting this error:
mstachow@ece-nebula04:~/.cache/huggingface/hub/models--Qwen--Qwen2.5-Omni-7B/snapshots/08f233e162d7b5042d4c15fe3702ef1a9fe2ea68$ /usr/bin/python3 /home/mstachow/qwen_omni/omni.py
Unrecognized keys in rope_scaling for 'rope_type'='default': {'mrope_section'}
Traceback (most recent call last):
  File "/home/mstachow/qwen_omni/omni.py", line 10, in <module>
    model = Qwen2_5OmniForConditionalGeneration.from_pretrained("Qwen/Qwen2.5-Omni-7B", torch_dtype="auto", device_map="auto")
  File "/usr/local/lib/python3.10/dist-packages/transformers/models/qwen2_5_omni/modeling_qwen2_5_omni.py", line 4425, in from_pretrained
    model = super().from_pretrained(
  File "/usr/local/lib/python3.10/dist-packages/transformers/modeling_utils.py", line 282, in _wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/transformers/modeling_utils.py", line 4413, in from_pretrained
    model = cls(config, *model_args, **model_kwargs)
  File "/usr/local/lib/python3.10/dist-packages/transformers/models/qwen2_5_omni/modeling_qwen2_5_omni.py", line 4381, in __init__
    super().__init__(config)
  File "/usr/local/lib/python3.10/dist-packages/transformers/modeling_utils.py", line 1867, in __init__
    self.generation_config = GenerationConfig.from_model_config(config) if self.can_generate() else None
  File "/usr/local/lib/python3.10/dist-packages/transformers/generation/configuration_utils.py", line 1290, in from_model_config
    decoder_config = model_config.get_text_config(decoder=True)
  File "/usr/local/lib/python3.10/dist-packages/transformers/models/qwen2_5_omni/configuration_qwen2_5_omni.py", line 1061, in get_text_config
    return self.thinker_config.get_text_config()
AttributeError: type object 'Qwen2_5OmniConfig' has no attribute 'thinker_config'

This is with a standard transformers install, I believe, nothing special. Any ideas how to fix it?
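For reference, here is a minimal sketch of what my script boils down to, reconstructed from the traceback above (not the full sample script):

```python
# Minimal reproduction sketch, assumed from the traceback above.
from transformers import Qwen2_5OmniForConditionalGeneration

# This call raises the AttributeError on affected transformers versions.
# device_map="auto" requires accelerate to be installed.
model = Qwen2_5OmniForConditionalGeneration.from_pretrained(
    "Qwen/Qwen2.5-Omni-7B",
    torch_dtype="auto",
    device_map="auto",
)
```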

This PR should fix it; you can also patch a local installation of the transformers library to resolve the issue (one way is sketched below): https://github.com/huggingface/transformers/pull/37690
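A minimal sketch of one way to pick up the patched code before it lands in a release, assuming the fix is on the PR branch referenced above (if the PR has already been merged, installing from main works as well):

```bash
# Install transformers with the fix from PR #37690 (sketch; adjust if the PR is already merged).
git clone https://github.com/huggingface/transformers.git
cd transformers
git fetch origin pull/37690/head:pr-37690   # fetch the PR branch from GitHub
git checkout pr-37690
pip install -e .                            # editable install replaces the installed package
```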

Confirmed, thanks!

mstachow changed discussion status to closed
