just a question about params

#17
by xubin-bruce - opened

The name Qwen2.5-Omni-7B suggests a 7B model, but the model size shown on Hugging Face is 10.7B, and the total size of the safetensors is about 22 GB. How do the 7B and 10.7B figures relate?

Is it 4B of parameters in fp32 and 3B in bf16?

I have the same question.

==================================================
Model type               : qwen2_5_omni
Total param num          : 10.732225408 B
==================================================
model.thinker            : 8.931813888 B
model.talker             : 1.351360256 B
model.token2wav          : 0.449051264 B
model.thinker.audio_tower        : 0.639647232 B
model.thinker.visual             : 0.676550144 B
model.thinker.model              : 7.070619136 B
model.thinker.lm_head            : 0.544997376 B

As the parameter breakdown above shows, the 7B in the name refers to just the LLM part, i.e. the Qwen2.5 backbone (model.thinker.model plus the LM head). Adding the multimodal encoders (audio tower, visual) and the speech generator (talker, token2wav) brings the total to about 10.7B. That also accounts for the ~22 GB of safetensors: 10.7B parameters stored in 16-bit precision (bf16, 2 bytes each) comes to roughly 21.5 GB.
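For reference, a breakdown like the one above can be reproduced with a short PyTorch snippet. This is only a minimal sketch, not the script that produced the table: it assumes a transformers release that ships the Qwen2_5OmniForConditionalGeneration class, and it uses accelerate's init_empty_weights so that the ~22 GB of weights never need to be downloaded.

```python
# Minimal sketch (not the original script): per-submodule parameter counts.
# Assumes transformers with Qwen2.5-Omni support and accelerate are installed.
from accelerate import init_empty_weights
from transformers import AutoConfig, Qwen2_5OmniForConditionalGeneration

repo_id = "Qwen/Qwen2.5-Omni-7B"
config = AutoConfig.from_pretrained(repo_id)

# Build the architecture on the "meta" device: tensor shapes are known,
# but no weights are materialized or downloaded.
with init_empty_weights():
    model = Qwen2_5OmniForConditionalGeneration(config)

def count_params(module):
    return sum(p.numel() for p in module.parameters())

print(f"Total param num : {count_params(model) / 1e9:.3f} B")
for name, child in model.named_children():  # e.g. thinker, talker, token2wav
    print(f"model.{name:<16}: {count_params(child) / 1e9:.3f} B")
```

Nested modules such as model.thinker.audio_tower can be inspected the same way by iterating named_children() one level deeper.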

Thanks very much. Could I see this breakdown directly anywhere?

As far as I know, there is no easy way to get per-module parameter info through visual tools or the web UI. The info above was printed by my own script; I will try to open-source it this weekend.
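One thing that can be read off directly is the total (though not per-module) count: the "10.7B params" figure on the model page comes from the Hub's safetensors metadata, which can also be fetched programmatically. A rough sketch, assuming a recent huggingface_hub release that exposes this metadata on ModelInfo (field names may differ between versions):

```python
# Sketch: read the total parameter count from the Hub's safetensors metadata,
# without downloading any weights. Assumes huggingface_hub exposes a
# `safetensors` field on ModelInfo (recent releases); treat this as an assumption.
from huggingface_hub import HfApi

info = HfApi().model_info("Qwen/Qwen2.5-Omni-7B")
if getattr(info, "safetensors", None) is not None:
    # `parameters` maps dtype -> count, `total` is the overall parameter count.
    print(info.safetensors.parameters)
    print(f"total: {info.safetensors.total / 1e9:.2f} B params")
else:
    print("No safetensors metadata returned for this repo.")
```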

For anyone who is interested in this script, it's located in display_automodel_params.py.
