ImportError: cannot import name 'Gemma3ForConditionalGeneration' from 'transformers' (/usr/local/lib/python3.10/site-packages/transformers/__init__.py)
I'm trying to use the gemma-3-12b-it model on a Hugging Face ZeroGPU Space but running into this error:
ImportError: cannot import name 'Gemma3ForConditionalGeneration' from 'transformers' (/usr/local/lib/python3.10/site-packages/transformers/__init__.py)
I have 'transformers==4.49.0' in my requirements.txt.
If anyone knows how to solve this error, it would be a great help. Thanks.
I think the standard 4.49.0 release doesn't include it. There's a "special release" of 4.49.0 (the v4.49.0-Gemma-3 tag) that also covers the Gemma 3 related changes.
Run the following command:
pip install git+https://github.com/huggingface/[email protected]
Or add this to the requirements.txt:
transformers @ git+https://github.com/huggingface/transformers@46350f5eae87ac1d168ddfdc57a0b39b64b9a029
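As a quick sanity check, the import only works when the installed version sorts after the 4.49.0 release. A minimal sketch of that comparison (pure stdlib; `release_tuple` is a hypothetical helper, not a transformers API, and it ignores `.dev` suffixes rather than implementing full PEP 440):

```python
import re

def release_tuple(version: str) -> tuple:
    """Extract the numeric release part of a version string,
    ignoring suffixes like '.dev0' (simplified, not full PEP 440)."""
    match = re.match(r"(\d+)\.(\d+)(?:\.(\d+))?", version)
    return tuple(int(part) for part in match.groups() if part is not None)

# Gemma 3 support only exists in builds newer than the 4.49.0 release:
print(release_tuple("4.49.0") < release_tuple("4.50.0.dev0"))  # True
```

To check your own environment, compare `release_tuple(transformers.__version__)` against `(4, 50, 0)`.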
Hi @aryan835-datainflexion, to use Gemma 3 models you need the development version of the Transformers library (4.50.0.dev0). You can install it directly from the GitHub tag with `pip install git+https://github.com/huggingface/[email protected]`, as mentioned [here](https://huggingface.co/google/shieldgemma-2-4b-it#usage). Kindly try it and let me know if you have any concerns.
Thank you.
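If you want a clearer failure than the raw ImportError, you can check for the class before using it. A small sketch (the `require_attr` helper is hypothetical, demonstrated on a stdlib module so it runs anywhere; with transformers installed you'd call `require_attr("transformers", "Gemma3ForConditionalGeneration", "install the v4.49.0-Gemma-3 tag")`):

```python
import importlib

def require_attr(module: str, attr: str, hint: str) -> None:
    """Raise a readable ImportError, with an install hint, when
    `attr` is missing from `module` (hypothetical helper)."""
    mod = importlib.import_module(module)
    if not hasattr(mod, attr):
        raise ImportError(f"{module!r} has no {attr!r}. {hint}")

# Demonstrated on the stdlib so the sketch is runnable anywhere:
require_attr("math", "sqrt", "upgrade Python")  # present: no error
try:
    require_attr("math", "no_such_fn", "install a newer build")
except ImportError as err:
    print(err)  # 'math' has no 'no_such_fn'. install a newer build
```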
I'm getting the following error when I try this. Has anyone been able to resolve it?
Defaulting to user installation because normal site-packages is not writeable
Collecting git+https://github.com/huggingface/[email protected]
Cloning https://github.com/huggingface/transformers (to revision v4.49.0-Gemma-3) to /tmp/pip-req-build-s9shtbh9
Running command git clone --filter=blob:none --quiet https://github.com/huggingface/transformers /tmp/pip-req-build-s9shtbh9
Running command git checkout -q 1c0f782fe5f983727ff245c4c1b3906f9b99eec2
Resolved https://github.com/huggingface/transformers to commit 1c0f782fe5f983727ff245c4c1b3906f9b99eec2
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... done
Requirement already satisfied: filelock in /home/ayildiz/.local/lib/python3.8/site-packages (from transformers==4.50.0.dev0) (3.16.1)
Requirement already satisfied: huggingface-hub<1.0,>=0.26.0 in /home/ayildiz/.local/lib/python3.8/site-packages (from transformers==4.50.0.dev0) (0.28.1)
Requirement already satisfied: numpy>=1.17 in /home/ayildiz/.local/lib/python3.8/site-packages (from transformers==4.50.0.dev0) (1.24.4)
Requirement already satisfied: packaging>=20.0 in /home/ayildiz/.local/lib/python3.8/site-packages (from transformers==4.50.0.dev0) (24.2)
Requirement already satisfied: pyyaml>=5.1 in /usr/lib/python3/dist-packages (from transformers==4.50.0.dev0) (5.3.1)
Requirement already satisfied: regex!=2019.12.17 in /home/ayildiz/.local/lib/python3.8/site-packages (from transformers==4.50.0.dev0) (2024.11.6)
Requirement already satisfied: requests in /usr/lib/python3/dist-packages (from transformers==4.50.0.dev0) (2.22.0)
Collecting tokenizers<0.22,>=0.21 (from transformers==4.50.0.dev0)
Using cached tokenizers-0.21.0.tar.gz (343 kB)
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... error
error: subprocess-exited-with-error
× Preparing metadata (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [6 lines of output]
Cargo, the Rust package manager, is not installed or is not on PATH.
This package requires Rust and Cargo to compile extensions. Install it through
the system's package manager or via https://rustup.rs/
Checking for Rust toolchain....
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed
× Encountered error while generating package metadata.
╰─> See above for output.
note: This is an issue with the package mentioned above, not pip.
hint: See above for details.
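The Cargo failure isn't caused by the transformers pin itself: the log paths show Python 3.8, and (as far as I can tell) tokenizers 0.21.0 only publishes prebuilt wheels for Python 3.9+, so on 3.8 pip falls back to the source tarball and tries to compile the Rust extension. Upgrading Python, or installing Rust via rustup, should avoid the error. A toy sketch of the wheel-selection logic pip is effectively applying (the 3.9 cutoff is my assumption, and `wheel_available` is illustrative, not a pip API):

```python
def wheel_available(py_version: tuple, min_wheel_py: tuple = (3, 9)) -> bool:
    """Toy model: a prebuilt tokenizers 0.21.0 wheel only matches
    interpreters at or above the assumed 3.9 cutoff."""
    return py_version >= min_wheel_py

print(wheel_available((3, 8)))   # False -> sdist build, Cargo required
print(wheel_available((3, 10)))  # True  -> binary wheel, no Rust needed
```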
Thank you @graftim and @lkv ! I've added "transformers @ git+https://github.com/huggingface/transformers@46350f5eae87ac1d168ddfdc57a0b39b64b9a029" in my requirements.txt which seems to have fixed the issue.