ImportError: cannot import name '_flash_supports_window_size' from 'transformers.modeling_flash_attention_utils'

#2 opened by XOGorKi

Any idea how to solve this issue? I started with Transformers 4.50.0, then downgraded to 4.47.0 as mentioned in the model's documentation. Later I suspected that '_flash_supports_window_size' might not have made it into any released version of Transformers, so I reinstalled Transformers directly from the GitHub main branch with "pip install git+https://github.com/huggingface/transformers.git@main", but I'm still facing the same issue.
PS: I'm using a Google Colab notebook.
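
For reference, a quick way to check whether the installed Transformers build actually exposes that name (just a diagnostic sketch, nothing model-specific):

```python
# Diagnostic sketch: print the installed transformers version and check whether
# _flash_supports_window_size is defined. Whether it exists can depend on both
# the transformers version and whether flash-attn is installed.
import transformers
import transformers.modeling_flash_attention_utils as flash_utils

print(transformers.__version__)
print(hasattr(flash_utils, "_flash_supports_window_size"))
```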

I think you need to pip install flash-attn first; then you will be able to load and run the model.
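
For example, in a Colab GPU runtime something along these lines should work once flash-attn is installed; the model id below is a placeholder for whichever checkpoint you're loading:

```python
# Install flash-attn first (needs a CUDA GPU runtime). In a notebook cell:
#   !pip install flash-attn --no-build-isolation

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-org/your-model"  # placeholder: the checkpoint that raises the ImportError

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    attn_implementation="flash_attention_2",  # requires the flash-attn package
    trust_remote_code=True,
    device_map="auto",
)
```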
