Can flash-attn be disabled? Can the model be loaded directly with the qwen2 model class provided in the transformers library?

#28
by shizue - opened

As the title says.

Alibaba-NLP org

flash-attn cannot be disabled.
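For context on what the question is asking: for models that are natively integrated in the transformers library, the attention backend is normally chosen at load time via the `attn_implementation` argument to `from_pretrained`. A minimal sketch of that general mechanism is below. The repo id is a placeholder assumption, and whether this model's custom modeling code honors the argument is not confirmed by this thread (the answer above says it does not for flash-attn).

```python
# General transformers loading sketch -- NOT confirmed to work for this repo.
# "Alibaba-NLP/<model-id>" is a hypothetical placeholder, not the real repo id.
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "Alibaba-NLP/<model-id>",      # hypothetical repo id
    trust_remote_code=True,        # runs the custom modeling code shipped in the repo
    attn_implementation="eager",   # standard way to request the non-flash-attn backend
)
```

If the repo ships its own modeling code (`trust_remote_code=True`), that code decides whether `attn_implementation` is respected, which is why the maintainers' answer is authoritative here.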
