Replies: 2 comments · 1 reply
-
Thank you, I have reinstalled several versions of xformers, torch, CUDA...
-
PS: what I do is avoid this ERROR: "forward_xformers() got an unexpected keyword argument 'encoder_hidden_states'" with diffusers==0.13 on Linux. Obviously it is a workaround, not a real solution to the problem.
-
When I run this code on Linux with xformers, it reports the ERROR: "forward_xformers() got an unexpected keyword argument 'encoder_hidden_states'".
I found a way to avoid it. It is not elegant, but it works with diffusers==0.13.0 (python==3.9, torch==1.13.1, xformers==0.0.16rc425):
1. Remove this line: "diffusers.models.attention.CrossAttention.forward = forward_xformers"
2. Open site-packages/diffusers/models/attention.py, find "class BasicTransformerBlock", and add "self.attn1.set_use_memory_efficient_attention_xformers(True)" right after "self.attn1 = CrossAttention(***)" (the same flag can also be flipped at runtime, as sketched below).
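For reference, the same effect can be had without editing site-packages, by flipping the flag at runtime instead. This is a minimal sketch, assuming diffusers==0.13.0 (where CrossAttention is importable from diffusers.models.cross_attention and exposes set_use_memory_efficient_attention_xformers); the model ID is only a placeholder:

```python
# Sketch: enable xformers memory-efficient attention at runtime
# (assumes diffusers==0.13.0; the model ID is a placeholder).
from diffusers import StableDiffusionPipeline
from diffusers.models.cross_attention import CrossAttention

pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")

# Walk the UNet and switch every CrossAttention module to xformers'
# memory-efficient attention. This is the same call the manual edit
# adds inside BasicTransformerBlock.__init__, and it replaces the
# global monkey-patch of CrossAttention.forward, which fails because
# diffusers 0.13 passes the encoder_hidden_states keyword to forward().
for module in pipe.unet.modules():
    if isinstance(module, CrossAttention):
        module.set_use_memory_efficient_attention_xformers(True)
```

Pipelines also expose pipe.enable_xformers_memory_efficient_attention(), which performs essentially this walk for you; if that call works in your environment, it is the cleaner option.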