Checklist
The issue is caused by an extension, but I believe it is caused by a bug in the webui
The issue exists in the current version of the webui
The issue has not been reported before recently
The issue has been reported before but has not been fixed yet
What happened?
I just want to install xformers to generate images faster in Stable Diffusion, but I can't; no matter what I do, I can't generate anything.
Steps to reproduce the problem
Add --xformers in webui-user.sh
Generate an image
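For reference, the flag is normally passed through the COMMANDLINE_ARGS variable in the stock AUTOMATIC1111 webui-user.sh (adjust if your setup differs):

```bash
# webui-user.sh -- pass --xformers to the web UI on launch
export COMMANDLINE_ARGS="--xformers"
```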
What should have happened?
xFormers should be built with CUDA support.
What browsers do you use to access the UI?
Other
Sysinfo
RTX 3060 Ti
64 GB RAM
Console logs
```
raise NotImplementedError(msg)
NotImplementedError: No operator found for `memory_efficient_attention_forward` with inputs:
    query : shape=(2, 1024, 10, 64) (torch.float32)
    key : shape=(2, 1024, 10, 64) (torch.float32)
    value : shape=(2, 1024, 10, 64) (torch.float32)
    attn_bias : <class 'NoneType'>
    p : 0.0
`decoderF` is not supported because:
    xFormers wasn't build with CUDA support
    attn_bias type is <class 'NoneType'>
    operator wasn't built - see `python -m xformers.info` for more info
`[email protected]` is not supported because:
    xFormers wasn't build with CUDA support
    dtype=torch.float32 (supported: {torch.float16, torch.bfloat16})
    operator wasn't built - see `python -m xformers.info` for more info
`tritonflashattF` is not supported because:
    xFormers wasn't build with CUDA support
    dtype=torch.float32 (supported: {torch.float16, torch.bfloat16})
    operator wasn't built - see `python -m xformers.info` for more info
    triton is not available
`cutlassF` is not supported because:
    xFormers wasn't build with CUDA support
    operator wasn't built - see `python -m xformers.info` for more info
`smallkF` is not supported because:
    max(query.shape[-1] != value.shape[-1]) > 32
    xFormers wasn't build with CUDA support
    operator wasn't built - see `python -m xformers.info` for more info
    unsupported embed per head: 64
```
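The log repeatedly points at the same diagnostic: `python -m xformers.info` reports how the installed xformers was built and which operators are available. A minimal check, assuming the stock venv/ layout inside the stable-diffusion-webui directory:

```bash
# run from the stable-diffusion-webui directory; venv path is an assumption
source venv/bin/activate
python -m xformers.info   # shows build flags and available operators
```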
Additional information
The same NotImplementedError as shown in the console logs above.
I don't believe "xFormers wasn't build with CUDA support" is actually the case.
If xformers weren't installed at all, you would get a completely different error message.
I believe it is installed, just not correctly.
I have seen similar error messages caused by a version mismatch between xformers and PyTorch.
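A quick way to check for such a mismatch is to compare the two versions directly; a sketch, run inside the web UI's virtual environment:

```bash
# both imports must resolve in the same environment the web UI uses
python -c "import torch, xformers; print('torch', torch.__version__); print('xformers', xformers.__version__)"
```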
If you want people to actually help you, you should provide your sysinfo.
Sysinfo is a JSON file generated by the web UI, not two lines; read the issue template instructions.
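If memory serves, recent versions of the web UI can generate this file themselves via the --dump-sysinfo launch flag (check `python launch.py --help` if your version differs):

```bash
# from the stable-diffusion-webui directory; writes a sysinfo JSON file and exits
python launch.py --dump-sysinfo
```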
Because you did not provide sysinfo, the following is pure guesswork.
My guess: some Linux distro with Python 3.12 as the default. IIRC the default PyTorch version we use is not supported on 3.12, so you installed a newer version manually.
Then you used --xformers, which by default installs a version of xformers meant for the old version of PyTorch.
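To confirm that part of the guess, check which interpreter the web UI's venv was actually created with (venv path is an assumption):

```bash
venv/bin/python --version   # the interpreter the web UI actually runs on
```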
If my guess is correct, I would try installing xformers yourself manually, overriding what the web UI does automatically; a sketch follows.
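A minimal sketch of such a manual install, assuming the stock venv/ layout and a CUDA 12.1 PyTorch build; the cu121 index URL is an assumption, so adjust it to match the torch build you actually have installed:

```bash
# run from the stable-diffusion-webui directory
source venv/bin/activate
# pick an xformers build matching the installed torch; cu121 here is an assumption
pip install -U xformers --index-url https://download.pytorch.org/whl/cu121
```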
Or install an older version of Python, 3.11 or 3.10, and run the web UI using it.
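For that second option, the stock webui-user.sh has a python_cmd variable that selects the interpreter; a sketch, assuming python3.11 is installed and on PATH:

```bash
# webui-user.sh -- make the web UI create its venv with Python 3.11
python_cmd="python3.11"
# if a venv/ already exists, delete it so it is recreated with the new interpreter
```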