
[Bug]: I can't install xFormers #16810

Open
1 of 6 tasks
Franches13 opened this issue Jan 26, 2025 · 1 comment
Labels
bug-report Report of a bug, yet to be confirmed

Comments

@Franches13

Checklist

  • The issue exists after disabling all extensions
  • The issue exists on a clean installation of webui
  • The issue is caused by an extension, but I believe it is caused by a bug in the webui
  • The issue exists in the current version of the webui
  • The issue has not been reported before recently
  • The issue has been reported before but has not been fixed yet

What happened?

I just want to install xFormers to make image generation faster in Stable Diffusion, but I can't; no matter what I do, I can't generate anything.

Steps to reproduce the problem

  1. Add --xformers to webui-user.sh
  2. Generate an image
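
For context, step 1 is usually done by editing the launch script's argument variable. A minimal sketch of the relevant line in webui-user.sh (the exact file contents vary by install; COMMANDLINE_ARGS is the variable the launcher reads for extra flags):

```shell
# webui-user.sh (excerpt) — pass the xFormers flag to the web UI launcher
export COMMANDLINE_ARGS="--xformers"
```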

What should have happened?

xFormers should be built with CUDA support.

What browsers do you use to access the UI?

Other

Sysinfo

RTX 3060 Ti
64 GB RAM

Console logs

raise NotImplementedError(msg)
    NotImplementedError: No operator found for `memory_efficient_attention_forward` with inputs:
         query       : shape=(2, 1024, 10, 64) (torch.float32)
         key         : shape=(2, 1024, 10, 64) (torch.float32)
         value       : shape=(2, 1024, 10, 64) (torch.float32)
         attn_bias   : <class 'NoneType'>
         p           : 0.0
    `decoderF` is not supported because:
        xFormers wasn't build with CUDA support
        attn_bias type is <class 'NoneType'>
        operator wasn't built - see `python -m xformers.info` for more info
    `[email protected]` is not supported because:
        xFormers wasn't build with CUDA support
        dtype=torch.float32 (supported: {torch.float16, torch.bfloat16})
        operator wasn't built - see `python -m xformers.info` for more info
    `tritonflashattF` is not supported because:
        xFormers wasn't build with CUDA support
        dtype=torch.float32 (supported: {torch.float16, torch.bfloat16})
        operator wasn't built - see `python -m xformers.info` for more info
        triton is not available
    `cutlassF` is not supported because:
        xFormers wasn't build with CUDA support
        operator wasn't built - see `python -m xformers.info` for more info
    `smallkF` is not supported because:
        max(query.shape[-1] != value.shape[-1]) > 32
        xFormers wasn't build with CUDA support
        operator wasn't built - see `python -m xformers.info` for more info
        unsupported embed per head: 64

Additional information


(Stable Diffusion)

@Franches13 Franches13 added the bug-report Report of a bug, yet to be confirmed label Jan 26, 2025
@w-e-w (Collaborator) commented Feb 4, 2025

I can't install xFormers

I don't believe that's the case. Note the line "xFormers wasn't built with CUDA support":
if xFormers weren't installed at all, you would get a completely different error message.
I believe it is installed, just not correctly.

I may have seen similar error messages caused by a version mismatch between xFormers and PyTorch.
If you want people to actually help you, you should provide your sysinfo.

Sysinfo is a JSON file generated by the web UI, not two lines; read the issue template instructions.


Because you did not provide sysinfo, the following is pure guesswork.
My guess: some Linux distro with Python 3.12+ as the default. IIRC, the default PyTorch version the web UI uses is not supported on Python 3.12, so you installed a newer PyTorch manually.
Then you used --xformers, which by default installs a version of xFormers built for the old version of PyTorch.

If my guess is correct, I would try installing xFormers manually yourself, overriding what the web UI does automatically,
or installing an older version of Python (3.11 or 3.10) and running the web UI with it.
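
A sketch of how the manual check and reinstall could look, assuming a standard AUTOMATIC1111 install with its virtual environment at ./venv (adjust the path to your setup; no specific version pins are given because the right build depends on your installed PyTorch/CUDA combination):

```shell
# 1) Inspect what xFormers was built against (this is the command the
#    error log itself points to):
./venv/bin/python -m xformers.info

# 2) Check the installed PyTorch version and its CUDA build:
./venv/bin/python -c "import torch; print(torch.__version__, torch.version.cuda)"

# 3) Force-reinstall xFormers so pip resolves a build matching the
#    currently installed PyTorch, overriding what the web UI installed:
./venv/bin/python -m pip install --upgrade --force-reinstall xformers
```

If the versions printed in steps 1 and 2 disagree, that mismatch is the likely cause of the "wasn't built with CUDA support" messages.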


2 participants