
CUDA out of memory with 24gb VRAM #34

Open

YakovAU opened this issue Mar 20, 2024 · 3 comments

Comments

@YakovAU

YakovAU commented Mar 20, 2024

[screenshot: image_2024-03-21_012400781]
I'm getting a CUDA out of memory error with continual inpainting and segmenting. VRAM usage seems to build up and never be released. Here it's stuck at step 1/20 while continually eating up more memory, over 50 GB.

@Uminosachi
Owner

I was unable to reproduce the issue of increasing VRAM usage when repeatedly using SAM and Inpainting on Linux.

@YakovAU
Author

YakovAU commented Mar 22, 2024

Did you try swapping models frequently? I left that detail out: it might not be unloading models correctly, e.g. swapping segmentation models and then swapping the inpaint model a few times. I've reproduced it on two computers, an RTX 2080 Ti and my main PC with an RTX 4090, both running Windows 11, CUDA 12.4, and the latest NVIDIA drivers. I have no other issues in ComfyUI or A1111 when inpainting with the main interface and swapping models as normal.

@Uminosachi
Owner

I monitored VRAM usage with nvidia-smi -l and found that VRAM is properly released after the SAM and Inpainting operations conclude, which is likely why the issue doesn't reproduce on Linux.
I will continue to investigate this issue in a Windows environment as well.
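If the leak does come from model swapping, a pattern like the following can help ensure the old weights are actually freed before the replacement loads. This is a minimal sketch, not the extension's actual code: `ModelSlot` and `load_fn` are hypothetical names, and it assumes a PyTorch backend (the `torch` calls are skipped gracefully when unavailable).

```python
import gc

class ModelSlot:
    """Holds at most one model; swapping releases the old one's VRAM first.

    Hypothetical sketch: in PyTorch, GPU memory is only returned to the
    CUDA caching allocator once no Python reference to the tensors
    remains, so the old model must be dropped *before* loading the new one.
    """

    def __init__(self):
        self.model = None

    def swap(self, load_fn):
        self.model = None   # drop the reference to the old model's weights
        gc.collect()        # collect any reference cycles keeping tensors alive
        try:
            import torch
            if torch.cuda.is_available():
                # Return cached, now-unused blocks to the CUDA driver so
                # nvidia-smi reflects the freed memory.
                torch.cuda.empty_cache()
        except ImportError:
            pass            # non-PyTorch environment: nothing to flush
        self.model = load_fn()
        return self.model
```

Usage would be `slot.swap(lambda: load_sam_model("vit_h"))`, replacing `load_sam_model` with whatever loader the extension uses. If a leak persists even with this ordering, the likely culprit is some other object (a cache, closure, or UI state) still holding a reference to the old model.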
