Multiple GPUs across network (Distributed Inference) #5599
RamishSiddiqui asked this question in Q&A · Unanswered · 0 replies
Hey guys,
I have multiple GPUs on the machine where the webui is running, and there are also GPUs available on the network I'm connected to (essentially other computers, each with multiple GPUs). Is there any way I can use those networked GPUs as well to offload part of my model?
This would be really helpful, since I could then connect a whole cluster of GPUs to the webui.
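For illustration only, here is a minimal sketch of the kind of cross-machine offloading being asked about, using `torch.distributed.rpc` to run the second half of a toy model on a GPU in another box. This is not an existing webui feature; the worker names, the placeholder address `192.0.2.10`, and the toy linear layers are hypothetical stand-ins.

```python
"""Sketch: split a toy model across two machines with torch.distributed.rpc.

Rank 0 runs on the webui machine (local GPU), rank 1 on a networked GPU box.
Not a webui feature; all names/addresses below are placeholders.
"""
import os
import torch
import torch.nn as nn
import torch.distributed.rpc as rpc


class RemoteHalf(nn.Module):
    """Second half of the model, instantiated on the remote worker's GPU."""
    def __init__(self):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Linear(4096, 4096), nn.ReLU(), nn.Linear(4096, 4096)
        ).cuda()

    def forward(self, x_cpu):
        # Activations arrive over the network on CPU; move them to this GPU.
        return self.layers(x_cpu.cuda()).cpu()


def run_master():
    # First half of the model stays on the local GPU.
    local_half = nn.Sequential(nn.Linear(4096, 4096), nn.ReLU()).cuda()
    # Construct the second half on the remote worker and keep a handle to it.
    remote_half = rpc.remote("worker1", RemoteHalf)

    x = torch.randn(1, 4096).cuda()
    hidden = local_half(x).cpu()                  # compute locally
    out = remote_half.rpc_sync().forward(hidden)  # offload the rest remotely
    print(out.shape)


def main():
    rank = int(os.environ.get("RANK", "0"))
    os.environ.setdefault("MASTER_ADDR", "192.0.2.10")  # placeholder address
    os.environ.setdefault("MASTER_PORT", "29500")
    rpc.init_rpc("master" if rank == 0 else f"worker{rank}",
                 rank=rank, world_size=2)
    if rank == 0:
        run_master()
    rpc.shutdown()  # rank 1 blocks here, serving RPCs until shutdown


if __name__ == "__main__":
    main()
```

To try it, rank 0 would be launched on the webui machine (`RANK=0 python sketch.py`) and rank 1 on the remote box (`RANK=1 python sketch.py`); the intermediate activations cross the network on CPU, which is why every hop moves tensors off and back onto the GPU.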