Suggestions for an Out-Of-Memory Error (with 4 GPUs) #1051
Unanswered · KaiserWhoLearns asked this question in Q&A · 0 comments
I am trying to fine-tune T5-small on a custom dataset (in tsv format) with 4 GPUs, but I keep getting an Out-Of-Memory error. I believe this should not happen, since my GPUs should have more than enough memory to fine-tune the small model, so there is probably something wrong with the command I used:

Any suggestions on what I could modify to resolve the issue?
(It looks like it runs out of memory on the first GPU, so maybe I used an invalid command for multi-GPU training?)
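For reference, here is a minimal sketch of the kind of multi-GPU launch I am aiming for, assuming a Hugging Face Transformers + PyTorch `torchrun` stack; the script name, TSV column names, and hyperparameters below are placeholders rather than my exact command:

```python
# Minimal sketch (assumed stack: Hugging Face Transformers + datasets, DDP via torchrun).
# Hypothetical launch on 4 GPUs:  torchrun --nproc_per_node=4 finetune_t5_small.py
# "train.tsv" and the column names "input_text"/"target_text" are placeholders.

from datasets import load_dataset
from transformers import (
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
    T5ForConditionalGeneration,
    T5TokenizerFast,
)

tokenizer = T5TokenizerFast.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# TSV loads through the "csv" builder with a tab delimiter.
raw = load_dataset("csv", data_files={"train": "train.tsv"}, delimiter="\t")

def preprocess(batch):
    model_inputs = tokenizer(batch["input_text"], max_length=512, truncation=True)
    labels = tokenizer(batch["target_text"], max_length=128, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

train_ds = raw["train"].map(
    preprocess, batched=True, remove_columns=raw["train"].column_names
)

args = Seq2SeqTrainingArguments(
    output_dir="t5_small_finetuned",
    per_device_train_batch_size=8,   # first thing to lower if one GPU still goes OOM
    gradient_accumulation_steps=4,   # keeps the effective batch size up despite a small per-device size
    fp16=True,                       # mixed precision roughly halves activation memory
    num_train_epochs=3,
    logging_steps=100,
    save_strategy="epoch",
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=train_ds,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)

trainer.train()
```

If one GPU still runs out of memory with a setup like this, the usual first levers are lowering per_device_train_batch_size, raising gradient_accumulation_steps, shortening the max sequence lengths, or enabling fp16.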