DeepSpeed with multiple optimizers in PyTorch Lightning #13950
Unanswered
MaugrimEP asked this question in DDP / multi-GPU / multi-node
Replies: 1 comment · 3 replies
- This is a limitation with DeepSpeed: it currently only supports a single optimizer.
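For context, here is a minimal sketch of the kind of setup that hits this limitation (my own illustration, not code from the thread; the `GAN`, `generator`, and `discriminator` names are assumed): a LightningModule whose `configure_optimizers` returns two optimizers, trained with a DeepSpeed strategy.

```python
# Minimal sketch (assumed, not from the thread): a GAN LightningModule that
# returns two optimizers, which the DeepSpeed-based strategies reject.
import torch
import pytorch_lightning as pl


class GAN(pl.LightningModule):
    def __init__(self, generator: torch.nn.Module, discriminator: torch.nn.Module):
        super().__init__()
        self.generator = generator
        self.discriminator = discriminator

    def configure_optimizers(self):
        # Two optimizers, one per sub-network -- this is what DeepSpeed
        # cannot handle through the Lightning strategy.
        opt_g = torch.optim.Adam(self.generator.parameters(), lr=2e-4)
        opt_d = torch.optim.Adam(self.discriminator.parameters(), lr=2e-4)
        return [opt_g, opt_d]


# trainer = pl.Trainer(
#     accelerator="gpu",
#     devices=4,
#     strategy="deepspeed_stage_3_offload",  # or "deepspeed_stage_3"
# )
# trainer.fit(GAN(generator, discriminator))  # fails before training starts
```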
- I'm trying to use the deepspeed_stage_3_offload and deepspeed_stage_3 strategies to train a big StyleGAN model. Before training even starts, I get the error about DeepSpeed only supporting a single optimizer. But I need two optimizers for the GAN, so is there a workaround?
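One possible direction, sketched under my own assumptions and not confirmed anywhere in this thread: satisfy DeepSpeed's single-optimizer requirement by putting all parameters into one optimizer, switch to manual optimization, and alternate generator/discriminator updates by toggling `requires_grad`. The `_generator_loss` and `_discriminator_loss` helpers are hypothetical placeholders for the StyleGAN losses.

```python
# Workaround sketch (an assumption, untested with ZeRO stage 3 / offload):
# a single optimizer over both sub-networks plus manual optimization.
import itertools
import torch
import pytorch_lightning as pl


class ManualGAN(pl.LightningModule):
    def __init__(self, generator: torch.nn.Module, discriminator: torch.nn.Module):
        super().__init__()
        self.automatic_optimization = False  # take control of the update order
        self.generator = generator
        self.discriminator = discriminator

    def configure_optimizers(self):
        # A single optimizer covering both sub-networks, as DeepSpeed requires.
        params = itertools.chain(
            self.generator.parameters(), self.discriminator.parameters()
        )
        return torch.optim.Adam(params, lr=2e-4)

    @staticmethod
    def _set_requires_grad(module: torch.nn.Module, flag: bool) -> None:
        for p in module.parameters():
            p.requires_grad_(flag)

    def training_step(self, batch, batch_idx):
        opt = self.optimizers()

        # Discriminator step: freeze the generator so only D receives gradients.
        self._set_requires_grad(self.generator, False)
        self._set_requires_grad(self.discriminator, True)
        d_loss = self._discriminator_loss(batch)  # hypothetical loss helper
        opt.zero_grad()
        self.manual_backward(d_loss)
        opt.step()

        # Generator step: freeze the discriminator.
        self._set_requires_grad(self.generator, True)
        self._set_requires_grad(self.discriminator, False)
        g_loss = self._generator_loss(batch)  # hypothetical loss helper
        opt.zero_grad()
        self.manual_backward(g_loss)
        opt.step()
```

Whether freezing via `requires_grad` interacts cleanly with ZeRO stage 3 parameter partitioning and CPU offload would need to be verified; DeepSpeed itself can drive a GAN by creating a separate engine per network, but as far as I know that pattern is not exposed through the Lightning strategy.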