Clear example for custom DistributedSampler #7616
Replies: 2 comments 8 replies
-
Are the PyTorch docs for it not enough? https://pytorch.org/docs/stable/data.html#torch.utils.data.distributed.DistributedSampler Note that we do set this sampler for you automatically. If you could share a snippet of what you are trying to do, it would be easier to help 😄
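To keep a custom sampler instead of the one Lightning injects, you can disable the automatic replacement (at the time of this discussion, via the Trainer's `replace_sampler_ddp=False` flag) and return your own sampler from the dataloader hook. A minimal sketch, where `ReversedShardSampler` is a hypothetical custom sampler made up for illustration:

```python
import torch
from torch.utils.data import DataLoader, DistributedSampler, TensorDataset

class ReversedShardSampler(DistributedSampler):
    """Hypothetical custom sampler: each rank walks its shard in reverse order."""
    def __iter__(self):
        return iter(reversed(list(super().__iter__())))

# In a LightningModule/DataModule you would return this from train_dataloader(),
# and construct the Trainer with replace_sampler_ddp=False so Lightning keeps it:
#
#     def train_dataloader(self):
#         sampler = ReversedShardSampler(self.dataset)
#         return DataLoader(self.dataset, sampler=sampler, batch_size=2)

# Standalone demonstration with explicit num_replicas/rank, which needs no
# initialized process group:
dataset = TensorDataset(torch.arange(8))
sampler = ReversedShardSampler(dataset, num_replicas=2, rank=1, shuffle=False)
print(list(sampler))  # rank 1's shard [1, 3, 5, 7], reversed
```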
-
@Rizhiy Where exactly are you constructing that sampler? The error message suggests you are doing it too early, at a stage where the distributed process group has not yet been initialized. Hope this answer helps you.
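The timing issue described above can be seen directly: `DistributedSampler` only queries the default process group when `num_replicas`/`rank` are omitted, so constructing it before `init_process_group()` has run raises the error from the question, while passing them explicitly works at any point. A small sketch:

```python
import torch
from torch.utils.data import Dataset, DistributedSampler

class ToyDataset(Dataset):
    """Tiny stand-in dataset of 8 items."""
    def __len__(self):
        return 8
    def __getitem__(self, idx):
        return idx

dataset = ToyDataset()

# Omitting num_replicas/rank makes the sampler query the default process
# group, which fails before torch.distributed.init_process_group() has run
# (the exact exception type varies across torch versions):
try:
    DistributedSampler(dataset)
except Exception as exc:
    print("too early:", exc)

# Passing them explicitly needs no process group at all. In Lightning, the
# equivalent fix is to build the sampler inside a hook that runs after DDP
# init (e.g. train_dataloader), not in __init__:
sampler = DistributedSampler(dataset, num_replicas=2, rank=0, shuffle=False)
print(list(sampler))  # rank 0's shard of the 8 indices
```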
-
Hi, I need to train with DDP, but I would like to use my own custom `DistributedSampler`. I've looked through different issues but still don't understand how to set up a custom `DistributedSampler`; I keep getting:

`RuntimeError: Default process group has not been initialized, please make sure to call init_process_group.`

Can someone please provide a complete example of how to use a custom `DistributedSampler`?