Why train dataloader is not prepared by Accelerator #594

Open
Jiaxin-Wen opened this issue May 18, 2024 · 0 comments
Labels
bug Something isn't working

Comments

@Jiaxin-Wen

🐛 Describe the bug

The train dataloader used in PPO is created with

    return self.store.create_loader(self.config.train.batch_size, shuffle=True)

but it is never passed through Accelerator's prepare().
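For context, here is a minimal sketch of the usual Accelerate dataloader pattern (assuming a plain PyTorch dataloader; the dataset and names below are illustrative, not trlX code):

```python
# Minimal sketch of the standard Accelerate dataloader pattern; the dataset
# and batch size are illustrative only, not trlX's actual setup.
import torch
from torch.utils.data import DataLoader, TensorDataset
from accelerate import Accelerator

accelerator = Accelerator()

dataset = TensorDataset(torch.arange(64, dtype=torch.float32))
loader = DataLoader(dataset, batch_size=8, shuffle=True)

# prepare() wraps the dataloader so each process draws a distinct shard of
# the data; batch_size=8 then means 8 samples *per device*. An unprepared
# loader instead iterates the full dataset identically on every process.
loader = accelerator.prepare(loader)

for (batch,) in loader:
    print(accelerator.process_index, batch.shape)
```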

Then, in the training phase, it is iterated over directly:

    for minibatch in MiniBatchIterator(train_dataloader, self.mb_size, self.num_mb):

Since the dataloader was never prepared, will the minibatch size be used as the batch size per device, or will every process see the same batches?
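If MiniBatchIterator behaves like a typical minibatch splitter, it slices each dataloader batch into num_mb chunks of mb_size samples. The sketch below is a hypothetical stand-in with the same call shape, not trlX's actual implementation:

```python
# Hypothetical minibatch splitter mirroring the call shape of
# MiniBatchIterator(train_dataloader, mb_size, num_mb); not trlX's code.
# Assumes each batch is a single tensor of shape (batch_size, ...).
def minibatch_iterator(dataloader, mb_size, num_mb):
    for batch in dataloader:
        for i in range(num_mb):
            # Carve the full batch into num_mb minibatches of mb_size each.
            yield batch[i * mb_size : (i + 1) * mb_size]
```

Under this reading, whether mb_size is a per-device quantity depends entirely on whether the batches yielded by train_dataloader were already sharded per process, which is exactly what accelerator.prepare() would normally guarantee.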

Which trlX version are you using?

No response

Additional system and package information

No response

Jiaxin-Wen added the bug (Something isn't working) label on May 18, 2024