
question about the training time #28

Open
LGD2333 opened this issue Dec 27, 2024 · 3 comments

LGD2333 commented Dec 27, 2024

Could you please let me know how long training usually takes for you to produce a result?
(When I run CorrMatch on a single A100 GPU with batch size 8, each epoch takes almost 22 hours.)

BBBBchan (Owner) commented:

Thanks for your attention. On the Pascal VOC dataset, we trained using 2x3090 GPUs. With a training size of 321x321 and the 92 split setting, the training took approximately 30 hours. On the Cityscapes dataset, we trained using 4xA40 GPUs. With a training size of 801x801 and the 1/8 split setting, the training took approximately 48 hours (the theoretically longest setting).
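(Editor's note, not from the thread.) The wall-clock numbers above and below follow a simple relation: epoch time scales with the number of iterations, which is inversely proportional to the total batch size. A minimal back-of-the-envelope sketch, with purely illustrative numbers (10582 is the commonly used augmented Pascal VOC train-set size; the seconds-per-iteration figure is made up):

```python
import math

def estimate_training_hours(num_images, total_batch_size, epochs, sec_per_iter):
    """Rough estimate: iterations per epoch x epochs x measured seconds
    per iteration. Illustrative only -- real wall-clock time also depends
    on data loading, validation, and logging overhead."""
    iters_per_epoch = math.ceil(num_images / total_batch_size)
    return iters_per_epoch * epochs * sec_per_iter / 3600.0

# Halving the total batch size doubles the iteration count, so epoch
# time roughly doubles if seconds-per-iteration stays fixed.
small = estimate_training_hours(10582, 4, 80, 1.0)  # total batch 4
large = estimate_training_hours(10582, 8, 80, 1.0)  # total batch 8
```

This is why a single GPU with batch size 8 can take far longer per epoch than two or four GPUs sharing the same total batch: fewer GPUs means fewer samples processed per iteration step, hence more iterations per epoch.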

Wang-zhenyan commented:

> Thanks for your attention. On the Pascal VOC dataset, we trained using 2x3090 GPUs. With a training size of 321x321 and the 92 split setting, the training took approximately 30 hours. On the Cityscapes dataset, we trained using 4xA40 GPUs. With a training size of 801x801 and the 1/8 split setting, the training took approximately 48 hours (the theoretically longest setting).

Hello, I'd like to ask: was the batch size in your experiments 16 (i.e., two GPUs with batch=8 per GPU)? When I run the Pascal VOC dataset on 2×4090 GPUs with size 513×513, the maximum batch per GPU is only 2, and training still takes nearly two days to finish. When I run Pascal VOC on 2×A100 GPUs with size 513×513, the maximum batch per GPU is also only 4.
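(Editor's note, not from the thread.) When GPU memory caps the per-card batch at 2 or 4, gradient accumulation can emulate a larger effective batch without extra memory: average the gradients of several micro-batches before each optimizer step. The principle is that, for equal-sized micro-batches, the mean of micro-batch gradients equals the full-batch gradient. A framework-agnostic sketch with a toy 1-D linear model and made-up numbers:

```python
def grad_mse(w, xs, ys):
    """Gradient d/dw of mean((w*x - y)^2) for a 1-D linear model."""
    return sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)

# A full batch of 8 samples (illustrative data).
xs = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0]
ys = [1.1, 2.0, 2.9, 4.2, 5.1, 5.8, 7.2, 8.1]
w = 0.3

# Gradient over the full batch of 8.
full = grad_mse(w, xs, ys)

# Accumulate over 4 micro-batches of 2, then average: the result
# matches the full-batch gradient exactly (equal micro-batch sizes).
acc = 0.0
for i in range(0, 8, 2):
    acc += grad_mse(w, xs[i:i + 2], ys[i:i + 2])
acc /= 4
```

In PyTorch this corresponds to calling `backward()` on each micro-batch loss scaled by `1/num_accum` and stepping the optimizer only every `num_accum` iterations. Note it does not reproduce the BatchNorm statistics of a truly larger batch, which can matter for segmentation models.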


BBBBchan commented Jan 7, 2025

> Hello, I'd like to ask: was the batch size in your experiments 16 (i.e., two GPUs with batch=8 per GPU)? When I run the Pascal VOC dataset on 2×4090 GPUs with size 513×513, the maximum batch per GPU is only 2, and training still takes nearly two days to finish. When I run Pascal VOC on 2×A100 GPUs with size 513×513, the maximum batch per GPU is also only 4.

Hello, with a 513×513 training size your training time is normal. We trained with 4×3090 GPUs and a per-GPU batch size of 2; training took about 32 hours.
