tune down batch-size for res2net to avoid OOM (#122977)
Summary: The batch size for this model was previously 64. We later changed it to 256, which causes OOM in the cudagraphs setting. This PR tunes the batch size down to 128.

Logs from my local run:

```
cuda,res2net101_26w_4s,128,1.603578,110.273572,335.263494,1.042566,11.469964,11.001666,807,2,7,6,0,0
cuda,res2net101_26w_4s,256,1.714980,207.986155,344.013071,1.058278,22.260176,21.034332,807,2,7,6,0,0
```

The log shows that torch.compile uses 11GB at batch size 128 and 21GB at batch size 256. I suspect the benchmark script has extra overhead that causes the model to OOM at batch size 256 in the dashboard run.

X-link: pytorch/pytorch#122977
Approved by: https://github.com/Chillee
Reviewed By: atalman
Differential Revision: D55561255
Pulled By: shunting314
fbshipit-source-id: 9863e86776d8ed30397806bda330f53c9815f61e
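As a back-of-envelope illustration (not part of the PR), the two memory readings above are consistent with peak memory growing roughly linearly in batch size: a fixed overhead for weights plus a per-sample activation cost. The numbers below are taken from the logs; the linear model itself is an assumption for illustration.

```python
# Peak memory readings from the benchmark logs above (GB).
mem_128 = 11.0  # batch size 128
mem_256 = 21.0  # batch size 256

# Assume peak_mem(bs) ~= fixed + per_sample * bs (illustrative linear model).
per_sample = (mem_256 - mem_128) / (256 - 128)  # GB per extra sample
fixed = mem_128 - per_sample * 128              # weights + constant overhead

print(f"~{per_sample * 1024:.0f} MB per sample, ~{fixed:.1f} GB fixed")
# Projected peak for batch size 256 plus a few GB of benchmark-harness
# overhead would plausibly exceed a 24GB card, matching the dashboard OOM.
```

Under this fit, dropping from 256 to 128 roughly halves the activation memory while leaving headroom for whatever extra overhead the benchmark script adds.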