Thank you for your amazing work! How many epochs did you use to train? Looking forward to your reply.
Hi,
Yes, that's correct, it's in that ballpark, although the number of epochs usually depends on a termination criterion, as we mention in the paper:

"Training: Unless specified otherwise, the provided pre-trained networks were trained for 4-5M steps on 256 × 256 image patches randomly extracted and cropped from the Vimeo-90K dataset [22]. Models were trained with a batch size of 16 or 32, and an initial learning rate of 1e-4 for approximately 1-2M steps. The learning rate is then divided by 2 whenever the evaluation loss reaches a plateau (we use a patience of 20 epochs)."
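If it helps, here is a minimal PyTorch sketch of that learning-rate schedule (halving the LR on an evaluation-loss plateau with a patience of 20 epochs, via `ReduceLROnPlateau`). The model, data, and epoch count below are placeholders for illustration, not the actual CompressAI training script:

```python
import torch
from torch.optim.lr_scheduler import ReduceLROnPlateau

# Placeholder model standing in for the compression network; the real setup
# trains on 256x256 patches from Vimeo-90K with a batch size of 16 or 32.
model = torch.nn.Linear(16, 16)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # initial LR of 1e-4

# Divide the learning rate by 2 whenever the evaluation loss plateaus,
# with a patience of 20 epochs, as described in the quoted paragraph.
scheduler = ReduceLROnPlateau(optimizer, mode="min", factor=0.5, patience=20)

for epoch in range(100):  # placeholder epoch count
    # ... one training epoch over the dataset would go here ...
    x = torch.randn(32, 16)
    eval_loss = torch.nn.functional.mse_loss(model(x), x)  # dummy evaluation loss
    scheduler.step(eval_loss.item())
```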