How many epochs did you use to train? #111

Answered by fracape
hyeseojy asked this question in Q&A
Hi,
Yes, that's correct; it's in that ballpark, although the number of epochs usually depends on a termination criterion, as we mention in the paper:
"Training: Unless specified otherwise, the provided pre-trained networks were trained for 4-5M steps on 256×256 image patches randomly extracted and cropped from the Vimeo-90K dataset [22]. Models were trained with a batch size of 16 or 32, and an initial learning rate of 1e-4 for approximately 1-2M steps. The learning rate is then divided by 2 whenever the evaluation loss reaches a plateau (we use a patience of 20 epochs)."
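The schedule described in the quote (start at 1e-4, halve the learning rate after 20 evaluation epochs without improvement) is the same behavior as PyTorch's `ReduceLROnPlateau(factor=0.5, patience=20)`. A minimal plain-Python sketch of that logic, with illustrative names not taken from the paper or the repo:

```python
def halve_on_plateau(eval_losses, lr=1e-4, patience=20, factor=0.5):
    """Sketch of the plateau schedule: multiply `lr` by `factor`
    whenever the evaluation loss has not improved for more than
    `patience` consecutive epochs (mirrors ReduceLROnPlateau)."""
    best = float("inf")
    bad_epochs = 0
    for loss in eval_losses:
        if loss < best:
            best = loss
            bad_epochs = 0
        else:
            bad_epochs += 1
            if bad_epochs > patience:
                lr *= factor  # halve the learning rate
                bad_epochs = 0
    return lr

# 30 epochs with no improvement after the first: one halving fires.
print(halve_on_plateau([1.0] * 30))  # → 5e-05
```

Training would then stop once the learning rate drops below some floor (or after the 4-5M-step budget), which is the termination criterion mentioned above.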

Replies: 1 comment

Answer selected by hyeseojy