>>> SamahZaro
[August 22, 2019, 6:20pm]
Is there an easy way to track the learning rate during training?
I tried to work out which field of the AdamOptimizer object, the
gradients or the Operation returned by the apply_gradients() call,
holds the changing learning rate value, but had no luck. Any help?
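[Editor's note: in TF 1.x, AdamOptimizer stores only the base learning rate; the per-parameter adaptation lives in its m/v slot variables, so there is no single "current learning rate" field to read off the optimizer. A common workaround is to feed the optimizer from a schedule tensor and log that tensor directly. A minimal sketch, assuming a generic TF 1.x setup; the schedule values and names below are illustrative, not taken from the TTS code:]

```python
import tensorflow as tf

# Drive the learning rate from a schedule tensor so it can be logged.
global_step = tf.train.get_or_create_global_step()
learning_rate = tf.train.exponential_decay(
    learning_rate=0.001,   # initial value, e.g. taken from a flag
    global_step=global_step,
    decay_steps=10000,     # illustrative schedule parameters
    decay_rate=0.5)

optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate)
# ... compute grads_and_vars from the loss, then:
# train_op = optimizer.apply_gradients(grads_and_vars,
#                                      global_step=global_step)

# Logged every step, this shows up as a curve in TensorBoard.
tf.summary.scalar("learning_rate", learning_rate)
```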
Another related question:
when resuming training (from the last checkpoint, for example), does it
restore the optimizer and the last reached learning rate value, or will
it initialize a new optimizer with the lr given by the flags?
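[Editor's note: a general TF 1.x answer; whether this repo's training script behaves this way would need checking its source. Adam's slot variables (m, v) and the global_step are ordinary variables, so a default Saver checkpoints and restores them along with the model weights, and a schedule like the one sketched above is recomputed from the restored global_step, resuming the learning rate where it left off. A hedged sketch, reusing the `global_step` defined earlier; the checkpoint path is illustrative:]

```python
# Restoring a checkpoint with a default Saver brings back the model
# weights, Adam's m/v slots, and global_step in one step.
saver = tf.train.Saver()  # defaults to all saveable variables, slots included
with tf.Session() as sess:
    saver.restore(sess, "/path/to/model.ckpt")  # path is illustrative
    print("resumed at step", sess.run(global_step))
```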
[This is an archived TTS discussion thread from discourse.mozilla.org/t/track-learning-rate]