I can confirm this behavior. I believe line 31 in warmup_scheduler/scheduler.py is the culprit, and that
return self.after_scheduler.get_last_lr()
should instead be:
return self.after_scheduler.get_lr()
That said, I think the whole scheduler would be simpler and less error-prone to implement using PyTorch's built-in LinearLR scheduler for the warmup phase, optionally chained with one or more other schedulers (the equivalent of after_scheduler) via SequentialLR.
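A minimal sketch of that approach, assuming a linear warmup over 5 epochs followed by cosine annealing as the "after" scheduler (the model, learning rate, and epoch counts here are placeholders, not values from the original example):

```python
import torch
from torch.optim.lr_scheduler import LinearLR, CosineAnnealingLR, SequentialLR

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Linear warmup from 10% to 100% of the base LR over the first 5 steps.
warmup = LinearLR(optimizer, start_factor=0.1, total_iters=5)
# Stand-in for "after_scheduler": cosine annealing after warmup ends.
cosine = CosineAnnealingLR(optimizer, T_max=20)
# SequentialLR hands off from warmup to cosine at the milestone step.
scheduler = SequentialLR(optimizer, schedulers=[warmup, cosine], milestones=[5])

lrs = []
for epoch in range(25):
    lrs.append(optimizer.param_groups[0]["lr"])  # read LR without get_lr()
    optimizer.step()
    scheduler.step()
```

Reading the current value from optimizer.param_groups (or scheduler.get_last_lr()) also sidesteps the get_lr() warning entirely.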
To nuance my own comment: it actually seems users are not supposed to call .get_lr() at all. It emits a warning when called from anywhere other than .step(), where the call is wrapped in a with _enable_get_lr_call(self): statement.
When I modify the example code like this:
I get an unexpected result; the learning rate at the sixth epoch is strange.