This repository has been archived by the owner on Jul 1, 2024. It is now read-only.
🚀 Feature

We should pass a reference to the task object to the optimizer and parameter scheduler, to enable more complex parameter schedulers.
Motivation / Pitch
Basic PyTorch learning rate schedulers such as ReduceLROnPlateau require task information, e.g. access to the validation loss. Currently this cannot be implemented in Classy Vision, because the scheduler has no access to the task.
By adding a task reference, the user could read local variables or the task's meters to make informed parameter-scheduling decisions.
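For reference, this is what the stock PyTorch scheduler expects: step() has to be fed the monitored metric, which is exactly the value a scheduler can't reach from inside Classy Vision today (dummy model and a constant loss below, just to trigger the plateau):

```python
import torch

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="min", factor=0.5, patience=2
)

for epoch in range(5):
    val_loss = 1.0  # constant loss: a plateau, so the LR gets cut
    # step() needs the validation loss -- this is the access the
    # scheduler does not have inside Classy Vision.
    scheduler.step(val_loss)

# once patience is exhausted, the LR has been halved (0.1 -> 0.05)
```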
This would also imply giving the optimizer access to the task during the scheduler-update step.
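A hypothetical sketch of what the proposal could look like. The extra `task` argument and its `losses` attribute are assumptions for illustration, not the current Classy Vision API:

```python
# Hypothetical task-aware parameter scheduler: the update call receives
# the task, so it can read meters/losses and react to plateaus.
class PlateauParamScheduler:
    def __init__(self, factor=0.5, patience=2):
        self.factor = factor
        self.patience = patience
        self.best = float("inf")
        self.num_bad_updates = 0
        self.scale = 1.0

    def __call__(self, where, task):
        # Read the latest validation loss off the task -- the access
        # this issue is asking for (`task.losses` is an assumption).
        val_loss = task.losses[-1] if task.losses else float("inf")
        if val_loss < self.best:
            self.best = val_loss
            self.num_bad_updates = 0
        else:
            self.num_bad_updates += 1
            if self.num_bad_updates > self.patience:
                self.scale *= self.factor
                self.num_bad_updates = 0
        return self.scale  # multiplier applied to the base LR
```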
Alternatives
As far as I know, the only (hacky) way to achieve this is with a custom hook. But hooks are not configurable via the configuration file, and it doesn't make much sense to implement a parameter scheduler as a hook.
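For completeness, the workaround looks roughly like this (sketched against a stand-in base class so the snippet is self-contained; the real base would be classy_vision.hooks.ClassyHook, and the task attributes used here are assumptions):

```python
class ClassyHookStub:
    # Stand-in for classy_vision.hooks.ClassyHook, just so this runs
    # standalone; hook callbacks do receive the task.
    def on_phase_end(self, task):
        pass

class ValLossLRHook(ClassyHookStub):
    # The workaround: hooks see the task, so the validation loss is
    # reachable -- but this has to be wired up in Python rather than
    # via the config file, and it is a scheduler masquerading as a hook.
    def on_phase_end(self, task):
        val_loss = task.losses[-1]
        for group in task.optimizer.param_groups:
            # crude example policy: halve the LR while val loss > 1.0
            if val_loss > 1.0:
                group["lr"] *= 0.5
```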
Let me know what you think!
This is a recurring issue -- thanks for filing this.
Instead of adding a dependency between the task and the scheduler, I wanted to introduce a global object that gives you access to the task currently being trained (we can only have one at a time anyway). That would solve your issue, right?
I guess so... I'm not a fan of global objects, however. I don't see much harm in passing the task reference as an argument, except maybe for compatibility issues.
Would that solution work for custom train scripts other than the provided classy_train?
Maybe the trainer can set the global variable and have an importable function that gets it? Something similar to what flask does
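That flask-style pattern could be as small as the following sketch (module and function names are hypothetical):

```python
# Hypothetical module: the trainer sets a module-level reference at
# startup; user code imports the getter, mirroring flask's current_app.
_current_task = None

def set_current_task(task):
    """Called by the trainer (classy_train or a custom script)."""
    global _current_task
    _current_task = task

def get_current_task():
    """Importable anywhere, e.g. inside a parameter scheduler."""
    if _current_task is None:
        raise RuntimeError("No task is currently being trained")
    return _current_task
```

Custom train scripts would then also work, as long as they call the setter before training starts.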