
[FEAT] Add example of modifying the default configure_optimizers() behavior (use of ReduceLROnPlateau scheduler) #1015

Open · JQGoh wants to merge 21 commits into main from feat/modify-config-optimizers

Conversation

@JQGoh (Contributor) commented May 23, 2024

Rationale

  • Users have asked whether they can modify the default learning rate scheduler behavior. See How to Use Different Optimizers with NeuralForecast Models #852 (comment) and https://nixtlacommunity.slack.com/archives/C031M8RLC66/p1713516602654109
  • This provides an example of how to take full control of the configure_optimizers() behavior and how to use the ReduceLROnPlateau scheduler during optimization (a minimal sketch follows below).
  • Instead of adding arguments to every model and passing them through the NeuralForecast class, we introduce a set_configure_optimizers function on the BaseModel class so that an individual model can override the default configure_optimizers() behavior.
  • Note that users must specify both the optimizer and the scheduler for the configuration to take effect; otherwise it falls back to the default option.

This supersedes the earlier work introduced in #998.
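For context, here is a minimal sketch of what full control over configure_optimizers() enables, wiring ReduceLROnPlateau to a monitored metric via the standard PyTorch Lightning return format. The model class, learning rate, scheduler settings, and the metric name "train_loss" are illustrative assumptions, not this PR's exact API:

```python
import torch
import pytorch_lightning as pl


class ModelWithPlateau(pl.LightningModule):
    # Illustrative LightningModule; NeuralForecast models expose the
    # same hook because they subclass pl.LightningModule.
    def configure_optimizers(self):
        optimizer = torch.optim.Adam(self.parameters(), lr=1e-3)
        # ReduceLROnPlateau lowers the learning rate when a monitored
        # metric stops improving, which is why it needs the extra
        # "monitor" entry that the default configuration cannot express.
        scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
            optimizer, mode="min", factor=0.5, patience=2
        )
        return {
            "optimizer": optimizer,
            "lr_scheduler": {
                "scheduler": scheduler,
                "monitor": "train_loss",  # assumed to be logged during training
            },
        }
```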


@jmoralez (Member) commented:

Hey @JQGoh, thanks a lot for trying this approach! Although I think it's fine to add more arguments for now, we can give this a shot when we work towards v2.0.

@JQGoh (Contributor, Author) commented May 23, 2024

> Hey @JQGoh, thanks a lot for trying this approach! Although I think it's fine to add more arguments for now, we can give this a shot when we work towards v2.0.

In that case, I'll put this on hold for now.

@JQGoh JQGoh force-pushed the feat/modify-config-optimizers branch from 3924876 to eca5d9b Compare December 11, 2024 17:53
(commits: `nbdev_clean --clear_all`; remove unnecessary changes)
@JQGoh JQGoh force-pushed the feat/modify-config-optimizers branch from eca5d9b to 2e3d7c6 Compare December 11, 2024 17:57
@JQGoh JQGoh marked this pull request as ready for review December 11, 2024 18:51
@JQGoh (Contributor, Author) commented Dec 11, 2024

@jmoralez @elephaint @marcopeix
Please review. After the first round of review, I can improve it by raising a warning whenever set_configure_optimizers falls back to the default option.

@JQGoh (Contributor, Author) commented Dec 13, 2024

This breaking change should fix #1096.

@@ -69,7 +69,7 @@ nbdev_export
If you're working on the local interface you can just use `nbdev_test --n_workers 1 --do_print --timing`.

### Cleaning notebooks
Since the notebooks output cells can vary from run to run (even if they produce the same outputs) the notebooks are cleaned before committing them. Please make sure to run `nbdev_clean --clear_all` before committing your changes. If you clean the library's notebooks with this command please backtrack the changes you make to the example notebooks `git checkout nbs/examples`, unless you intend to change the examples.
@JQGoh (Contributor, Author) commented on the diff:

Revise this outdated path.

@JQGoh (Contributor, Author) commented Dec 22, 2024

@jmoralez I have modified it based on your suggestion.

@JQGoh (Contributor, Author) commented Jan 28, 2025

@jmoralez

Thanks for your tips on using a subclass approach to customize the default optimizer behavior. Moving forward, I see a few practical ways to support NeuralForecast users here.

Options

  1. We keep the existing parameters optimizer, optimizer_kwargs, lr_scheduler, lr_scheduler_kwargs. For users who want ReduceLROnPlateau, which requires customizing the monitor parameter within the custom optimizer configuration, we educate them by demonstrating the subclass approach in the documentation. See https://gist.github.com/JQGoh/1f9fe1613bdc6380b4c68afddec00b91 for an example, and the sketch after this list.

  2. We deprecate the existing parameters optimizer, optimizer_kwargs, lr_scheduler, lr_scheduler_kwargs and introduce a new parameter config_optimizers that lets users pass a function customizing the optimization behavior; see commit 3bab7c0.

  3. We deprecate the existing parameters optimizer, optimizer_kwargs, lr_scheduler, lr_scheduler_kwargs without introducing any new parameter. This requires users to customize the optimizer behavior via the subclass approach: https://gist.github.com/JQGoh/1f9fe1613bdc6380b4c68afddec00b91

Options 2 and 3 introduce a breaking change and would require updating NeuralForecast to version 3.0.0. Personally, I prefer option 1, but I would like to hear the Nixtla team's feedback on this.
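For reference, a minimal sketch of option 1's subclass approach, modeled on the linked gist. NHITS, the learning rate, the scheduler settings, and the monitored metric name are illustrative assumptions:

```python
import torch
from neuralforecast import NeuralForecast
from neuralforecast.models import NHITS


class NHITSWithPlateau(NHITS):
    """NHITS variant overriding configure_optimizers() so that
    ReduceLROnPlateau can monitor the training loss."""

    def configure_optimizers(self):
        optimizer = torch.optim.Adam(self.parameters(), lr=1e-3)
        scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
            optimizer, mode="min", factor=0.5, patience=2
        )
        return {
            "optimizer": optimizer,
            "lr_scheduler": {
                "scheduler": scheduler,
                "monitor": "train_loss",  # assumed metric name logged by the model
            },
        }


# The subclass drops into the usual workflow like any built-in model.
nf = NeuralForecast(models=[NHITSWithPlateau(h=12, input_size=24)], freq="M")
```

Because each NeuralForecast model is a pl.LightningModule, the trainer picks up the override without any other change to the library.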

@jmoralez (Member) commented:

Thanks. Option 1 sounds good.

@JQGoh JQGoh changed the title [FEAT] Add option to modify the default configure_optimizers() behavior [FEAT] Add example of modifying the default configure_optimizers() behavior (use of ReduceLROnPlateau scheduler) Jan 28, 2025
@JQGoh (Contributor, Author) commented Jan 28, 2025

@jmoralez please review. If you agree that this resolves #1096, please help close it.
