How to Use Different Optimizers with NeuralForecast Models #852
Comments
Adam is the optimizer that is already being used.
How do I use an optimizer other than Adam?
Hi @Amirh63. There is currently no option to change the optimizer through the hyperparameters. We fixed the optimizer to Adam, given that all the papers proposing these models use it. But this would be a great addition, so we will add it to our list of future improvements. In the meantime, you can change it manually by cloning the repo and editing line 181 of _base_windows.py:
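The code at that line is not reproduced in this thread. As a minimal sketch of the idea (the exact contents of line 181 are an assumption, and NHITS is just one example model), the same change can also be made without editing the library, by overriding the Lightning `configure_optimizers` hook in a subclass:

```python
import torch
from neuralforecast.models import NHITS

class NHITSWithSGD(NHITS):
    """Illustrative subclass: NeuralForecast models are PyTorch Lightning
    modules, so the optimizer choice lives in configure_optimizers."""

    def configure_optimizers(self):
        # Swap the hard-coded Adam for any torch.optim optimizer.
        # Note: the library's own hook may also attach an LR scheduler;
        # this minimal override drops it for simplicity.
        return torch.optim.SGD(self.parameters(), lr=1e-3, momentum=0.9)
```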
@quest-bot stash 200
Solution:
Hi @cchallu, can you make the scheduler an option too?
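For reference, a sketch of the standard Lightning pattern that would make the scheduler configurable as well; the StepLR choice and its step_size/gamma values here are illustrative, not the library's actual defaults:

```python
import torch

# Sketch of a hook meant to live on the model class, alongside the
# optimizer change above.
def configure_optimizers(self):
    optimizer = torch.optim.Adam(self.parameters(), lr=self.learning_rate)
    # Lightning accepts a dict pairing the optimizer with an LR scheduler,
    # so exposing the scheduler as a hyperparameter would slot in here.
    scheduler = {
        "scheduler": torch.optim.lr_scheduler.StepLR(
            optimizer, step_size=100, gamma=0.5
        ),
        "interval": "step",
        "frequency": 1,
    }
    return {"optimizer": optimizer, "lr_scheduler": scheduler}
```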
@quest-bot embark |
Hi @cchallu, if you agree that …
@quest-bot reward @JQGoh
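With the quest rewarded, the feature presumably landed. Assuming it took the form of an optimizer/optimizer_kwargs pair of constructor arguments (the parameter names here reflect that assumption, not confirmed wording from this thread), usage would look roughly like:

```python
import torch
from neuralforecast import NeuralForecast
from neuralforecast.models import NHITS

# Assumed API: pass the optimizer class and its kwargs to the model
# constructor instead of patching _base_windows.py.
model = NHITS(
    h=12,
    input_size=24,
    max_steps=100,
    optimizer=torch.optim.SGD,
    optimizer_kwargs={"lr": 1e-3, "momentum": 0.9},
)
nf = NeuralForecast(models=[model], freq="M")
```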
Description
Greetings Gentlemen,
I am currently working with the NeuralForecast library and am specifically interested in training models using the Adam optimizer. I've gone through the documentation but haven't found clear guidance on how to integrate a custom optimizer, like Adam, into the training process.
Could you please provide some insights or examples on how to set up the Adam optimizer for training NeuralForecast models? I'm particularly interested in any necessary configurations or modifications needed within the library's framework to achieve this.
Additionally, if there are any best practices or recommendations for using custom optimizers with NeuralForecast, that information would be greatly appreciated.
Thank you for your time and assistance.
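For completeness, a minimal training sketch: Adam is already the default, so no extra optimizer configuration is needed to train with it. NHITS and the bundled AirPassengersDF dataset are used only as illustrative choices:

```python
from neuralforecast import NeuralForecast
from neuralforecast.models import NHITS
from neuralforecast.utils import AirPassengersDF  # small bundled example dataset

# Adam is used under the hood by default; learning_rate tunes its step size.
nf = NeuralForecast(models=[NHITS(h=12, input_size=24, max_steps=50)], freq="M")
nf.fit(df=AirPassengersDF)
forecasts = nf.predict()
```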
Link: No response