ValueError: Found `optimizer` configured in the DeepSpeed config, but no `scheduler`. Please configure a scheduler in the DeepSpeed config. #24359

Comments
Hi @luohao123, so that we can help you, could you follow the issue template and provide a minimal code snippet to reproduce the error, along with your running environment? cc @pacman100
TL;DR: if you're in a rush, downgrading to version

I've had the same issue 👇, i.e. whereas before you could just ignore the first column and leave it blank to get the same result. Personally, I found it handier before, when I only had to specify the scheduler in one place rather than keeping it in sync across a DeepSpeed config and a Trainer config, which are generally separate objects.
Hello, the supported combinations now are:

@luohao123, the case you want, DeepSpeed Optimizer + Trainer Scheduler, isn't supported now. The suggested approach in your case would be to use

Hope this helps.
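For context on the supported combinations discussed above: the validation described in this issue's title requires that a DeepSpeed config containing an `optimizer` block also contain a `scheduler` block. Below is a minimal sketch of a config that satisfies it, assuming the `auto` placeholder convention from the official Transformers DeepSpeed examples (the specific parameter values are illustrative, not taken from this thread):

```python
import json

# Sketch of a DeepSpeed config where both "optimizer" and "scheduler"
# are configured (DeepSpeed Optimizer + DeepSpeed Scheduler, one of the
# supported combinations). "auto" values are filled in by the HF Trainer
# from TrainingArguments at runtime.
ds_config = {
    "optimizer": {
        "type": "AdamW",
        "params": {
            "lr": "auto",
            "betas": "auto",
            "eps": "auto",
            "weight_decay": "auto",
        },
    },
    "scheduler": {
        "type": "WarmupDecayLR",  # a DeepSpeed built-in scheduler type
        "params": {
            "warmup_min_lr": "auto",
            "warmup_max_lr": "auto",
            "warmup_num_steps": "auto",
            "total_num_steps": "auto",
        },
    },
    "zero_optimization": {"stage": 2},
}

# Mimic the shape of the check that raises the ValueError in this issue:
# an "optimizer" block without a "scheduler" block is rejected.
if "optimizer" in ds_config and "scheduler" not in ds_config:
    raise ValueError(
        "Found `optimizer` configured in the DeepSpeed config, "
        "but no `scheduler`."
    )

print(json.dumps(ds_config, indent=2))
```

The small check at the end only mirrors the error message quoted in this issue; the real validation lives inside the Transformers/Accelerate DeepSpeed integration.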
@pacman100 I actually got some errors when specifying a cosine scheduler via TrainingArguments while not specifying one in the DeepSpeed config:

which is not right. This is on an A100; can you take a look? This is my ds config:

This is my training args:

What did I do wrong?
Hello @luohao123, please provide a minimal reproducible example for a further deep dive. Things work fine for me with the official example.

ds config:

Command:

Output logs:
@pacman100 thank you, let me try your config and test again. I notice your config isn't exactly the same as mine.
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread. Please note that issues that do not follow the contributing guidelines are likely to be ignored.
Hi, I want to know: if I use setting 1, will the optimizer utilize DeepSpeed's CPUAdam?

Yes, by default.
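To make the CPUAdam question above concrete: as I understand it (an assumption based on DeepSpeed's documented behaviour, not stated verbatim in this thread), DeepSpeed swaps in its DeepSpeedCPUAdam kernel when the optimizer state is offloaded to CPU. A sketch of a config that would enable that path:

```python
# Sketch: with ZeRO and optimizer offload to CPU, DeepSpeed typically
# replaces the configured Adam/AdamW with DeepSpeedCPUAdam (assumption
# based on DeepSpeed's documented offload behaviour). The lr value here
# is purely illustrative.
ds_config = {
    "optimizer": {"type": "AdamW", "params": {"lr": 2e-5}},
    "zero_optimization": {
        "stage": 2,
        "offload_optimizer": {"device": "cpu", "pin_memory": True},
    },
}

# A simple predicate mirroring the offload condition.
offload = ds_config["zero_optimization"].get("offload_optimizer", {})
uses_cpu_adam = offload.get("device") == "cpu"
print(uses_cpu_adam)
```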
I'm trying to use the DeepSpeed optimizer + Trainer scheduler because DeepSpeed has the best optimizer (fused Adam) and Trainer has the best scheduler for my use case (cosine); DeepSpeed does not support cosine. Why was this combination removed?
Hello @michaelroyzen, the PRs #25863 and huggingface/accelerate#1909 should bring back the support for this combination.
Seems to work well so far @pacman100. Thanks!
Hi @pacman100, I still observe the following error, despite using the

Hello @awasthiabhijeet, it should be part of the latest release, could you recheck it?

Thanks, @pacman100 :)
ValueError: Found `optimizer` configured in the DeepSpeed config, but no `scheduler`. Please configure a scheduler in the DeepSpeed config.

I am using:

--warmup_ratio 0.03 --lr_scheduler_type "cosine" \

here, and I didn't find a scheduler in DeepSpeed that properly matches cosine. What should I set?
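One possible workaround, which is my own assumption rather than a maintainer recommendation: since the DeepSpeed scheduler types don't include a `cosine` entry, the built-in `WarmupDecayLR` (linear decay after warmup) is a close substitute, and `--warmup_ratio` has to be converted into an absolute step count, since the DeepSpeed scheduler block takes step counts rather than a ratio:

```python
# Sketch: approximating `--warmup_ratio 0.03` for a DeepSpeed scheduler
# block. The total step count and peak learning rate are hypothetical
# values for illustration only.
total_num_steps = 10000          # hypothetical total training steps
warmup_ratio = 0.03              # from the command line in this thread
warmup_num_steps = round(total_num_steps * warmup_ratio)

scheduler_block = {
    # WarmupDecayLR does linear decay after warmup; per this thread,
    # DeepSpeed has no cosine scheduler type.
    "type": "WarmupDecayLR",
    "params": {
        "warmup_min_lr": 0.0,
        "warmup_max_lr": 2e-5,   # hypothetical peak lr
        "warmup_num_steps": warmup_num_steps,
        "total_num_steps": total_num_steps,
    },
}
print(warmup_num_steps)
```

Alternatively, per the maintainer comments above, later releases restored the DeepSpeed Optimizer + Trainer Scheduler combination, which lets `--lr_scheduler_type "cosine"` be used directly.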