
Should we turn off the DDP accelerator when training with 1 GPU? #62

Open

aartykov opened this issue Jan 30, 2023 · 2 comments

Comments

@aartykov

Hi,

Should we set the trainer_config["accelerator"] = "ddp" part to None when training with only 1 GPU?

@WhiteZz1 commented Feb 3, 2023

I ran the main.py script in a single-GPU environment, and the following error occurred:
ValueError: You selected an invalid accelerator name: accelerator='ddp'. Available names are: cpu, cuda, hpu, ipu, mps, tpu.

Are our problems the same? @artykov1511 @justinpinkney

@justinpinkney (Owner)

Hmm, this seems like it might be a PyTorch Lightning version issue? Using ddp should work fine for 1 GPU on the version I used.
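
For anyone hitting that ValueError on a newer PyTorch Lightning release: recent versions split the old option in two, so "ddp" is no longer a valid accelerator name (the accelerator now names the hardware, and the distribution scheme is passed separately as a strategy). Below is a minimal sketch of an adjusted config; trainer_config is the dict mentioned above, and the exact keys and values are illustrative assumptions, not taken from the repo.

import pytorch_lightning as pl

# Older PyTorch Lightning (roughly 1.4 and earlier): "ddp" was a valid
# accelerator name and also ran fine on a single GPU.
# trainer_config["accelerator"] = "ddp"

# Newer releases: the accelerator names the hardware, the strategy names
# the distribution scheme (only needed for more than one device).
trainer_config = {
    "accelerator": "cuda",  # hardware backend (one of the names listed in the error)
    "devices": 1,           # number of GPUs; increase for multi-GPU runs
    # "strategy": "ddp",    # add this when devices > 1
}

trainer = pl.Trainer(**trainer_config)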
