multi GPU, where every gpu is different #1556
Unanswered · draganjovanovich asked this question in Q&A
I tried using axolotl with DeepSpeed to full fine-tune (FFT) Llama 3 8B on a machine that has 1x A100 40GB and 1x 4090 24GB.
It always uses only one GPU. Is there some condition that the GPUs must have equal VRAM, or even be identical models?
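For context, a sketch of how I would expect to check visibility and launch on both GPUs (these exact commands are my assumption of the usual setup, not something axolotl documents for mixed GPUs; `config.yml` is a placeholder for your own config). Note that DeepSpeed generally partitions work uniformly across ranks, so even if both GPUs are used, the 24GB 4090 would likely bound the per-GPU batch size and ZeRO shard sizes:

```shell
# Confirm PyTorch can see both GPUs before blaming the trainer.
python -c "import torch; print(torch.cuda.device_count())"

# Make both devices visible and request two processes explicitly;
# deepspeed config path and yml name here are placeholders.
CUDA_VISIBLE_DEVICES=0,1 accelerate launch --num_processes 2 \
    -m axolotl.cli.train config.yml
```

If `torch.cuda.device_count()` already reports 1, the problem is at the driver/environment level (e.g. a restrictive `CUDA_VISIBLE_DEVICES`) rather than in axolotl or DeepSpeed.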