(sd3-flux) "NotImplementedError: Cannot copy out of meta tensor; no data!" when trying to train LoRA #1454
Comments
Please use the fp16 version of the weights for flux1 dev and t5xxl.
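(If you're unsure which variant a checkpoint actually is, a quick way to check the stored dtype — the filename here is hypothetical, adjust to your local path:)

```python
from safetensors import safe_open

# Hypothetical local filename; point this at your actual checkpoint.
with safe_open("flux1-dev.safetensors", framework="pt", device="cpu") as f:
    name = next(iter(f.keys()))
    # An fp16 checkpoint should report torch.float16 here.
    print(name, f.get_tensor(name).dtype)
```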
Are there any plans to support optimized models in the future?
From my understanding, it is not good to use a quantized model as a base model for training in terms of quality. Currently, the script uses float8_e4m3fn when specifying `--fp8_base`.
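(For anyone curious what that means in practice, here is a minimal sketch of the idea — not sd-scripts' actual implementation: weights are *stored* as float8_e4m3fn to halve memory versus fp16, and upcast to the compute dtype for each matmul.)

```python
import torch
import torch.nn as nn

# Store a layer's weights in float8_e4m3fn (~1 byte/param instead of 2).
layer = nn.Linear(4096, 4096, bias=False)
fp8_weight = layer.weight.detach().to(torch.float8_e4m3fn)

def forward_fp8(x: torch.Tensor) -> torch.Tensor:
    # Upcast the stored fp8 weight to the activation dtype on the fly.
    return x @ fp8_weight.to(x.dtype).t()

x = torch.randn(2, 4096, dtype=torch.bfloat16)
print(forward_fp8(x).shape, fp8_weight.dtype)
```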
So, here are my current settings:
I'm using gobs of system memory, to the point where training a LoRA will take days. Is there anything else I can do to fit into 24 GB of VRAM?
Okay, so I've tried enabling the fp8 base option (both with and without fp16 training, and using both the fp16 and fp8 versions of the model). Memory usage is fine, but when it starts to train, I get this error:
My latest settings are here:
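(For context, the error in the issue title is what PyTorch raises when something tries to copy data out of a tensor on the "meta" device — a tensor that carries shape and dtype but no actual storage, as produced by deferred/low-memory loading paths. A standalone reproduction, independent of sd-scripts:)

```python
import torch

# Meta tensors have metadata only, no storage, so copying their data fails.
t = torch.empty(4, device="meta")
try:
    t.to("cpu")
except NotImplementedError as e:
    print(e)  # Cannot copy out of meta tensor; no data!
```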
Please add …
Same error. Here are my settings, in case I messed something else up while I was fiddling with it:
Apparently somebody had the same error using ComfyUI. I'm downloading different versions of t5 and clip, in case those are the problem. I'll let you know how it goes.
Update: No dice. Got the same error... here are my settings:
Why use "sdpa" instead of "xformers"? Thanks. |
Any chance somebody could post some working 3090/4090 settings so I could just go from there? :)
@envy-ai Have you updated PyTorch to 2.4.0 (and torchvision)? If not, please follow the instructions in the README: https://github.com/kohya-ss/sd-scripts/tree/sd3
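(A quick way to confirm what's actually installed:)

```python
import torch
import torchvision

print(torch.__version__)        # should report 2.4.0 (or e.g. 2.4.0+cu124)
print(torchvision.__version__)  # the matching torchvision release is 0.19.0
```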
Because FLUX.1 models don't support xformers yet. Even if you specify `--xformers`, …
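(For anyone unfamiliar with the name: "sdpa" refers to PyTorch's built-in fused attention, `torch.nn.functional.scaled_dot_product_attention`, so no extra package is needed. A toy call with arbitrary shapes:)

```python
import torch
import torch.nn.functional as F

q = torch.randn(1, 8, 128, 64)  # (batch, heads, sequence, head_dim)
k = torch.randn(1, 8, 128, 64)
v = torch.randn(1, 8, 128, 64)

# Fused scaled dot-product attention, built into PyTorch — no xformers needed.
out = F.scaled_dot_product_attention(q, k, v)
print(out.shape)  # torch.Size([1, 8, 128, 64])
```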
@kohya-ss I tried that just now, and I'm still getting the same error.
Current settings, in case I changed them since the last time:
Please use …
That was it! Thank you, and sorry I missed that the first time around. Not only is it working, it also significantly reduced VRAM usage.