I'm running into an error:
Traceback (most recent call last):
  File "G:\MagicClothing\inference.py", line 38, in <module>
    full_net = ClothAdapter(pipe, args.model_path, device, args.enable_cloth_guidance, False)
  File "G:\MagicClothing\garment_adapter\garment_diffusion.py", line 31, in __init__
    with safe_open(ref_path, framework="pt", device="cpu") as f:
safetensors_rust.SafetensorError: Error while deserializing header: HeaderTooLarge
I also wonder where "cloth_segm.pth" can be downloaded, and which model should be used.
I've downloaded these:
ip-adapter-faceid-plusv2_sdxl_lora.safetensors
ip-adapter-faceid-plus_sd15.bin
ip-adapter-faceid-plusv2_sdxl.bin
ip-adapter-faceid-portrait_sd15.bin
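For what it's worth, HeaderTooLarge from safe_open usually means the .safetensors file on disk isn't actual model weights; a common cause is an incomplete download, a Git LFS pointer file, or an HTML error page saved under the model's filename. Below is a minimal diagnostic sketch you could run on the file that safe_open is loading. The helper name check_safetensors is my own; the format details it relies on are the standard safetensors layout (an 8-byte little-endian header length followed by a JSON header):

import json
import struct
from pathlib import Path

def check_safetensors(path):
    """Sanity-check a .safetensors file. A failed download (LFS pointer,
    HTML error page, truncated file) is what triggers HeaderTooLarge."""
    data = Path(path).read_bytes()
    if len(data) < 8:
        print(f"{path}: only {len(data)} bytes - download likely failed")
        return False
    # First 8 bytes: little-endian u64 giving the JSON header's length.
    (header_len,) = struct.unpack("<Q", data[:8])
    if header_len > len(data) - 8:
        # This is the condition safetensors reports as HeaderTooLarge.
        print(f"{path}: declared header length {header_len} exceeds file size;")
        print("file is probably an LFS pointer or a truncated/HTML download")
        print("first bytes:", data[:64])
        return False
    json.loads(data[8 : 8 + header_len])  # header must parse as JSON
    print(f"{path}: looks like a valid safetensors file")
    return True

check_safetensors("ip-adapter-faceid-plusv2_sdxl_lora.safetensors")

If the check fails, re-downloading the file (or running `git lfs pull` if the repo was cloned with Git LFS) should replace the pointer/partial file with the real weights.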