
Finetune CLIP4STR #28

Open
Dordor333 opened this issue Oct 20, 2024 · 1 comment
@Dordor333

First of all, thank you for your wonderful work.
I was wondering how I can take the pretrained CLIP4STR weights you currently provide and fine-tune them on my own dataset (like PARSeq).
I tried loading clip4str_base16x16_d70bde1f2d.ckpt together with the clip4str_base16x16_d70bde1f2d.bin from open_clip, but I get the error: `loaded state dict contains a parameter group that doesn't match the size of optimizer's group`.
Any help with this would be appreciated.

@mzhaoshuai
Contributor

mzhaoshuai commented Oct 21, 2024

Well, the released ckpt does not contain the optimizer state, so if you directly load the ckpt with the pytorch_lightning-style resume code, it may raise errors.

I think you can manually load the weights from the ckpt after the VL4STR class creates the torch modules, right after this line of code:

self.perm_mirrored = perm_mirrored

Or, after this line of code:
https://github.com/VamosC/CLIP4STR/blob/ef592fed6e6422964cc5e6f792035e47fdc2fe2d/train.py#L69C9-L69C14

Then do something like https://pytorch.org/tutorials/recipes/recipes/warmstarting_model_using_parameters_from_a_different_model.html:
just loop over the loaded state_dict and assign its values to the newly created model parameters with the same keys.
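In case it helps, here is a minimal sketch of that warm-start idea, assuming `model` is the freshly constructed VL4STR / training system object from train.py and that the released ckpt stores the weights under the usual pytorch_lightning "state_dict" key; the variable names and the checkpoint path are placeholders:

```python
import torch

# Load the released checkpoint on CPU; it holds weights but no optimizer state.
ckpt = torch.load("clip4str_base16x16_d70bde1f2d.ckpt", map_location="cpu")

# pytorch_lightning checkpoints keep the model weights under "state_dict";
# fall back to the raw dict in case a plain state_dict was saved.
pretrained_state = ckpt.get("state_dict", ckpt)

model_state = model.state_dict()

# Keep only tensors whose names and shapes match the newly created model,
# so any mismatched parts (e.g. a head sized for a different charset) are skipped.
filtered = {
    k: v for k, v in pretrained_state.items()
    if k in model_state and v.shape == model_state[k].shape
}
model_state.update(filtered)
model.load_state_dict(model_state)

print(f"Warm-started {len(filtered)}/{len(model_state)} tensors from the checkpoint")
```

After this, training proceeds with a fresh optimizer, which avoids the "parameter group doesn't match the size of optimizer's group" error that comes from trying to restore optimizer state that was never saved.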
