First of all, thank you for your wonderful work.
I was wondering how I can take the pretrained CLIP4STR weights you currently provide and fine-tune them on my own dataset (like PARSeq).
I tried loading clip4str_base16x16_d70bde1f2d.ckpt together with clip4str_base16x16_d70bde1f2d.bin from open_clip, but I get the error: "loaded state dict contains a parameter group that doesn't match the size of optimizer's group".
Any help would be appreciated.
The released ckpt does not contain the optimizer state, so directly loading it with the PyTorch Lightning-style code may raise errors.
I think you can manually load the weights from the ckpt after the VL4STR class creates the torch modules, right after this line of code
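A minimal sketch of that manual loading step, assuming `model` is the freshly constructed VL4STR instance (the function name `load_pretrained_weights` is hypothetical, not part of the repo):

```python
import torch

def load_pretrained_weights(model, ckpt_path):
    """Load only the model weights from a Lightning checkpoint,
    skipping the (absent) optimizer state entirely."""
    checkpoint = torch.load(ckpt_path, map_location="cpu")
    # Lightning checkpoints nest the weights under the "state_dict" key;
    # fall back to the raw dict for plain torch.save() files.
    state_dict = checkpoint.get("state_dict", checkpoint)
    # strict=False tolerates layers that were resized for a new charset,
    # and reports what was skipped so you can sanity-check the load.
    missing, unexpected = model.load_state_dict(state_dict, strict=False)
    print("missing keys:", missing)
    print("unexpected keys:", unexpected)
    return model
```

Because the optimizer is created fresh for fine-tuning, this sidesteps the parameter-group size mismatch entirely.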