First of all, thank you for sharing your code. I have successfully reproduced your work on ResNet and DenseNet, but I ran into some trouble reproducing BERT.
I created the conda environment based on requirements.txt, but I cannot import AutoAdapterModel from transformers. However, when I install the library named "adapter-transformers" and import it from adapters, it works.
I downloaded the weights of bert-base and distilbert from Hugging Face (https://huggingface.co/distilbert/distilbert-base-uncased/tree/main). But when I run the code on the MNLI dataset, there is a bug in ./utils/hypernetwork/graph/_named_modules: it always prints that the length of self.model.named_parameters() is not equal to len(modules). I have debugged this issue and found that self.model.named_parameters() always lacks one layer, such as heads.defalut.0.weight or heads.defalut.3.weight, but I cannot fix it. I hope the authors can help me fix it.
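As a minimal sketch of how I located the missing layer (not the repository's actual code; the name collections below are hypothetical stand-ins for self.model.named_parameters() and the modules list in _named_modules), diffing the two sets of names makes the odd entry out easy to see:

```python
# Toy stand-ins for the two name collections compared in _named_modules.
# In the real code these would come from self.model.named_parameters()
# and the modules list; the entries here are hypothetical.
named_params = {
    "encoder.layer.0.weight",
    "encoder.layer.0.bias",
}
modules = {
    "encoder.layer.0.weight",
    "encoder.layer.0.bias",
    "heads.default.0.weight",  # the kind of entry that shows up as missing
}

# Names present in modules but absent from named_parameters (and vice versa).
missing = sorted(modules - named_params)
extra = sorted(named_params - modules)
print("missing from named_parameters:", missing)
print("extra in named_parameters:", extra)
```

Running this kind of diff is how I found that the mismatch always involves one of the heads.* entries.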
Thanks a lot. 🤝🤝🤝