Workaround transformers overwriting model_type when saving dpr models #765

Merged (15 commits) on Jun 9, 2021

Conversation

julian-risch (Member)

For DPR models, transformers overwrites the model_type parameter specified in a model's config when saving the model:
https://github.com/huggingface/transformers/blob/64e78564a519cda2b4408803e2781c604e1e3bdd/src/transformers/configuration_utils.py#L626
For DPR models with a non-BERT tokenizer, e.g., CamembertTokenizer as in haystack issue #1046, this prevents loading the correct tokenizer when the model is loaded again. For example, the model_type camembert is specified in the config when the model is loaded from the model hub, but the model_type dpr is written back on saving.
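A minimal, self-contained demonstration of the behavior (assuming a transformers version with this serialization logic; the instance attribute here stands in for the value loaded from the model's config file):

```python
from transformers import DPRConfig

# The serialized config takes model_type from the config *class*, not the
# instance, so any value loaded from the model's config file is lost on save.
config = DPRConfig()
config.model_type = "camembert"        # stands in for the value read at load time
print(config.to_dict()["model_type"])  # prints "dpr" -- the class attribute wins
```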
To overcome this problem, we set the model_type of transformers.DPRConfig on the fly to the model_type specified in the model's config file.
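A minimal sketch of the workaround (the helper name and its parameters are hypothetical, for illustration only; the actual change is in this PR's diff):

```python
from transformers import DPRConfig

def save_dpr_model(model, save_dir, original_model_type):
    # Hypothetical helper: patch the class attribute to the model_type read
    # from the model's own config file (e.g. "camembert") before saving,
    # then restore the library default ("dpr").
    default_model_type = DPRConfig.model_type
    DPRConfig.model_type = original_model_type
    try:
        model.save_pretrained(save_dir)  # config.json now keeps the original model_type
    finally:
        DPRConfig.model_type = default_model_type
```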

This PR corresponds to haystack PR #1060

julian-risch changed the title from "WIP: Workaround transformers overwriting model_type when saving dpr models" to "Workaround transformers overwriting model_type when saving dpr models" on Jun 7, 2021
julian-risch marked this pull request as ready for review on Jun 7, 2021 21:00
Timoeller (Contributor) left a comment

Nice one.

I like the parametrization and documentation of the tests (since they can be quite confusing).
You even fixed the typo in the docs about cty_encoder : )

I added a comment that I believe would make the solution even nicer. I would be fine with merging now.

Timoeller (Contributor)

❤️ ;)
