Remove quantization related config from dequantized model #34856
Conversation
del model.config.quantization_config
del model.config._pre_quantization_dtype
model.is_quantized = False
The quantizer seems like a better place for this than PreTrainedModel.
Not sure if that's 100% exhaustive, but it was enough to solve my issue.
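For context, here is a minimal sketch of the placement being suggested, with the cleanup living in the quantizer's dequantize path rather than on PreTrainedModel. The `HfQuantizer` class shape and method names below are assumptions for illustration, not the PR's exact diff:

```python
# Sketch only: assumes an HfQuantizer-style base class whose public dequantize()
# wraps a backend-specific _dequantize(); names are illustrative.
class HfQuantizer:
    def dequantize(self, model):
        model = self._dequantize(model)

        # Drop quantization-related state so the returned model no longer
        # advertises itself as quantized (e.g. on save_pretrained round-trips).
        if hasattr(model.config, "quantization_config"):
            del model.config.quantization_config
        if hasattr(model.config, "_pre_quantization_dtype"):
            del model.config._pre_quantization_dtype
        model.is_quantized = False

        return model

    def _dequantize(self, model):
        # Backend-specific weight dequantization (bnb, etc.) goes here.
        raise NotImplementedError
```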
Nice, sounds good to me!
Nice! Could you add a test in the bnb tests to check if it works as expected?
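A rough sketch of the kind of bnb regression test being asked for; the test name, checkpoint, and exact asserts are illustrative assumptions, not the test actually added in the PR:

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig


def test_dequantize_removes_quantization_config():
    # Small checkpoint chosen purely for illustration.
    model = AutoModelForCausalLM.from_pretrained(
        "facebook/opt-125m",
        quantization_config=BitsAndBytesConfig(load_in_8bit=True),
        device_map="auto",
    )

    model.dequantize()

    # The dequantized model should no longer carry quantization metadata.
    assert not hasattr(model.config, "quantization_config")
    assert not hasattr(model.config, "_pre_quantization_dtype")
    assert not model.is_quantized
    # Weights should be plain floating-point tensors again, not bnb int8/4bit params.
    assert model.get_input_embeddings().weight.dtype in (torch.float16, torch.bfloat16, torch.float32)
```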
Makes sense, thanks for updating.
Remove quantization related config from dequantized model (#34856)
* Remove quantization related config from dequantized model
* Fix whitespace
What does this PR do?
Fixes #34847
Before submitting
- Did you read the contributor guideline, Pull Request section?
- Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
- Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
Who can review?
@SunMarc