enable_input_require_grads is called twice with use_gradient_checkpointing=True and bnb quantization; is there any effect?
#1623
-
As in the quantization tutorial, we need to call the following lines to use PEFT with quantization:

    from peft import prepare_model_for_kbit_training
    model = prepare_model_for_kbit_training(model)

Inside this method (Lines 111 to 122 in 8452d71), enable_input_require_grads is called; then almost the same logic is executed again. Does calling it twice have any effect?
Answered by BenjaminBossan, Apr 5, 2024
-
Yes, calling this twice should not change the result.
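To see why a double call is harmless, here is a minimal sketch in plain Python (no torch or transformers required). The names `FakeTensor`, `FakeEmbedding`, and the local `enable_input_require_grads` are illustrative stand-ins for the real library objects, not the actual implementation: the point is only that the hook's effect (setting a requires-grad flag to True) is idempotent, so registering it a second time does not change the final state.

```python
class FakeTensor:
    """Stand-in for a torch tensor carrying a requires_grad flag."""
    def __init__(self):
        self.requires_grad = False


class FakeEmbedding:
    """Stand-in for an embedding module that supports forward hooks."""
    def __init__(self):
        self._hooks = []

    def register_forward_hook(self, fn):
        # Each call appends another hook, mirroring how repeated
        # registration works in torch (hooks are not deduplicated).
        self._hooks.append(fn)

    def forward(self):
        out = FakeTensor()
        for hook in self._hooks:
            hook(self, None, out)
        return out


def enable_input_require_grads(embedding):
    # Illustrative stand-in: register a hook that flips requires_grad
    # on the output, similar in spirit to what transformers does.
    embedding.register_forward_hook(
        lambda module, inputs, output: setattr(output, "requires_grad", True)
    )


emb = FakeEmbedding()
enable_input_require_grads(emb)   # first call, e.g. prepare_model_for_kbit_training
enable_input_require_grads(emb)   # second call, e.g. gradient checkpointing setup

out = emb.forward()
print(len(emb._hooks), out.requires_grad)  # two hooks registered, same final state
```

So the second call does add a second (redundant) hook, but since setting the flag to True twice yields the same result as setting it once, the behavior of the model is unchanged.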
Answer selected by nzw0301