Allow safetensors offload #873
Conversation
Looks easy enough! Just one comment to make sure we add a safetensors check, and it should most likely be added to the test requirements too 😄
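As a hedged sketch of the test-requirements part of that suggestion (the extras layout below is illustrative, not necessarily how accelerate's setup.py is organized):

```python
# setup.py -- illustrative extras layout, names are assumptions
extras = {}
extras["test"] = [
    "pytest",
    "parameterized",
    "safetensors",  # so the offload tests can exercise the new code path
]
```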
src/accelerate/utils/offload.py (outdated)
```python
if weight_info.get("safetensors_file") is not None:
    from safetensors import safe_open
```
We need a guard here for `is_safetensors_available` and raise an error if not :) (aka make one :) )
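A minimal sketch of what that guard could look like, assuming the helper is built on importlib like typical availability checks (the helper name comes from the comment above; its location and the error message are assumptions):

```python
import importlib.util


def is_safetensors_available():
    # Detect the optional dependency without importing it.
    return importlib.util.find_spec("safetensors") is not None


def load_weight(weight_info):
    # weight_info is an entry of the offload index, as in the hunk above.
    if weight_info.get("safetensors_file") is not None:
        if not is_safetensors_available():
            raise ImportError(
                "These offloaded weights require the safetensors library: "
                "`pip install safetensors`."
            )
        from safetensors import safe_open
        ...
```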
Indeed, will clean that up on Monday!
🔥
Great! Thanks for fixing the nits :)
This PR reworks a bit the internals of the offload hooks to allow for the case where the checkpoint is saved with safetensors. In this case, there is no need for a separate save of the weight as a Numpy memory-mapped array, as we can directly access the weight using `safe_open`.

Goes along with huggingface/transformers#20321
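A rough sketch of the two load paths described above (the helper name and the `weight_info` keys are illustrative; the real internals live in src/accelerate/utils/offload.py):

```python
import numpy as np
import torch
from safetensors import safe_open


def load_offloaded_weight(weight_info, weight_name):
    if weight_info.get("safetensors_file") is not None:
        # New path: read the tensor straight from the safetensors checkpoint,
        # no intermediate Numpy memory-mapped copy needed.
        with safe_open(weight_info["safetensors_file"], framework="pt", device="cpu") as f:
            return f.get_tensor(weight_name)
    # Legacy path: the weight was dumped separately as a Numpy memmap.
    array = np.memmap(
        weight_info["filename"],
        dtype=weight_info["dtype"],
        shape=tuple(weight_info["shape"]),
        mode="r",
    )
    return torch.tensor(array)  # copy into a regular in-memory tensor
```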