For xla tensors, use an alternative way to get a unique id #25802
Conversation
Hi @muellerzr, moved huggingface/safetensors#349 to here.
LMK if I should add a test (and in which file); I am not sure if the CI machines have torch_xla installed.
Thanks! Some general feedback on how we check torch_tpu, cc @amyeroberts
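For context, a minimal sketch of the kind of guard being discussed, using the existing `is_torch_tpu_available` helper from `transformers.utils` (the wrapper function name `_use_xla_id` is hypothetical, for illustration only):

```python
import torch

from transformers.utils import is_torch_tpu_available


def _use_xla_id(tensor: torch.Tensor) -> bool:
    # Hypothetical helper: take the XLA code path only when the tensor
    # actually lives on an XLA device and torch_xla is usable.
    return tensor.device.type == "xla" and is_torch_tpu_available()
```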
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint.
Thanks for this PR!
Changes look OK to me. However, it feels contradictory to add a workaround for tensors that don't have storage within a function called id_tensor_storage: this function shouldn't be called on those tensors at all.
@muellerzr @qipeng Could you provide some more context for this change? i.e. is this enabling functionality elsewhere?
@amyeroberts this stems from huggingface/safetensors#349, where we specified that it should be done in transformers.
As long as @amyeroberts is happy with the explanation, LGTM.
Because xla tensors don't have storage.
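Roughly, the patched function takes the shape sketched below: on the XLA branch the unique id comes from torch_xla's private `_XLAC._xla_get_tensor_id` helper instead of a storage pointer (a sketch, assuming `storage_ptr`/`storage_size` from `safetensors.torch` as in the surrounding code, not a verbatim copy of the diff):

```python
from typing import Tuple

import torch
from safetensors.torch import storage_ptr, storage_size

from transformers.utils import is_torch_tpu_available


def id_tensor_storage(tensor: torch.Tensor) -> Tuple[torch.device, int, int]:
    """Unique identifier for a tensor's underlying storage.

    Two tensors sharing the same storage get the same id, which is how
    shared/tied weights are detected before saving.
    """
    if tensor.device.type == "xla" and is_torch_tpu_available():
        # XLA tensors don't have storage, so fall back to an XLA-specific
        # unique id. The tensor lives on an XLA device, so torch_xla must
        # already be importable.
        import torch_xla

        unique_id = torch_xla._XLAC._xla_get_tensor_id(tensor)
    else:
        unique_id = storage_ptr(tensor)

    return tensor.device, unique_id, storage_size(tensor)
```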
Thanks for adding this fix!
For xla tensors, use an alternative way to get a unique id (huggingface#25802)
* For xla tensors, use an alternative way to get a unique id. Because xla tensors don't have storage.
* add is_torch_tpu_available check
What does this PR do?
XLA tensors don't have storage, and attempting to access a tensor's storage raises:
RuntimeError: Attempted to access the data pointer on an invalid python storage.
Repro:
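A minimal repro sketch, assuming torch_xla is installed and an XLA device is reachable; the tensor shape is illustrative (a 10x10 float32 tensor occupies 400 bytes, matching the output below):

```python
import torch
import torch_xla.core.xla_model as xm

from transformers.pytorch_utils import id_tensor_storage

# Any tensor on an XLA device triggers the failure: it has no storage,
# so reading its storage data pointer raises the RuntimeError above.
t = torch.zeros(10, 10, device=xm.xla_device())
print(id_tensor_storage(t))
```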
With this patch it would print out (device(type='xla', index=0), 1, 400).