Fix is_torch_xpu_available for torch < 2.3 #31573
Conversation
@@ -754,11 +754,13 @@ def is_torch_xpu_available(check_device=False):
    if not is_torch_available():
        return False

    import torch

    torch_version = version.parse(_torch_version)
    if is_ipex_available():
        import intel_extension_for_pytorch  # noqa: F401
Previously, if `is_ipex_available` evaluated to `False`, we would return `False`. #31238 enables using PyTorch for XPU as well, if the installed version is >= 2.4. Here, we still return `False` if the installed version is < 2.4 and IPEX is not available in the environment.
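For reference, a minimal, self-contained sketch of the gating logic described above (the `_module_available` helper and the metadata lookup are stand-ins for the library's own `is_torch_available` / `is_ipex_available` / `_torch_version`; the `check_device` branch is an approximation, not the exact merged code):

```python
import importlib.metadata
import importlib.util

from packaging import version


def _module_available(name):
    # Stand-in for the library's is_*_available helpers.
    return importlib.util.find_spec(name) is not None


def is_torch_xpu_available_sketch(check_device=False):
    """Sketch of the intended behaviour, not the exact library code."""
    if not _module_available("torch"):
        return False

    torch_version = version.parse(importlib.metadata.version("torch"))
    if _module_available("intel_extension_for_pytorch"):
        # IPEX provides the XPU backend on older torch versions.
        import intel_extension_for_pytorch  # noqa: F401
    elif (torch_version.major, torch_version.minor) < (2, 4):
        # Without IPEX, stock PyTorch only supports XPU from 2.4 onward,
        # so bail out here instead of touching torch.xpu (it may not exist).
        return False

    import torch

    if check_device:
        try:
            # Raises RuntimeError when no XPU driver/device is present.
            _ = torch.xpu.device_count()
            return torch.xpu.is_available()
        except RuntimeError:
            return False

    return hasattr(torch, "xpu") and torch.xpu.is_available()
```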
Thanks.
This could also be fixed by changing `except RuntimeError` to `except (RuntimeError, AttributeError)`, but both work for me, so I'll leave you to make the call.
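For comparison, a sketch of that alternative (illustrative only; the function name is made up). The widened clause would also catch the `AttributeError` raised when the installed torch has no `torch.xpu` module at all:

```python
import torch


def xpu_present_via_exceptions():
    # Alternative to version gating: widen the exception handling instead.
    # On torch builds without a torch.xpu module, the attribute access raises
    # AttributeError; with the module present but no usable device, the
    # existing code already expected a RuntimeError. Either way, treat it
    # as "no XPU".
    try:
        return torch.xpu.device_count() > 0 and torch.xpu.is_available()
    except (RuntimeError, AttributeError):
        return False
```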
I'll leave it like this, as it makes the intended behaviour a bit more explicit imo :)
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
😰
What does this PR do?
A recent PR #31238 updated the `is_torch_xpu_available` check to reflect XPU support in PyTorch >= 2.4. However, this broke backwards compatibility with versions of PyTorch < 2.4 when `is_ipex_available` is `False` and `check_device=True`.
Technically, this only fails if the installed torch is <= 2.2; torch 2.3 will still pass, as it already ships the `torch.xpu` module. However, since the intended logic is to fall back to the stock PyTorch path only for >= 2.4, and the check previously evaluated to `False` for 2.3, gating on 2.4 is the more backwards-compatible choice.
Fixes #31563
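A quick, purely illustrative way to see which case a given environment falls into:

```python
import torch

# Illustrative check of the failure mode described above: on torch <= 2.2 there
# is no torch.xpu attribute, so the old check_device=True path raised
# AttributeError instead of returning False; torch 2.3 already exposes torch.xpu.
print("torch version:", torch.__version__)
print("has torch.xpu:", hasattr(torch, "xpu"))
```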