In PyTorch you can convert a model to fp16 with a module.half() call. I think it should be called before converting to TorchScript. See the docs: https://pytorch.org/docs/stable/generated/torch.nn.Module.html. I think it should be a fairly simple implementation.
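A minimal sketch of this idea (the model here is illustrative, not from the issue): cast parameters and buffers to fp16 with .half(), then convert to TorchScript. Using torch.jit.script rather than torch.jit.trace avoids executing the model, so it works even on CPUs lacking native fp16 kernels.

```python
import torch
import torch.nn as nn

# Illustrative model, not from the original issue.
model = nn.Sequential(nn.Linear(8, 4), nn.ReLU()).eval()

# Cast parameters and buffers to torch.float16 before scripting.
model = model.half()

# Convert to TorchScript; script() compiles without running forward.
scripted = torch.jit.script(model)

print(next(scripted.parameters()).dtype)  # torch.float16
```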
Yes, this is expected. Since x86 doesn't support fp16 natively, nobody has cared about this on CPU. However, Altra supports it natively, so once eager mode is implemented I expect it to work.