Observed with ROCm 3.3 on Radeon VII (Vega20): many test cases consistently fail with "Forward Inference RNN vanilla: Output tensor output failed verification."
A workaround was introduced in ea710fe and is still in place.
Let's revert the workaround, check whether the problem still persists, and provide a proper fix.
Please find more info at #240 (comment)
Commit f7f7ccd: remove W/A mentioned in #1177
@atamazov Is this ticket still relevant? Thanks!
@ppanchad-amd I think so.