
Properly fix W/A from #240, commit ea710fe - The "Forward Inference RNN vanilla" sub-test fails too often #1177

Open
Tracked by #1147
atamazov opened this issue Sep 24, 2021 · 3 comments


@atamazov (Contributor) commented:

Observed with ROCm 3.3 on Radeon VII (Vega20): many test cases always fail with "Forward Inference RNN vanilla: Output tensor output failed verification."

A workaround was introduced in ea710fe and is still in place.

Let's revert the W/A, check whether the problem still persists, and provide a proper fix.

@atamazov (Contributor, Author) commented:

Please find more info at #240 (comment)

junliume added a commit that referenced this issue Oct 15, 2021
@ppanchad-amd commented:

@atamazov Is this ticket still relevant? Thanks!

@atamazov (Contributor, Author) commented:

@ppanchad-amd I think so.
