Some Focus Area percentages on dashboard don't match the results percentage #4107
Comments
Assign @DanielRyanSmith just so this issue has an owner. Feel free to discuss this in our meeting!
Any movement on this @DanielRyanSmith? Thanks.
Edit: The hypotheses below turned out to be incorrect, but I'll leave this comment around in case anyone is interested in the steps I took to investigate.

Sorry, I missed this until now. I'm not immediately sure what is happening here, since Safari does not have any flaky tests that might cause this (that was my initial guess). I see Safari with a 99.7% on the dashboard and a 99.9% on the results page. If I had to guess again, it might be that some tests/subtests under the label "interop-2024-accessibility" were added or removed at some point during the year, and the interop score aggregation script is using an incorrect number of tests to calculate the passing percentage.

Edit: Looking at runs from March onward, it does seem that the number of subtests related to interop-2024-accessibility shrinks from 1133 down to 1095, then climbs back up to 1112. I have to imagine this is the problem with the dashboard's score: it's probably using tests/subtests that have been removed from the category when aggregating score totals.
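For illustration, here's a minimal sketch (not wpt.fyi code) of why that hypothesis would matter if it were true: dividing by subtest totals taken from different points in the year shifts the percentage noticeably. The totals come from the comment above; the passing count is made up.

```python
# Sketch of the (later-disproven) stale-denominator hypothesis: the same
# passing count divided by totals observed at different times yields
# noticeably different percentages.
passing = 1092  # hypothetical number of passing subtests

for total in (1133, 1095, 1112):  # subtest totals observed over the year
    print(f"{total} subtests -> {passing / total * 100:.1f}%")
# 1133 subtests -> 96.4%
# 1095 subtests -> 99.7%
# 1112 subtests -> 98.2%
```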
Thanks for digging in... The total number has fluctuated a bit as invalid or disputed tests were removed or modified. There were quite a few that were erroneously added midyear too, and if any of those caused new failures in the engines, we moved some of those to …
So I took a deeper look into this and it turns out that the interop dashboard score is definitely the more correct score, and the discrepancy here is due to how the rounding was handled on the test results page. I've made an extremely simple fix at #4142, which has an explanation of the problem.
@DanielRyanSmith The original cases linked above still show a discrepancy, albeit a smaller one: 99.7 vs. 99.8 instead of the original 99.7 vs. 99.9.
I don’t have a way to reopen the issue. Would you prefer a new one?
Hmm, this is still the same issue, but my quick change did not match the approach of the interop score aggregation script, which does no rounding; that unrounded value is the intended result. The test results page is still rounding up, just a little less than before. I'll readjust things, sorry about that! The dashboard score is still correct here.
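To make the rounding difference concrete, here's a minimal sketch (not the actual wpt.fyi code) of the two display strategies described above; the raw fraction is made up.

```python
import math

raw_fraction = 0.99784  # hypothetical aggregate pass fraction for a focus area

# Aggregation-script style: no rounding, effectively truncating extra digits.
truncated = math.floor(raw_fraction * 1000) / 10
print(truncated)   # 99.7 -- matches the dashboard

# Rounding up at display time overstates the score slightly.
rounded_up = math.ceil(raw_fraction * 1000) / 10
print(rounded_up)  # 99.8 -- what the results page was still showing
```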
As a follow-up, this is now in production and the Accessibility scores on the results page look as expected. 🙂 Note that score discrepancies might pop up again at some point; the likely causes have been documented in these issues:
However, the cause of the original situation described in this issue has been fixed. Thanks for your patience as I investigated here 😊
Some Focus Area percentages on dashboard don't match the results percentage
For example, the dashboard lists the Accessibility Focus Area at 97.8% and 99.7% for Firefox and Safari, respectively…
But if you click through to the Focus Area results, the scores are 98% and 99.9%. (Bottom row, right.)
I haven't dug into which is correct, but it seems clear one of them is wrong; the percentages should be consistent between the dashboard and the results page.