Bump Jasmine timeout to hopefully make CI more stable. #7713
Conversation
Thanks for the pull request @mramato!
Reviewers, don't forget to make sure that:
@OmarShehata these all look like legitimate test failures in blob-related code (which I know changed slightly recently with the image bitmap work). Can you take a look?
I could not reproduce this locally with any browser, and it was also fixed by a Travis restart. This is strange because it's not that the image is failing to load, it's that a 16x16 image is loading instead of the 1x1 image requested in that test. Most of our tests here use either 16x16 or 1x1 images, so perhaps some kind of obscure race condition? I can't think of what bizarre scenario would cause independent promises to interfere like that.
I understand that, but there's clearly something going on or the test wouldn't have failed.
This is true. It's different from other erratic failures in that it's not a timeout. My other guess was related to the first image fetch, which requests a 1x1 image as a test. I've posted a note here so we can look into it more: #7249 (comment). I don't think this is caused by this PR.
@OmarShehata the above tests failed again in an unrelated PR: https://travis-ci.org/AnalyticalGraphicsInc/cesium/builds/517293290?utm_source=github_status&utm_medium=notification There's definitely something going on in those tests that the build server doesn't like.
Thanks, I noted the exact failing tests in the erratic test failure issue. We have test randomization disabled, right? Do the tests run in parallel?
Randomization is disabled.
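For context, this is not the project's actual configuration, but a minimal sketch of how Jasmine spec randomization is commonly disabled when running under Karma, assuming karma-jasmine's `client.jasmine` options are used:

```js
// karma.conf.js (illustrative sketch, not Cesium's actual config)
module.exports = function (config) {
  config.set({
    frameworks: ['jasmine'],
    client: {
      jasmine: {
        // Run specs in declaration order rather than a shuffled order,
        // so any ordering-dependent failure is reproducible run to run.
        random: false
      }
    }
  });
};
```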
Thanks again for your contribution @mramato! No one has commented on this pull request in 30 days. Maintainers, can you review, merge, or close to keep things tidy? I'm going to re-bump this in 30 days. If you'd like me to stop, just comment with
We are seeing a lot of random CI failures, so I'm going to merge this just to rule out timeout-related issues (Travis itself seems to have gotten slower and more prone to performance problems in the last few months).
I suspect a lot of the random CI timeouts we've been seeing are just Travis being overly slow. This change rules out that possibility by raising the timeout to 30 seconds. It may result in some long-running passing tests, but that's a much better problem to have than constant random CI failures (and the long-test reporter should point them out).
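For reference, the usual way to raise Jasmine's async timeout is to bump `jasmine.DEFAULT_TIMEOUT_INTERVAL` in a spec helper that loads before the suites. The following is a sketch of that approach, not the exact diff in this PR, and the async helper shown in the usage example is hypothetical:

```js
// specHelper.js (illustrative sketch; the actual change in this PR may differ)
// Raise Jasmine's default async timeout from 5 seconds to 30 seconds so that
// slow CI machines don't turn long-but-passing specs into spurious failures.
jasmine.DEFAULT_TIMEOUT_INTERVAL = 30 * 1000;

// Individual specs can still override the timeout per test via the optional
// third argument to `it`. `loadLargeAsset` is a hypothetical async helper
// used only for illustration.
it('loads a large asset', function (done) {
  loadLargeAsset().then(done).catch(done.fail);
}, 60 * 1000);
```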