
Bump Jasmine timeout to hopefully make CI more stable. #7713

Merged: 1 commit merged into master from ci-timeout on Aug 15, 2019

Conversation

@mramato (Contributor) commented Apr 4, 2019

I suspect a lot of the random CI timeouts we've been seeing are just Travis being overly slow. This removes that possibility by increasing the timeout to 30 seconds. It may mean some slow tests pass unnoticed, but that's a much better problem to have than constant random CI failures (and the long-test reporter should point them out anyway).
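The diff itself isn't shown in this thread, but raising Jasmine's per-spec timeout usually comes down to a one-liner in a spec helper that runs before the suites. A minimal sketch, assuming such a helper file (the file name is hypothetical; `jasmine.DEFAULT_TIMEOUT_INTERVAL` is Jasmine's real global):

```js
// specHelper.js (hypothetical file name), loaded before any suite runs.
// Jasmine's default per-spec timeout is 5000 ms; raise it to 30 seconds so
// slow CI workers don't fail otherwise-passing asynchronous specs.
jasmine.DEFAULT_TIMEOUT_INTERVAL = 30000;
```

An individual spec can still override this by passing its own timeout as the third argument to `it(description, fn, timeout)`.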

@cesium-concierge

Thanks for the pull request @mramato!

  • ✔️ Signed CLA found.
  • CHANGES.md was not updated.
    • If this change updates the public API in any way, please add a bullet point to CHANGES.md.

Reviewers, don't forget to make sure that:

  • Cesium Viewer works.
  • Works in 2D/CV.
  • Works (or fails gracefully) in IE11.

@mramato (Contributor, Author) commented Apr 4, 2019

@OmarShehata these all look like legitimate test failures in blob-related code (which I know changed slightly recently with the image bitmap work). Can you take a look?

@OmarShehata (Contributor)

I could not reproduce this locally in any browser, and it was fixed by a Travis restart. It's strange because the image isn't failing to load; the test is loading a 16x16 image instead of the 1x1 it requested. Most of our tests here use either 16x16 or 1x1 images, so perhaps it's some kind of obscure race condition? I can't think of what bizarre scenario would cause independent promises to interfere like that.

@mramato (Contributor, Author) commented Apr 4, 2019

I understand that, but there's clearly something going on or the test wouldn't have failed.

@OmarShehata (Contributor)

This is true. It's different from the other erratic failures in that it's not a timeout. My other guess was that the first image fetch retrieves a 1x1 image to test createImageBitmap support, and that could be interfering, but the reported failure was the opposite: the test expected a 1x1 and got a 16x16.

I've posted a note in #7249 (comment) so we can look into it more. I don't think this is caused by this PR.
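For context, here is a minimal sketch of the kind of createImageBitmap feature test described above, assuming the probe loads a tiny image and checks whether the options argument is accepted. The function name, the canvas-generated 1x1 blob, and the specific option checked are illustrative, not Cesium's actual implementation:

```js
// Probe whether createImageBitmap exists and accepts the options we rely on,
// using a 1x1 PNG generated from a canvas as the test image.
function supportsImageBitmapOptions() {
  if (typeof createImageBitmap !== "function") {
    return Promise.resolve(false);
  }
  var canvas = document.createElement("canvas");
  canvas.width = 1;
  canvas.height = 1;
  return new Promise(function (resolve) {
    canvas.toBlob(resolve, "image/png");
  })
    .then(function (blob) {
      // Rejects in browsers that don't support the options argument.
      return createImageBitmap(blob, { imageOrientation: "flipY" });
    })
    .then(function () {
      return true;
    })
    .catch(function () {
      return false;
    });
}
```

If a probe like this runs once up front and its 1x1 request can race with a spec's own 1x1 request, the kind of cross-test interference speculated about above becomes at least conceivable.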

@mramato (Contributor, Author) commented Apr 8, 2019

@OmarShehata the above tests failed again in an unrelated PR: https://travis-ci.org/AnalyticalGraphicsInc/cesium/builds/517293290?utm_source=github_status&utm_medium=notification

There's definitely something going on in those tests that the build server doesn't like.

@OmarShehata (Contributor)

Thanks, I noted the exact failing tests in the erratic-test-failure issue.

We have test randomization disabled, right? Do the tests run in parallel?

@mramato (Contributor, Author) commented Apr 9, 2019

Randomization is disabled.
Tests do not run in parallel.
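
For reference, here is a minimal sketch of how randomization is typically disabled in a Karma + Jasmine setup; the snippet is illustrative, not necessarily Cesium's actual karma.conf.js:

```js
// karma.conf.js (illustrative): karma-jasmine forwards the `jasmine` block
// to the Jasmine runner inside the browser.
module.exports = function (config) {
  config.set({
    frameworks: ["jasmine"],
    client: {
      jasmine: {
        // Run specs in declaration order instead of a shuffled order.
        random: false,
      },
    },
  });
};
```

Karma also loads the entire suite into a single browser page, so within one browser the specs execute sequentially rather than in parallel.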

@cesium-concierge

Thanks again for your contribution @mramato!

No one has commented on this pull request in 30 days. Maintainers, can you review, merge or close to keep things tidy?

I'm going to re-bump this in 30 days. If you'd like me to stop, just comment with @cesium-concierge stop. If you want me to start again, just delete the comment.

3 similar comments from @cesium-concierge

@mramato (Contributor, Author) commented Aug 15, 2019

We are seeing a lot of random CI failures, so I'm going to merge this just to rule out timeout-related issues (Travis itself seems to have gotten slower and more failure-prone in the last few months).

@mramato mramato merged commit 445d157 into master Aug 15, 2019
@mramato mramato deleted the ci-timeout branch August 15, 2019 12:39