
Failing test: X-Pack API Integration Tests.x-pack/test/api_integration/apis/ml/modules/index·ts - apis Machine Learning modules "after all" hook in "modules" #102283

Closed
kibanamachine opened this issue Jun 16, 2021 · 47 comments · Fixed by #102477, #109471 or #119322
Labels: failed-test (A test failure on a tracked branch, potentially flaky-test), :ml

Comments


kibanamachine commented Jun 16, 2021

A test failed on a tracked branch

Error: expected 200 "OK", got 400 "Bad Request"
    at Test._assertStatus (/dev/shm/workspace/parallel/17/kibana/node_modules/supertest/lib/test.js:268:12)
    at Test._assertFunction (/dev/shm/workspace/parallel/17/kibana/node_modules/supertest/lib/test.js:283:11)
    at Test.assert (/dev/shm/workspace/parallel/17/kibana/node_modules/supertest/lib/test.js:173:18)
    at assert (/dev/shm/workspace/parallel/17/kibana/node_modules/supertest/lib/test.js:131:12)
    at /dev/shm/workspace/parallel/17/kibana/node_modules/supertest/lib/test.js:128:5
    at Test.Request.callback (/dev/shm/workspace/parallel/17/kibana/node_modules/supertest/node_modules/superagent/lib/node/index.js:718:3)
    at /dev/shm/workspace/parallel/17/kibana/node_modules/supertest/node_modules/superagent/lib/node/index.js:906:18
    at IncomingMessage.<anonymous> (/dev/shm/workspace/parallel/17/kibana/node_modules/supertest/node_modules/superagent/lib/node/parsers/json.js:19:7)
    at endReadableNT (internal/streams/readable.js:1336:12)
    at processTicksAndRejections (internal/process/task_queues.js:82:21)

First failure: Jenkins Build
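
The failure is a standard supertest status assertion in the FTR test; a minimal sketch of the pattern (the route, payload, and index name below are illustrative assumptions, not the actual test code):

    // Sketch of the supertest call pattern behind the error above; the route
    // and request body are assumptions, not copied from the failing test.
    import supertest from 'supertest';

    const kibana = supertest('http://localhost:5601');

    async function setupModule(moduleId: string) {
      const { body } = await kibana
        .post(`/api/ml/modules/setup/${moduleId}`)
        .set('kbn-xsrf', 'kibana') // Kibana rejects POSTs without this header
        .send({ indexPatternName: 'ft_module_sample_logs' })
        .expect(200); // this assertion threw: got 400 "Bad Request"
      return body;
    }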

@kibanamachine added the failed-test label Jun 16, 2021
@botelastic added the needs-team label Jun 16, 2021

kibanamachine posted ten further comments (plus four collapsed "1 similar comment" duplicates), each reporting:

New failure: Jenkins Build

@jportner added the :ml label Jun 16, 2021
@elasticmachine

Pinging @elastic/ml-ui (:ml)

@botelastic removed the needs-team label Jun 16, 2021
tylersmalley added a commit that referenced this issue Jun 16, 2021

tylersmalley commented Jun 16, 2021

Skipped


master: 3236f3f
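
For reference, skipping a flaky FTR suite in Kibana usually amounts to switching the suite registration to describe.skip with a pointer back to this issue; a sketch of what the skip commit likely changed (the file contents here are assumed, not taken from 3236f3f):

    // x-pack/test/api_integration/apis/ml/modules/index.ts (assumed structure)
    import { FtrProviderContext } from '../../../ftr_provider_context';

    export default function ({ loadTestFile }: FtrProviderContext) {
      // FLAKY: https://github.com/elastic/kibana/issues/102283
      describe.skip('modules', function () {
        loadTestFile(require.resolve('./get_module'));
        loadTestFile(require.resolve('./recognize_module'));
        loadTestFile(require.resolve('./setup_module'));
      });
    }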

@kibanamachine

New failure: Jenkins Build

@kibanamachine reopened this Aug 19, 2021
@kibanamachine

New failure: Jenkins Build

@kibanamachine

New failure: Jenkins Build

@mistic closed this as completed Aug 19, 2021
@kibanamachine reopened this Aug 19, 2021
@kibanamachine reopened this Oct 27, 2021

kibanamachine posted two further comments, each reporting:

New failure: CI Build - 8.0

@joshdover

Appears to be the same production issue with EPR as the one I commented on for this test: #116522 (comment)

kibanamachine posted seven further comments, each reporting:

New failure: CI Build - 8.0

jbudz added a commit that referenced this issue Oct 28, 2021

jbudz commented Oct 28, 2021

skipped

8.0: 714b007

kibanamachine posted three further comments, each reporting:

New failure: CI Build - 8.0


spalger commented Nov 11, 2021

Supposed to be closed by #117519, but wasn't. @pheyos, mind taking a look? It looks like this is still somewhat flaky.

@kibanamachine

New failure: CI Build - 7.15


pheyos commented Nov 22, 2021

In the 8.0 test execution logs we have:

[ERROR][plugins.fleet] '502 Bad Gateway' error response from package registry at https://epr-staging.elastic.co/search?experimental=true

and in the 7.15 one:

[error][fleet][plugins] '502 Bad Gateway' error response from package registry at https://epr-staging.elastic.co/search?experimental=true&kibana.version=7.15.3

While things mostly work fine, it seems there are occasional issues with the Fleet package registry.
I'll see if we can wrap the package version fetching in a retry, so we can ride out short-lived issues with the package registry.
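
A minimal sketch of what such a retry wrapper could look like, assuming a generic async fetch (the helper name, attempt count, and backoff are illustrative, not Fleet's actual implementation):

    // Generic retry helper for transient registry errors (sketch; names are
    // illustrative assumptions, not Fleet's actual API).
    async function withRetry<T>(
      fn: () => Promise<T>,
      attempts = 3,
      delayMs = 1000
    ): Promise<T> {
      let lastError: unknown;
      for (let i = 0; i < attempts; i++) {
        try {
          return await fn();
        } catch (e) {
          lastError = e;
          if (i < attempts - 1) {
            // linear backoff before the next attempt
            await new Promise((resolve) => setTimeout(resolve, delayMs * (i + 1)));
          }
        }
      }
      throw lastError;
    }

    // Hypothetical usage around the fetch that hit the 502:
    // const version = await withRetry(() => fetchLatestPackageVersion('sample_data_logs'));

Given how short-lived the 502s from epr-staging.elastic.co appear to be, even a couple of spaced-out retries would likely be enough to keep the suite green.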

tylersmalley pushed a commit that referenced this issue Dec 7, 2022
Signed-off-by: Tyler Smalley <[email protected]>
tylersmalley pushed a commit that referenced this issue Dec 7, 2022
Signed-off-by: Tyler Smalley <[email protected]>
jbudz pushed a commit that referenced this issue Dec 8, 2022
Signed-off-by: Tyler Smalley <[email protected]>