This repository has been archived by the owner on Sep 17, 2024. It is now read-only.

chore: refactor Fleet upgrade tests (#671) backport for 6.8.x #699

Closed
wants to merge 2 commits

Conversation

mdelapenya
Contributor

Backports the following commits to 6.8.x:

* chore: use nightly annotation for the Upgrade tests

* chore: add two make goals for the nightly use cases

- e2e-fleet-nightly: run the nightly tests without CI snapshots,
downloading the binaries from the official artifactory.
- e2e-fleet-nightly-ci-snapshots: run the nightly tests using the CI
snapshots for a specific Beats commit SHA, downloading them from
a GCP bucket

* chore: bump elastic-agent stale version

* chore: pass version and stale state when creating an installer

This will allow selecting the proper binary, depending on whether we are
using a stale agent or a regular one.

* fix: append snapshot to the stale version when using CI snapshots

* fix: check for version aliases with non-stale versions

* chore: store current agent version in the test suite struct

* chore: make sure the layout is properly created for TAR installer

* chore: add make goals for testing fleet use cases

* chore: move Make goals to the e2e Makefile
@mdelapenya mdelapenya self-assigned this Feb 4, 2021
@elasticmachine
Contributor

💔 Build Failed

The badges below are clickable and redirect to their specific view in the CI or docs.
Pipeline View Test View Changes Artifacts preview


Build stats

  • Build Cause: Pull request #699 opened
  • Start Time: 2021-02-04T16:16:33.054+0000
  • Duration: 5 min 1 sec
  • Commit: 9646b99

Test stats 🧪

  • Failed: 0
  • Passed: 40
  • Skipped: 7
  • Total: 47

Steps errors: 2

Build and test
  • Took 1 min 31 sec
  • Description: .ci/scripts/build-test.sh
Archive the artifacts
  • Took 0 min 0 sec
  • Description: [2021-02-04T16:21:31.384Z] Archiving artifacts script returned exit code 2

Log output (last 100 lines)

[2021-02-04T16:21:18.168Z] PASS config.TestNewConfigPopulatesConfiguration (0.00s)
[2021-02-04T16:21:18.168Z] PASS config
[2021-02-04T16:21:18.429Z] PASS internal.TestClone (0.29s)
[2021-02-04T16:21:18.429Z] PASS internal.TestMkdirAll (0.00s)
[2021-02-04T16:21:18.429Z] PASS internal.TestRecover (0.00s)
[2021-02-04T16:21:18.689Z] PASS internal.TestUpdateCreatesStateFile (0.00s)
[2021-02-04T16:21:18.689Z] PASS internal
[2021-02-04T16:21:18.689Z] PASS cmd.TestSanitizeComposeFile_Multiple (0.00s)
[2021-02-04T16:21:18.689Z] PASS cmd.TestSanitizeComposeFile_Single (0.00s)
[2021-02-04T16:21:18.689Z] PASS cmd
[2021-02-04T16:21:18.689Z] EMPTY docker
[2021-02-04T16:21:18.948Z] PASS services.TestGetBaseURL (0.00s)
[2021-02-04T16:21:18.948Z] PASS services.TestNewClient (0.00s)
[2021-02-04T16:21:18.948Z] PASS services.TestNewKibanaClientWithPathStartingWithSlash (0.00s)
[2021-02-04T16:21:18.948Z] PASS services.TestNewKibanaClientWithPathStartingWithoutSlash (0.00s)
[2021-02-04T16:21:18.948Z] PASS services.TestNewKibanaClientWithMultiplePathsKeepsLastOne (0.00s)
[2021-02-04T16:21:18.948Z] PASS services.TestGetConfigSanitizer (0.00s)
[2021-02-04T16:21:18.948Z] PASS services
[2021-02-04T16:21:18.949Z] 
[2021-02-04T16:21:18.949Z] DONE 28 tests in 7.020s
[2021-02-04T16:21:18.949Z] make: Leaving directory '/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-699/src/github.com/elastic/e2e-testing/cli'
[2021-02-04T16:21:18.949Z] ++ pwd
[2021-02-04T16:21:18.949Z] + GOTESTSUM_JUNITFILE=/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-699/src/github.com/elastic/e2e-testing/outputs/TEST-unit-e2e.xml
[2021-02-04T16:21:18.949Z] + make -C e2e unit-test
[2021-02-04T16:21:18.949Z] make: Entering directory '/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-699/src/github.com/elastic/e2e-testing/e2e'
[2021-02-04T16:21:18.949Z] gotestsum --format testname -- -count=1 -timeout=5m ./...
[2021-02-04T16:21:19.297Z] go: downloading github.com/Jeffail/gabs/v2 v2.5.1
[2021-02-04T16:21:19.297Z] go: downloading github.com/elastic/go-elasticsearch/v8 v8.0.0-20190731061900-ea052088db25
[2021-02-04T16:21:19.297Z] go: downloading gopkg.in/yaml.v3 v3.0.0-20200615113413-eeeca48fe776
[2021-02-04T16:21:19.297Z] go: downloading google.golang.org/genproto v0.0.0-20191108220845-16a3f7862a1a
[2021-02-04T16:21:25.883Z] PASS TestGetBucketSearchNextPageParam_HasMorePages (0.00s)
[2021-02-04T16:21:25.883Z] PASS TestGetBucketSearchNextPageParam_HasNoMorePages (0.00s)
[2021-02-04T16:21:25.883Z] PASS TestProcessBucketSearchPage_PullRequestsFound (0.00s)
[2021-02-04T16:21:25.883Z] PASS TestProcessBucketSearchPage_PullRequestsNotFound (0.00s)
[2021-02-04T16:21:25.883Z] PASS TestProcessBucketSearchPage_SnapshotsFound (0.00s)
[2021-02-04T16:21:25.883Z] PASS TestProcessBucketSearchPage_SnapshotsNotFound (0.00s)
[2021-02-04T16:21:25.883Z] PASS .
[2021-02-04T16:21:25.883Z] 
[2021-02-04T16:21:25.883Z] DONE 6 tests in 6.124s
[2021-02-04T16:21:25.883Z] cd _suites && gotestsum --format testname -- -count=1 -timeout=5m ./...
[2021-02-04T16:21:25.883Z] go: downloading github.com/cucumber/messages-go/v10 v10.0.3
[2021-02-04T16:21:25.883Z] go: downloading github.com/cucumber/godog v0.11.0
[2021-02-04T16:21:25.883Z] go: downloading github.com/gofrs/uuid v3.3.0+incompatible
[2021-02-04T16:21:25.883Z] go: downloading github.com/cucumber/gherkin-go/v11 v11.0.0
[2021-02-04T16:21:25.883Z] go: downloading github.com/hashicorp/go-memdb v1.3.0
[2021-02-04T16:21:25.883Z] go: downloading github.com/hashicorp/go-immutable-radix v1.3.0
[2021-02-04T16:21:25.883Z] go: downloading github.com/hashicorp/golang-lru v0.5.4
[2021-02-04T16:21:27.796Z] # github.com/elastic/e2e-testing/e2e/_suites/fleet [github.com/elastic/e2e-testing/e2e/_suites/fleet.test]
[2021-02-04T16:21:27.796Z] fleet/services_test.go:27:35: not enough arguments in call to downloadAgentBinary
[2021-02-04T16:21:27.796Z] 	have (string, string, string, string, string)
[2021-02-04T16:21:27.796Z] 	want (string, string, string, string, string, bool)
[2021-02-04T16:21:27.796Z] fleet/services_test.go:39:62: not enough arguments in call to downloadAgentBinary
[2021-02-04T16:21:27.796Z] 	have (string, string, string, string, string)
[2021-02-04T16:21:27.796Z] 	want (string, string, string, string, string, bool)
[2021-02-04T16:21:27.796Z] fleet/services_test.go:53:62: not enough arguments in call to downloadAgentBinary
[2021-02-04T16:21:27.796Z] 	have (string, string, string, string, string)
[2021-02-04T16:21:27.796Z] 	want (string, string, string, string, string, bool)
[2021-02-04T16:21:27.796Z] fleet/services_test.go:67:62: not enough arguments in call to downloadAgentBinary
[2021-02-04T16:21:27.796Z] 	have (string, string, string, string, string)
[2021-02-04T16:21:27.796Z] 	want (string, string, string, string, string, bool)
[2021-02-04T16:21:27.796Z] WARN invalid TestEvent: FAIL	github.com/elastic/e2e-testing/e2e/_suites/fleet [build failed]
[2021-02-04T16:21:27.796Z] bad output from test2json: FAIL	github.com/elastic/e2e-testing/e2e/_suites/fleet [build failed]
[2021-02-04T16:21:29.449Z] testing: warning: no tests to run
[2021-02-04T16:21:29.449Z] PASS _suites/helm
[2021-02-04T16:21:29.710Z] testing: warning: no tests to run
[2021-02-04T16:21:29.710Z] PASS _suites/metricbeat
[2021-02-04T16:21:29.710Z] 
[2021-02-04T16:21:29.710Z] === Errors
[2021-02-04T16:21:29.710Z] fleet/services_test.go:27:35: not enough arguments in call to downloadAgentBinary
[2021-02-04T16:21:29.710Z] 	have (string, string, string, string, string)
[2021-02-04T16:21:29.710Z] 	want (string, string, string, string, string, bool)
[2021-02-04T16:21:29.710Z] fleet/services_test.go:39:62: not enough arguments in call to downloadAgentBinary
[2021-02-04T16:21:29.710Z] 	have (string, string, string, string, string)
[2021-02-04T16:21:29.710Z] 	want (string, string, string, string, string, bool)
[2021-02-04T16:21:29.710Z] fleet/services_test.go:53:62: not enough arguments in call to downloadAgentBinary
[2021-02-04T16:21:29.710Z] 	have (string, string, string, string, string)
[2021-02-04T16:21:29.710Z] 	want (string, string, string, string, string, bool)
[2021-02-04T16:21:29.710Z] fleet/services_test.go:67:62: not enough arguments in call to downloadAgentBinary
[2021-02-04T16:21:29.710Z] 	have (string, string, string, string, string)
[2021-02-04T16:21:29.710Z] 	want (string, string, string, string, string, bool)
[2021-02-04T16:21:29.710Z] 
[2021-02-04T16:21:29.710Z] DONE 0 tests, 4 errors in 4.697s
[2021-02-04T16:21:29.710Z] Makefile:92: recipe for target 'unit-test' failed
[2021-02-04T16:21:29.710Z] make: *** [unit-test] Error 2
[2021-02-04T16:21:29.710Z] make: Leaving directory '/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-699/src/github.com/elastic/e2e-testing/e2e'
[2021-02-04T16:21:30.828Z] Post stage
[2021-02-04T16:21:30.853Z] Recording test results
[2021-02-04T16:21:31.357Z] [Checks API] No suitable checks publisher found.
[2021-02-04T16:21:31.384Z] Archiving artifacts
[2021-02-04T16:21:31.602Z] Failed in branch Unit Tests
[2021-02-04T16:21:31.785Z] Stage "Build Docs" skipped due to earlier failure(s)
[2021-02-04T16:21:31.892Z] Stage "End-To-End Tests" skipped due to earlier failure(s)
[2021-02-04T16:21:32.061Z] Stage "Release" skipped due to earlier failure(s)
[2021-02-04T16:21:32.839Z] Running on Jenkins in /var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-699@2
[2021-02-04T16:21:32.950Z] [INFO] getVaultSecret: Getting secrets
[2021-02-04T16:21:33.046Z] Masking supported pattern matches of $VAULT_ADDR or $VAULT_ROLE_ID or $VAULT_SECRET_ID
[2021-02-04T16:21:34.072Z] + chmod 755 generate-build-data.sh
[2021-02-04T16:21:34.073Z] + ./generate-build-data.sh https://beats-ci.elastic.co/blue/rest/organizations/jenkins/pipelines/e2e-tests/e2e-testing-mbp/PR-699/ https://beats-ci.elastic.co/blue/rest/organizations/jenkins/pipelines/e2e-tests/e2e-testing-mbp/PR-699/runs/1 FAILURE 300741
[2021-02-04T16:21:34.323Z] INFO: curl https://beats-ci.elastic.co/blue/rest/organizations/jenkins/pipelines/e2e-tests/e2e-testing-mbp/PR-699/runs/1/steps/?limit=10000 -o steps-info.json
[2021-02-04T16:21:35.234Z] INFO: curl https://beats-ci.elastic.co/blue/rest/organizations/jenkins/pipelines/e2e-tests/e2e-testing-mbp/PR-699/runs/1/tests/?status=FAILED -o tests-errors.json

💚 Flaky test report

Tests succeeded.


Test stats 🧪

  • Failed: 0
  • Passed: 40
  • Skipped: 7
  • Total: 47

@EricDavisX
Contributor

not really needed - the functionality doesn't exist in 6.8 (nor does Agent or Ingest Manager, etc)


@EricDavisX EricDavisX left a comment


suggest just to close it out, not needed

@mdelapenya
Contributor Author

Closing as per Eric's comment. I backported in autopilot...

@mdelapenya mdelapenya closed this Feb 4, 2021
@elasticmachine
Contributor

❕ Build Aborted

Either there was a build timeout or someone aborted the build.

The badges below are clickable and redirect to their specific view in the CI or docs.
Pipeline View Test View Changes Artifacts


Build stats

  • Build Cause: Pull request #699 updated
  • Start Time: 2021-02-04T16:26:19.355+0000
  • Duration: 41 min 24 sec
  • Commit: f745c9c

Test stats 🧪

  • Failed: 0
  • Passed: 130
  • Skipped: 9
  • Total: 139

Log output (last 100 lines)

[2021-02-04T16:49:20.208Z] Removing fleet_package-registry_1 ... 
[2021-02-04T16:49:20.208Z] Removing fleet_elasticsearch_1    ... 
[2021-02-04T16:49:20.468Z] 
Removing fleet_package-registry_1 ... done

Removing fleet_kibana_1           ... done

Removing fleet_elasticsearch_1    ... done
Removing network fleet_default
[2021-02-04T16:49:20.468Z] time="2021-02-04T16:49:20Z" level=debug msg="Docker compose executed." cmd="[down --remove-orphans]" composeFilePaths="[/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-699/.op/compose/profiles/fleet/docker-compose.yml]" env="map[centos_systemdAgentBinarySrcPath:/tmp/elastic-agent-8.0.0-SNAPSHOT-linux-x86_64.tar.gz890210786 centos_systemdAgentBinaryTargetPath:/elastic-agent-8.0.0-SNAPSHOT-x86_64.tar.gz centos_systemdContainerName:fleet_centos-systemd_elastic-agent_1 centos_systemdTag:latest kibanaConfigPath:/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-699/src/github.com/elastic/e2e-testing/e2e/_suites/fleet/configurations/kibana.config.yml stackVersion:8.0.0-SNAPSHOT]" profile=fleet
[2021-02-04T16:49:20.727Z] time="2021-02-04T16:49:20Z" level=debug msg="Elastic Agent binary was removed." installer=centos-systemd-8.0.0-SNAPSHOT path=/tmp/elastic-agent-8.0.0-SNAPSHOT-x86_64.rpm871422671
[2021-02-04T16:49:20.727Z] time="2021-02-04T16:49:20Z" level=debug msg="Elastic Agent binary was removed." installer=centos-tar-8.0.0-SNAPSHOT path=/tmp/elastic-agent-8.0.0-SNAPSHOT-linux-x86_64.tar.gz890210786
[2021-02-04T16:49:20.727Z] time="2021-02-04T16:49:20Z" level=debug msg="Elastic Agent binary was removed." installer=debian-systemd-8.0.0-SNAPSHOT path=/tmp/elastic-agent-8.0.0-SNAPSHOT-amd64.deb520448985
[2021-02-04T16:49:20.777Z] [INFO] Stopping Filebeat Docker container
[2021-02-04T16:49:21.062Z] + docker exec -t 102d7db2825095c98edf8cde8df50b989406328557ff9cd2bc235c26c95776db chmod -R ugo+rw /output
[2021-02-04T16:49:21.321Z] + docker stop --time 30 102d7db2825095c98edf8cde8df50b989406328557ff9cd2bc235c26c95776db
[2021-02-04T16:49:21.579Z] 102d7db2825095c98edf8cde8df50b989406328557ff9cd2bc235c26c95776db
[2021-02-04T16:49:21.595Z] Archiving artifacts
[2021-02-04T16:49:21.655Z] time="2021-02-04T16:49:21Z" level=debug msg="Agent listed in Fleet with online status" agentID=ce027940-6708-11eb-baef-5302176b9cd4 hostname=81eda5ee2014
[2021-02-04T16:49:21.655Z] time="2021-02-04T16:49:21Z" level=warning msg="The Agent is not in the offline status yet" agentID=ce027940-6708-11eb-baef-5302176b9cd4 elapsedTime=50.895394404s hostname=81eda5ee2014 isAgentInStatus=false retry=14 status=offline
[2021-02-04T16:49:22.193Z] Recording test results
[2021-02-04T16:49:22.459Z] [Checks API] No suitable checks publisher found.
[2021-02-04T16:49:22.476Z] Archiving artifacts
[2021-02-04T16:49:29.774Z] time="2021-02-04T16:49:28Z" level=debug msg="Agent listed in Fleet with online status" agentID=ce027940-6708-11eb-baef-5302176b9cd4 hostname=81eda5ee2014
[2021-02-04T16:49:29.775Z] time="2021-02-04T16:49:28Z" level=warning msg="The Agent is not in the offline status yet" agentID=ce027940-6708-11eb-baef-5302176b9cd4 elapsedTime=58.307568732s hostname=81eda5ee2014 isAgentInStatus=false retry=15 status=offline
[2021-02-04T16:49:36.344Z] time="2021-02-04T16:49:35Z" level=debug msg="Agent listed in Fleet with online status" agentID=ce027940-6708-11eb-baef-5302176b9cd4 hostname=81eda5ee2014
[2021-02-04T16:49:36.344Z] time="2021-02-04T16:49:35Z" level=warning msg="The Agent is not in the offline status yet" agentID=ce027940-6708-11eb-baef-5302176b9cd4 elapsedTime=1m4.600645637s hostname=81eda5ee2014 isAgentInStatus=false retry=16 status=offline
[2021-02-04T16:49:39.642Z] time="2021-02-04T16:49:39Z" level=debug msg="Agent listed in Fleet with online status" agentID=ce027940-6708-11eb-baef-5302176b9cd4 hostname=81eda5ee2014
[2021-02-04T16:49:39.642Z] time="2021-02-04T16:49:39Z" level=warning msg="The Agent is not in the offline status yet" agentID=ce027940-6708-11eb-baef-5302176b9cd4 elapsedTime=1m8.610197412s hostname=81eda5ee2014 isAgentInStatus=false retry=17 status=offline
[2021-02-04T16:49:46.229Z] time="2021-02-04T16:49:45Z" level=debug msg="Agent listed in Fleet with online status" agentID=ce027940-6708-11eb-baef-5302176b9cd4 hostname=81eda5ee2014
[2021-02-04T16:49:46.229Z] time="2021-02-04T16:49:45Z" level=warning msg="The Agent is not in the offline status yet" agentID=ce027940-6708-11eb-baef-5302176b9cd4 elapsedTime=1m14.919047061s hostname=81eda5ee2014 isAgentInStatus=false retry=18 status=offline
[2021-02-04T16:49:48.770Z] time="2021-02-04T16:49:48Z" level=debug msg="Agent listed in Fleet with online status" agentID=ce027940-6708-11eb-baef-5302176b9cd4 hostname=81eda5ee2014
[2021-02-04T16:49:48.770Z] time="2021-02-04T16:49:48Z" level=warning msg="The Agent is not in the offline status yet" agentID=ce027940-6708-11eb-baef-5302176b9cd4 elapsedTime=1m18.210969697s hostname=81eda5ee2014 isAgentInStatus=false retry=19 status=offline
[2021-02-04T16:49:54.055Z] time="2021-02-04T16:49:53Z" level=debug msg="Agent listed in Fleet with online status" agentID=ce027940-6708-11eb-baef-5302176b9cd4 hostname=81eda5ee2014
[2021-02-04T16:49:54.055Z] time="2021-02-04T16:49:53Z" level=warning msg="The Agent is not in the offline status yet" agentID=ce027940-6708-11eb-baef-5302176b9cd4 elapsedTime=1m22.540691742s hostname=81eda5ee2014 isAgentInStatus=false retry=20 status=offline
[2021-02-04T16:50:00.627Z] time="2021-02-04T16:49:59Z" level=debug msg="Agent listed in Fleet with online status" agentID=ce027940-6708-11eb-baef-5302176b9cd4 hostname=81eda5ee2014
[2021-02-04T16:50:00.627Z] time="2021-02-04T16:49:59Z" level=warning msg="The Agent is not in the offline status yet" agentID=ce027940-6708-11eb-baef-5302176b9cd4 elapsedTime=1m29.241576039s hostname=81eda5ee2014 isAgentInStatus=false retry=21 status=offline
[2021-02-04T16:50:03.915Z] time="2021-02-04T16:50:03Z" level=debug msg="Agent listed in Fleet with online status" agentID=ce027940-6708-11eb-baef-5302176b9cd4 hostname=81eda5ee2014
[2021-02-04T16:50:03.915Z] time="2021-02-04T16:50:03Z" level=warning msg="The Agent is not in the offline status yet" agentID=ce027940-6708-11eb-baef-5302176b9cd4 elapsedTime=1m32.931208876s hostname=81eda5ee2014 isAgentInStatus=false retry=22 status=offline
[2021-02-04T16:50:09.185Z] time="2021-02-04T16:50:09Z" level=debug msg="Agent listed in Fleet with online status" agentID=ce027940-6708-11eb-baef-5302176b9cd4 hostname=81eda5ee2014
[2021-02-04T16:50:09.185Z] time="2021-02-04T16:50:09Z" level=warning msg="The Agent is not in the offline status yet" agentID=ce027940-6708-11eb-baef-5302176b9cd4 elapsedTime=1m38.602492888s hostname=81eda5ee2014 isAgentInStatus=false retry=23 status=offline
[2021-02-04T16:50:14.543Z] time="2021-02-04T16:50:14Z" level=debug msg="Agent listed in Fleet with online status" agentID=ce027940-6708-11eb-baef-5302176b9cd4 hostname=81eda5ee2014
[2021-02-04T16:50:14.543Z] time="2021-02-04T16:50:14Z" level=warning msg="The Agent is not in the offline status yet" agentID=ce027940-6708-11eb-baef-5302176b9cd4 elapsedTime=1m43.624318582s hostname=81eda5ee2014 isAgentInStatus=false retry=24 status=offline
[2021-02-04T16:50:17.832Z] time="2021-02-04T16:50:17Z" level=debug msg="Agent listed in Fleet with online status" agentID=ce027940-6708-11eb-baef-5302176b9cd4 hostname=81eda5ee2014
[2021-02-04T16:50:17.832Z] time="2021-02-04T16:50:17Z" level=warning msg="The Agent is not in the offline status yet" agentID=ce027940-6708-11eb-baef-5302176b9cd4 elapsedTime=1m46.600757659s hostname=81eda5ee2014 isAgentInStatus=false retry=25 status=offline
[2021-02-04T16:50:20.365Z] time="2021-02-04T16:50:19Z" level=debug msg="Agent listed in Fleet with online status" agentID=ce027940-6708-11eb-baef-5302176b9cd4 hostname=81eda5ee2014
[2021-02-04T16:50:20.365Z] time="2021-02-04T16:50:19Z" level=warning msg="The Agent is not in the offline status yet" agentID=ce027940-6708-11eb-baef-5302176b9cd4 elapsedTime=1m49.253451533s hostname=81eda5ee2014 isAgentInStatus=false retry=26 status=offline
[2021-02-04T16:50:24.556Z] time="2021-02-04T16:50:24Z" level=debug msg="Agent listed in Fleet with online status" agentID=ce027940-6708-11eb-baef-5302176b9cd4 hostname=81eda5ee2014
[2021-02-04T16:50:24.556Z] time="2021-02-04T16:50:24Z" level=warning msg="The Agent is not in the offline status yet" agentID=ce027940-6708-11eb-baef-5302176b9cd4 elapsedTime=1m53.747019264s hostname=81eda5ee2014 isAgentInStatus=false retry=27 status=offline
[2021-02-04T16:50:29.827Z] time="2021-02-04T16:50:29Z" level=debug msg="Agent listed in Fleet with online status" agentID=ce027940-6708-11eb-baef-5302176b9cd4 hostname=81eda5ee2014
[2021-02-04T16:50:29.827Z] time="2021-02-04T16:50:29Z" level=warning msg="The Agent is not in the offline status yet" agentID=ce027940-6708-11eb-baef-5302176b9cd4 elapsedTime=1m59.21941251s hostname=81eda5ee2014 isAgentInStatus=false retry=28 status=offline
[2021-02-04T16:50:37.944Z] time="2021-02-04T16:50:36Z" level=debug msg="Agent listed in Fleet with online status" agentID=ce027940-6708-11eb-baef-5302176b9cd4 hostname=81eda5ee2014
[2021-02-04T16:50:37.944Z] time="2021-02-04T16:50:36Z" level=warning msg="The Agent is not in the offline status yet" agentID=ce027940-6708-11eb-baef-5302176b9cd4 elapsedTime=2m6.40638058s hostname=81eda5ee2014 isAgentInStatus=false retry=29 status=offline
[2021-02-04T16:50:43.218Z] time="2021-02-04T16:50:42Z" level=debug msg="Agent listed in Fleet with online status" agentID=ce027940-6708-11eb-baef-5302176b9cd4 hostname=81eda5ee2014
[2021-02-04T16:50:43.218Z] time="2021-02-04T16:50:42Z" level=warning msg="The Agent is not in the offline status yet" agentID=ce027940-6708-11eb-baef-5302176b9cd4 elapsedTime=2m11.807278997s hostname=81eda5ee2014 isAgentInStatus=false retry=30 status=offline
[2021-02-04T16:50:48.491Z] time="2021-02-04T16:50:47Z" level=debug msg="Agent listed in Fleet with online status" agentID=ce027940-6708-11eb-baef-5302176b9cd4 hostname=81eda5ee2014
[2021-02-04T16:50:48.491Z] time="2021-02-04T16:50:47Z" level=warning msg="The Agent is not in the offline status yet" agentID=ce027940-6708-11eb-baef-5302176b9cd4 elapsedTime=2m17.291041532s hostname=81eda5ee2014 isAgentInStatus=false retry=31 status=offline
[2021-02-04T16:50:52.686Z] time="2021-02-04T16:50:52Z" level=debug msg="Agent listed in Fleet with online status" agentID=ce027940-6708-11eb-baef-5302176b9cd4 hostname=81eda5ee2014
[2021-02-04T16:50:52.686Z] time="2021-02-04T16:50:52Z" level=warning msg="The Agent is not in the offline status yet" agentID=ce027940-6708-11eb-baef-5302176b9cd4 elapsedTime=2m21.886244768s hostname=81eda5ee2014 isAgentInStatus=false retry=32 status=offline
[2021-02-04T16:50:57.957Z] time="2021-02-04T16:50:57Z" level=debug msg="Agent listed in Fleet with online status" agentID=ce027940-6708-11eb-baef-5302176b9cd4 hostname=81eda5ee2014
[2021-02-04T16:50:57.957Z] time="2021-02-04T16:50:57Z" level=warning msg="The Agent is not in the offline status yet" agentID=ce027940-6708-11eb-baef-5302176b9cd4 elapsedTime=2m27.192329932s hostname=81eda5ee2014 isAgentInStatus=false retry=33 status=offline
[2021-02-04T16:51:03.229Z] time="2021-02-04T16:51:02Z" level=debug msg="Agent listed in Fleet with online status" agentID=ce027940-6708-11eb-baef-5302176b9cd4 hostname=81eda5ee2014
[2021-02-04T16:51:03.229Z] time="2021-02-04T16:51:02Z" level=info msg="The Agent is in the desired status" elapsedTime=2m32.184665973s hostname=81eda5ee2014 isAgentInStatus=true retries=34 status=offline
[2021-02-04T16:51:03.798Z] OCI runtime exec failed: exec failed: container_linux.go:370: starting container process caused: exec: "elastic-agent": executable file not found in $PATH: unknown
[2021-02-04T16:51:03.798Z] time="2021-02-04T16:51:03Z" level=error msg="Could not execute command in container" command="[elastic-agent uninstall -f]" error="Could not run compose file: [/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-699/.op/compose/profiles/fleet/docker-compose.yml /var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-699/.op/compose/services/debian-systemd/docker-compose.yml] - Local Docker compose exited abnormally whilst running docker-compose: [exec -T debian-systemd elastic-agent uninstall -f]. exit status 126" service=debian-systemd
[2021-02-04T16:51:03.798Z] time="2021-02-04T16:51:03Z" level=error msg="Could not run agent command in the box" command="[elastic-agent uninstall -f]" error="Could not run compose file: [/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-699/.op/compose/profiles/fleet/docker-compose.yml /var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-699/.op/compose/services/debian-systemd/docker-compose.yml] - Local Docker compose exited abnormally whilst running docker-compose: [exec -T debian-systemd elastic-agent uninstall -f]. exit status 126" profile=fleet service=debian-systemd
[2021-02-04T16:51:03.798Z] time="2021-02-04T16:51:03Z" level=warning msg="Could not uninstall the agent after the scenario: Could not run compose file: [/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-699/.op/compose/profiles/fleet/docker-compose.yml /var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-699/.op/compose/services/debian-systemd/docker-compose.yml] - Local Docker compose exited abnormally whilst running docker-compose: [exec -T debian-systemd elastic-agent uninstall -f]. exit status 126"
[2021-02-04T16:51:03.799Z] time="2021-02-04T16:51:03Z" level=debug msg="Un-enrolling agent in Fleet" agentID=ce027940-6708-11eb-baef-5302176b9cd4 hostname=81eda5ee2014
[2021-02-04T16:51:05.178Z] time="2021-02-04T16:51:05Z" level=debug msg="Fleet agent was unenrolled" agentID=ce027940-6708-11eb-baef-5302176b9cd4
[2021-02-04T16:51:06.115Z] Stopping fleet_debian-systemd_elastic-agent_1 ... 
[2021-02-04T16:51:06.684Z] 
Stopping fleet_debian-systemd_elastic-agent_1 ... done
Removing fleet_debian-systemd_elastic-agent_1 ... 
[2021-02-04T16:51:06.684Z] 
Removing fleet_debian-systemd_elastic-agent_1 ... done
Going to remove fleet_debian-systemd_elastic-agent_1
[2021-02-04T16:51:06.943Z] time="2021-02-04T16:51:06Z" level=debug msg="Docker compose executed." cmd="[rm -fvs debian-systemd]" composeFilePaths="[/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-699/.op/compose/profiles/fleet/docker-compose.yml /var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-699/.op/compose/services/debian-systemd/docker-compose.yml]" env="map[centos_systemdAgentBinarySrcPath:/tmp/elastic-agent-8.0.0-SNAPSHOT-linux-x86_64.tar.gz166824276 centos_systemdAgentBinaryTargetPath:/elastic-agent-8.0.0-SNAPSHOT-x86_64.tar.gz centos_systemdContainerName:fleet_centos-systemd_elastic-agent_1 centos_systemdTag:latest debian_systemdAgentBinarySrcPath:/tmp/elastic-agent-8.0.0-SNAPSHOT-linux-x86_64.tar.gz166824276 debian_systemdAgentBinaryTargetPath:/elastic-agent-8.0.0-SNAPSHOT-x86_64.tar.gz debian_systemdContainerName:fleet_debian-systemd_elastic-agent_1 debian_systemdTag:stretch kibanaConfigPath:/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-699/src/github.com/elastic/e2e-testing/e2e/_suites/fleet/configurations/kibana.config.yml stackVersion:8.0.0-SNAPSHOT]" profile=fleet
[2021-02-04T16:51:06.943Z] time="2021-02-04T16:51:06Z" level=debug msg="Service removed from compose" profile=fleet service=debian-systemd
[2021-02-04T16:51:08.368Z] time="2021-02-04T16:51:08Z" level=debug msg="The token was deleted" tokenID=c9085d60-6708-11eb-baef-5302176b9cd4
[2021-02-04T16:51:08.368Z] time="2021-02-04T16:51:08Z" level=info msg="Integration deleted from the configuration" integration= packageConfigId= policyID=dedf6860-6706-11eb-baef-5302176b9cd4 version=
[2021-02-04T16:51:08.368Z] time="2021-02-04T16:51:08Z" level=debug msg="Destroying Fleet runtime dependencies"
[2021-02-04T16:51:08.937Z] Stopping fleet_kibana_1           ... 
[2021-02-04T16:51:08.937Z] Stopping fleet_package-registry_1 ... 
[2021-02-04T16:51:08.937Z] Stopping fleet_elasticsearch_1    ... 
[2021-02-04T16:51:10.288Z] 
Stopping fleet_kibana_1           ... done

Stopping fleet_package-registry_1 ... done

Stopping fleet_elasticsearch_1    ... done
Removing fleet_kibana_1           ... 
[2021-02-04T16:51:10.288Z] Removing fleet_package-registry_1 ... 
[2021-02-04T16:51:10.288Z] Removing fleet_elasticsearch_1    ... 
[2021-02-04T16:51:10.288Z] 
Removing fleet_package-registry_1 ... done

Removing fleet_kibana_1           ... done

Removing fleet_elasticsearch_1    ... done
Removing network fleet_default
[2021-02-04T16:51:10.548Z] time="2021-02-04T16:51:10Z" level=debug msg="Docker compose executed." cmd="[down --remove-orphans]" composeFilePaths="[/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-699/.op/compose/profiles/fleet/docker-compose.yml]" env="map[centos_systemdAgentBinarySrcPath:/tmp/elastic-agent-8.0.0-SNAPSHOT-linux-x86_64.tar.gz166824276 centos_systemdAgentBinaryTargetPath:/elastic-agent-8.0.0-SNAPSHOT-x86_64.tar.gz centos_systemdContainerName:fleet_centos-systemd_elastic-agent_1 centos_systemdTag:latest debian_systemdAgentBinarySrcPath:/tmp/elastic-agent-8.0.0-SNAPSHOT-linux-x86_64.tar.gz166824276 debian_systemdAgentBinaryTargetPath:/elastic-agent-8.0.0-SNAPSHOT-x86_64.tar.gz debian_systemdContainerName:fleet_debian-systemd_elastic-agent_1 debian_systemdTag:stretch kibanaConfigPath:/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-699/src/github.com/elastic/e2e-testing/e2e/_suites/fleet/configurations/kibana.config.yml stackVersion:8.0.0-SNAPSHOT]" profile=fleet
[2021-02-04T16:51:10.548Z] time="2021-02-04T16:51:10Z" level=debug msg="Elastic Agent binary was removed." installer=debian-tar-8.0.0-SNAPSHOT path=/tmp/elastic-agent-8.0.0-SNAPSHOT-linux-x86_64.tar.gz166824276
[2021-02-04T16:51:10.548Z] time="2021-02-04T16:51:10Z" level=debug msg="Elastic Agent binary was removed." installer=centos-systemd-8.0.0-SNAPSHOT path=/tmp/elastic-agent-8.0.0-SNAPSHOT-x86_64.rpm687280009
[2021-02-04T16:51:10.548Z] time="2021-02-04T16:51:10Z" level=debug msg="Elastic Agent binary was removed." installer=debian-systemd-8.0.0-SNAPSHOT path=/tmp/elastic-agent-8.0.0-SNAPSHOT-amd64.deb505825443
[2021-02-04T16:51:10.595Z] [INFO] Stopping Filebeat Docker container
[2021-02-04T16:51:10.881Z] + docker exec -t b34466a1f33ca3d129d12096a70c615d0c49abd3d58d3c849e19f8396a040d36 chmod -R ugo+rw /output
[2021-02-04T16:51:11.140Z] + docker stop --time 30 b34466a1f33ca3d129d12096a70c615d0c49abd3d58d3c849e19f8396a040d36
[2021-02-04T16:51:11.398Z] b34466a1f33ca3d129d12096a70c615d0c49abd3d58d3c849e19f8396a040d36
[2021-02-04T16:51:11.421Z] Archiving artifacts
[2021-02-04T16:51:12.185Z] Recording test results
[2021-02-04T16:51:12.550Z] [Checks API] No suitable checks publisher found.
[2021-02-04T16:51:12.570Z] Archiving artifacts
[2021-02-04T17:07:38.840Z] Aborted by Manuel de la Peña
[2021-02-04T17:07:38.860Z] Failed in branch ubuntu-18.04_metricbeat_integrations && redisenterprise
[2021-02-04T17:07:39.957Z] Stage "Release" skipped due to earlier failure(s)
[2021-02-04T17:07:42.847Z] Running on Jenkins in /var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-699@2
[2021-02-04T17:07:42.898Z] [INFO] getVaultSecret: Getting secrets
[2021-02-04T17:07:43.052Z] Masking supported pattern matches of $VAULT_ADDR or $VAULT_ROLE_ID or $VAULT_SECRET_ID
[2021-02-04T17:07:43.820Z] + chmod 755 generate-build-data.sh
[2021-02-04T17:07:43.820Z] + ./generate-build-data.sh https://beats-ci.elastic.co/blue/rest/organizations/jenkins/pipelines/e2e-tests/e2e-testing-mbp/PR-699/ https://beats-ci.elastic.co/blue/rest/organizations/jenkins/pipelines/e2e-tests/e2e-testing-mbp/PR-699/runs/2 ABORTED 2484201
[2021-02-04T17:07:44.084Z] INFO: curl https://beats-ci.elastic.co/blue/rest/organizations/jenkins/pipelines/e2e-tests/e2e-testing-mbp/PR-699/runs/2/steps/?limit=10000 -o steps-info.json
[2021-02-04T17:07:45.427Z] INFO: curl https://beats-ci.elastic.co/blue/rest/organizations/jenkins/pipelines/e2e-tests/e2e-testing-mbp/PR-699/runs/2/tests/?status=FAILED -o tests-errors.json

@mdelapenya mdelapenya deleted the backport/6.8.x/pr-671 branch June 2, 2021 05:38