This repository has been archived by the owner on Sep 17, 2024. It is now read-only.

fix: do not override compose configurations when syncing from beats #630

Merged
merged 5 commits into elastic:master from mdelapenya:fix-ceph-sync on Jan 22, 2021

Conversation

mdelapenya
Contributor

What does this PR do?

It moves the variable holding the reference to a single service into the for-loop that is responsible for creating each service.
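
For illustration, here is a minimal Go sketch of that class of bug (hypothetical names, not the project's actual code): when a reference type such as a map is initialised outside the loop, every iteration writes into the same object, so later services clobber the values stored for earlier ones; initialising it inside the loop gives each service its own copy.

package main

import "fmt"

// sanitizeBuggy mimics the broken pattern: the per-service map is created once,
// outside the loop, so every output entry points at the same map and ends up
// with whichever service was processed last.
func sanitizeBuggy(services map[string]map[string]interface{}) map[string]interface{} {
	output := map[string]interface{}{}
	sanitized := map[string]interface{}{} // shared across iterations
	for name, svc := range services {
		sanitized["image"] = svc["image"]
		sanitized["ports"] = svc["ports"]
		output[name] = sanitized
	}
	return output
}

// sanitizeFixed mimics the fix: the map is created inside the loop, so each
// service keeps its own image and ports.
func sanitizeFixed(services map[string]map[string]interface{}) map[string]interface{} {
	output := map[string]interface{}{}
	for name, svc := range services {
		sanitized := map[string]interface{}{}
		sanitized["image"] = svc["image"]
		sanitized["ports"] = svc["ports"]
		output[name] = sanitized
	}
	return output
}

func main() {
	services := map[string]map[string]interface{}{
		"ceph":     {"image": "beats-ceph:nautilus", "ports": []int{5000, 8003, 8080}},
		"ceph-api": {"image": "beats-ceph:jewel", "ports": []int{5000}},
	}
	fmt.Println(sanitizeBuggy(services)) // both entries print the same, shared values
	fmt.Println(sanitizeFixed(services)) // ceph keeps 5000, 8003, 8080; ceph-api keeps 5000
}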

In the process, we added unit tests for the compose-file sanitization feature; to do so, we changed the method signature so that tests can write to a different target file (adding testability).
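
As a rough, self-contained sketch of that testability change (the function name, package and fixture path below are assumptions, not the repository's actual code): by receiving the target file as a parameter, the sanitizer can be pointed at a temporary file from a unit test, which then asserts on the written result instead of having the source compose file overwritten in place.

package cmd

import (
	"io/ioutil"
	"os"
	"path/filepath"
	"testing"
)

// sanitizeComposeFile stands in for the real sanitizer: it reads the source
// compose file and writes the sanitized content to an explicit target path.
// The sanitization step itself is elided here.
func sanitizeComposeFile(composeFilePath string, targetFilePath string) error {
	content, err := ioutil.ReadFile(composeFilePath)
	if err != nil {
		return err
	}
	return ioutil.WriteFile(targetFilePath, content, 0644)
}

// TestSanitizeComposeFile_Sketch shows how the extra parameter is used from a test.
func TestSanitizeComposeFile_Sketch(t *testing.T) {
	tmpDir, err := ioutil.TempDir("", "sanitize")
	if err != nil {
		t.Fatal(err)
	}
	defer os.RemoveAll(tmpDir)

	target := filepath.Join(tmpDir, "docker-compose.yml")
	if err := sanitizeComposeFile("testdata/docker-compose.yml", target); err != nil {
		t.Fatal(err)
	}

	content, err := ioutil.ReadFile(target)
	if err != nil {
		t.Fatal(err)
	}
	if len(content) == 0 {
		t.Fatal("expected the sanitized compose file to have content")
	}
}

The unit tests added in this PR (TestSanitizeComposeFile_Single and TestSanitizeComposeFile_Multiple, visible in the test run below) exercise the single-service and multi-service cases.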

Why is it important?

We noticed that ceph was not receiving all the ports declared in its docker-compose file:

version: '2.3'

services:
  ceph:
    image: docker.elastic.co/integrations-ci/beats-ceph:${CEPH_VERSION:-master-97985eb-nautilus-centos-7-x86_64}-2
    build:
      context: ./_meta
      dockerfile: Dockerfile.${CEPH_CODENAME:-nautilus}
      args:
        CEPH_VERSION: ${CEPH_VERSION:-master-97985eb-nautilus-centos-7-x86_64}
    ports:
      - 5000
      - 8003
      - 8080
  ceph-api:
    image: docker.elastic.co/integrations-ci/beats-ceph:master-6373c6a-jewel-centos-7-x86_64-1
    build:
      context: ./_meta
      dockerfile: Dockerfile.jewel
      args:
        CEPH_VERSION: master-6373c6a-jewel-centos-7-x86_64
    ports:
      - 5000

Instead, its configuration was overridden by the second service in the compose file, ceph-api:

version: "2.3"
services:
  ceph:
    image: docker.elastic.co/integrations-ci/beats-ceph:${CEPH_VERSION:-master-97985eb-nautilus-centos-7-x86_64}-2
    ports:
    - 5000
  ceph-api:
    image: docker.elastic.co/integrations-ci/beats-ceph:master-6373c6a-jewel-centos-7-x86_64-1
    ports:
    - 5000

This could lead to unexpected errors.

Checklist

  • My code follows the style guidelines of this project
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • I have made corresponding change to the default configuration files
  • I have added tests that prove my fix is effective or that my feature works
  • I have run the Unit tests for the CLI, and they are passing locally
  • I have run the End-2-End tests for the suite I'm working on, and they are passing locally
  • I have noticed new Go dependencies (run make notice in the proper directory)

Author's Checklist

  • Verified that the ceph service ends up with multiple ports

How to test this PR locally

$ make -C cli test
...
PASS cmd.TestSanitizeComposeFile_Multiple (0.00s)
PASS cmd.TestSanitizeComposeFile_Single (0.00s)
PASS cmd
...

Related issues

Follow-ups

We should check https://github.com/elastic/beats/blob/master/metricbeat/module/ceph/docker-compose.yml#L5-L16, as it seems there is a hardcoded value for the image in ceph-api.

@mdelapenya mdelapenya self-assigned this Jan 20, 2021
@mdelapenya mdelapenya requested review from a team and jsoriano January 20, 2021 21:36
@mdelapenya mdelapenya added the bug Something isn't working label Jan 20, 2021
@elasticmachine
Contributor

elasticmachine commented Jan 20, 2021

💔 Tests Failed



Build stats

  • Build Cause: Started by user Manuel de la Peña
  • Start Time: 2021-01-20T23:08:38.784+0000
  • Duration: 23 min 42 sec
  • Commit: c636d2a

Test stats 🧪

Test Results: 1 failed, 96 passed, 19 skipped, 116 total

Test errors 1


Initializing / End-To-End Tests / metricbeat_integrations && ceph / ceph-master-97985eb-nautilus-centos-7-x86_64 sends metrics to Elasticsearch without errors – Integrations

     Step there are no errors in the index: Errors where found for ceph-8.0.0-SNAPSHOT on Metricbeat's metricbeat-8.0.0-ceph-master-97985eb-nautilus-centos-7-x86_64-4dipqnr1 index: 10 error/s out of 10 
    

  • no stacktrace

Steps errors 4


Run functional tests for metricbeat:integrations && ceph
  • Took 9 min 58 sec. View more details here
  • Description: .ci/scripts/functional-test.sh "metricbeat" "integrations && ceph" "8.0.0-SNAPSHOT" "8.0.0-SNAPSHOT"
Archive the artifacts
  • Took 0 min 1 sec. View more details here
  • Description: [2021-01-20T23:23:45.590Z] Archiving artifacts script returned exit code 2
Archive the artifacts
  • Took 0 min 0 sec. View more details here
  • Description: [2021-01-20T23:23:47.027Z] Archiving artifacts hudson.AbortException: script returned exit code 2
Error signal
  • Took 0 min 0 sec. View more details here
  • Description: hudson.AbortException: script returned exit code 2

Log output (last 100 lines)


[2021-01-20T23:29:36.007Z] time="2021-01-20T23:29:35Z" level=debug msg="Service removed from compose" profile=metricbeat service=metricbeat
[2021-01-20T23:29:36.575Z] Stopping metricbeat_mysql_1 ... 
[2021-01-20T23:29:37.153Z] time="2021-01-20T23:29:36Z" level=debug msg="Agent listed in Fleet with online status" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a hostname=52ac682fd7cf
[2021-01-20T23:29:37.154Z] time="2021-01-20T23:29:36Z" level=warning msg="The Agent is not in the offline status yet" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a elapsedTime=31.883638398s hostname=52ac682fd7cf isAgentInStatus=false retry=10 status=offline
[2021-01-20T23:29:39.875Z] 
Stopping metricbeat_mysql_1 ... done
Removing metricbeat_mysql_1 ... 
[2021-01-20T23:29:39.875Z] 
Removing metricbeat_mysql_1 ... done
Going to remove metricbeat_mysql_1
[2021-01-20T23:29:39.875Z] time="2021-01-20T23:29:39Z" level=debug msg="Docker compose executed." cmd="[rm -fvs mysql]" composeFilePaths="[/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-630/.op/compose/profiles/metricbeat/docker-compose.yml /var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-630/.op/compose/services/metricbeat/docker-compose.yml /var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-630/.op/compose/services/mysql/docker-compose.yml]" env="map[BEAT_STRICT_PERMS:false MYSQL_PATH:/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-630/.op/compose/services/mysql MYSQL_VARIANT:percona MYSQL_VERSION:8.0.13-4 indexName:metricbeat-8.0.0-mysql-percona-8.0.13-4-p6aisdrx logLevel:debug metricbeatConfigFile:/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-630/.op/compose/services/mysql/_meta/config.yml metricbeatDockerNamespace:beats metricbeatTag:8.0.0-SNAPSHOT mysqlTag:8.0.13-4 serviceName:mysql stackVersion:8.0.0-SNAPSHOT]" profile=metricbeat
[2021-01-20T23:29:39.875Z] time="2021-01-20T23:29:39Z" level=debug msg="Service removed from compose" profile=metricbeat service=mysql
[2021-01-20T23:29:39.875Z] time="2021-01-20T23:29:39Z" level=debug msg="Index deleted using Elasticsearch Go client" indexName=metricbeat-8.0.0-mysql-percona-8.0.13-4-p6aisdrx status="400 Bad Request"
[2021-01-20T23:29:39.875Z] time="2021-01-20T23:29:39Z" level=debug msg="Index Alias deleted using Elasticsearch Go client" indexAlias=metricbeat-8.0.0-mysql-percona-8.0.13-4-p6aisdrx status="400 Bad Request"
[2021-01-20T23:29:40.448Z] Stopping metricbeat_elasticsearch_1 ... 
[2021-01-20T23:29:41.398Z] 
Stopping metricbeat_elasticsearch_1 ... done
Removing metricbeat_elasticsearch_1 ... 
[2021-01-20T23:29:41.398Z] 
Removing metricbeat_elasticsearch_1 ... done
Removing network metricbeat_default
[2021-01-20T23:29:41.663Z] time="2021-01-20T23:29:41Z" level=debug msg="Docker compose executed." cmd="[down --remove-orphans]" composeFilePaths="[/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-630/.op/compose/profiles/metricbeat/docker-compose.yml]" env="map[BEAT_STRICT_PERMS:false MYSQL_PATH:/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-630/.op/compose/services/mysql MYSQL_VARIANT:percona MYSQL_VERSION:8.0.13-4 indexName:metricbeat-8.0.0-mysql-percona-8.0.13-4-p6aisdrx logLevel:debug metricbeatConfigFile:/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-630/.op/compose/services/mysql/_meta/config.yml metricbeatDockerNamespace:beats metricbeatTag:8.0.0-SNAPSHOT mysqlTag:8.0.13-4 serviceName:mysql stackVersion:8.0.0-SNAPSHOT]" profile=metricbeat
[2021-01-20T23:29:41.717Z] [INFO] Stopping Filebeat Docker container
[2021-01-20T23:29:42.010Z] + docker exec -t 708352fa46a693d8fa5ae9b49989355aa8e9015cb135b75a742fe81477b1e922 chmod -R ugo+rw /output
[2021-01-20T23:29:42.584Z] + docker stop --time 30 708352fa46a693d8fa5ae9b49989355aa8e9015cb135b75a742fe81477b1e922
[2021-01-20T23:29:42.846Z] 708352fa46a693d8fa5ae9b49989355aa8e9015cb135b75a742fe81477b1e922
[2021-01-20T23:29:42.863Z] Archiving artifacts
[2021-01-20T23:29:43.710Z] time="2021-01-20T23:29:43Z" level=debug msg="Agent listed in Fleet with online status" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a hostname=52ac682fd7cf
[2021-01-20T23:29:43.710Z] time="2021-01-20T23:29:43Z" level=warning msg="The Agent is not in the offline status yet" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a elapsedTime=38.807527238s hostname=52ac682fd7cf isAgentInStatus=false retry=11 status=offline
[2021-01-20T23:29:43.957Z] Recording test results
[2021-01-20T23:29:44.593Z] [Checks API] No suitable checks publisher found.
[2021-01-20T23:29:44.612Z] Archiving artifacts
[2021-01-20T23:29:48.970Z] time="2021-01-20T23:29:48Z" level=debug msg="Agent listed in Fleet with online status" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a hostname=52ac682fd7cf
[2021-01-20T23:29:48.970Z] time="2021-01-20T23:29:48Z" level=warning msg="The Agent is not in the offline status yet" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a elapsedTime=43.670884754s hostname=52ac682fd7cf isAgentInStatus=false retry=12 status=offline
[2021-01-20T23:29:54.232Z] time="2021-01-20T23:29:53Z" level=debug msg="Agent listed in Fleet with online status" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a hostname=52ac682fd7cf
[2021-01-20T23:29:54.232Z] time="2021-01-20T23:29:53Z" level=warning msg="The Agent is not in the offline status yet" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a elapsedTime=49.20000449s hostname=52ac682fd7cf isAgentInStatus=false retry=13 status=offline
[2021-01-20T23:29:56.758Z] time="2021-01-20T23:29:56Z" level=debug msg="Agent listed in Fleet with online status" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a hostname=52ac682fd7cf
[2021-01-20T23:29:56.759Z] time="2021-01-20T23:29:56Z" level=warning msg="The Agent is not in the offline status yet" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a elapsedTime=51.869292967s hostname=52ac682fd7cf isAgentInStatus=false retry=14 status=offline
[2021-01-20T23:30:03.316Z] time="2021-01-20T23:30:03Z" level=debug msg="Agent listed in Fleet with online status" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a hostname=52ac682fd7cf
[2021-01-20T23:30:03.316Z] time="2021-01-20T23:30:03Z" level=warning msg="The Agent is not in the offline status yet" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a elapsedTime=58.674942312s hostname=52ac682fd7cf isAgentInStatus=false retry=15 status=offline
[2021-01-20T23:30:07.499Z] time="2021-01-20T23:30:07Z" level=debug msg="Agent listed in Fleet with online status" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a hostname=52ac682fd7cf
[2021-01-20T23:30:07.499Z] time="2021-01-20T23:30:07Z" level=warning msg="The Agent is not in the offline status yet" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a elapsedTime=1m2.447739273s hostname=52ac682fd7cf isAgentInStatus=false retry=16 status=offline
[2021-01-20T23:30:12.768Z] time="2021-01-20T23:30:12Z" level=debug msg="Agent listed in Fleet with online status" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a hostname=52ac682fd7cf
[2021-01-20T23:30:13.027Z] time="2021-01-20T23:30:12Z" level=warning msg="The Agent is not in the offline status yet" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a elapsedTime=1m8.193959759s hostname=52ac682fd7cf isAgentInStatus=false retry=17 status=offline
[2021-01-20T23:30:17.211Z] time="2021-01-20T23:30:16Z" level=debug msg="Agent listed in Fleet with online status" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a hostname=52ac682fd7cf
[2021-01-20T23:30:17.211Z] time="2021-01-20T23:30:16Z" level=warning msg="The Agent is not in the offline status yet" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a elapsedTime=1m12.006825367s hostname=52ac682fd7cf isAgentInStatus=false retry=18 status=offline
[2021-01-20T23:30:20.493Z] time="2021-01-20T23:30:19Z" level=debug msg="Agent listed in Fleet with online status" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a hostname=52ac682fd7cf
[2021-01-20T23:30:20.493Z] time="2021-01-20T23:30:19Z" level=warning msg="The Agent is not in the offline status yet" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a elapsedTime=1m15.399225078s hostname=52ac682fd7cf isAgentInStatus=false retry=19 status=offline
[2021-01-20T23:30:25.757Z] time="2021-01-20T23:30:25Z" level=debug msg="Agent listed in Fleet with online status" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a hostname=52ac682fd7cf
[2021-01-20T23:30:25.757Z] time="2021-01-20T23:30:25Z" level=warning msg="The Agent is not in the offline status yet" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a elapsedTime=1m20.888112257s hostname=52ac682fd7cf isAgentInStatus=false retry=20 status=offline
[2021-01-20T23:30:32.317Z] time="2021-01-20T23:30:32Z" level=debug msg="Agent listed in Fleet with online status" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a hostname=52ac682fd7cf
[2021-01-20T23:30:32.317Z] time="2021-01-20T23:30:32Z" level=warning msg="The Agent is not in the offline status yet" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a elapsedTime=1m27.53888849s hostname=52ac682fd7cf isAgentInStatus=false retry=21 status=offline
[2021-01-20T23:30:38.873Z] time="2021-01-20T23:30:38Z" level=debug msg="Agent listed in Fleet with online status" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a hostname=52ac682fd7cf
[2021-01-20T23:30:38.873Z] time="2021-01-20T23:30:38Z" level=warning msg="The Agent is not in the offline status yet" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a elapsedTime=1m33.549110195s hostname=52ac682fd7cf isAgentInStatus=false retry=22 status=offline
[2021-01-20T23:30:41.404Z] time="2021-01-20T23:30:40Z" level=debug msg="Agent listed in Fleet with online status" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a hostname=52ac682fd7cf
[2021-01-20T23:30:41.404Z] time="2021-01-20T23:30:40Z" level=warning msg="The Agent is not in the offline status yet" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a elapsedTime=1m36.228418904s hostname=52ac682fd7cf isAgentInStatus=false retry=23 status=offline
[2021-01-20T23:30:46.680Z] time="2021-01-20T23:30:46Z" level=debug msg="Agent listed in Fleet with online status" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a hostname=52ac682fd7cf
[2021-01-20T23:30:46.680Z] time="2021-01-20T23:30:46Z" level=warning msg="The Agent is not in the offline status yet" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a elapsedTime=1m41.508300491s hostname=52ac682fd7cf isAgentInStatus=false retry=24 status=offline
[2021-01-20T23:30:54.939Z] time="2021-01-20T23:30:53Z" level=debug msg="Agent listed in Fleet with online status" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a hostname=52ac682fd7cf
[2021-01-20T23:30:54.939Z] time="2021-01-20T23:30:53Z" level=warning msg="The Agent is not in the offline status yet" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a elapsedTime=1m48.919964077s hostname=52ac682fd7cf isAgentInStatus=false retry=25 status=offline
[2021-01-20T23:31:00.256Z] time="2021-01-20T23:30:59Z" level=debug msg="Agent listed in Fleet with online status" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a hostname=52ac682fd7cf
[2021-01-20T23:31:00.256Z] time="2021-01-20T23:30:59Z" level=warning msg="The Agent is not in the offline status yet" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a elapsedTime=1m55.215107649s hostname=52ac682fd7cf isAgentInStatus=false retry=26 status=offline
[2021-01-20T23:31:04.445Z] time="2021-01-20T23:31:03Z" level=debug msg="Agent listed in Fleet with online status" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a hostname=52ac682fd7cf
[2021-01-20T23:31:04.445Z] time="2021-01-20T23:31:03Z" level=warning msg="The Agent is not in the offline status yet" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a elapsedTime=1m59.275191729s hostname=52ac682fd7cf isAgentInStatus=false retry=27 status=offline
[2021-01-20T23:31:11.008Z] time="2021-01-20T23:31:10Z" level=debug msg="Agent listed in Fleet with online status" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a hostname=52ac682fd7cf
[2021-01-20T23:31:11.008Z] time="2021-01-20T23:31:10Z" level=info msg="The Agent is in the desired status" elapsedTime=2m5.572121497s hostname=52ac682fd7cf isAgentInStatus=true retries=28 status=offline
[2021-01-20T23:31:11.008Z] OCI runtime exec failed: exec failed: container_linux.go:370: starting container process caused: exec: "elastic-agent": executable file not found in $PATH: unknown
[2021-01-20T23:31:11.266Z] time="2021-01-20T23:31:11Z" level=error msg="Could not execute command in container" command="[elastic-agent uninstall -f]" error="Could not run compose file: [/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-630/.op/compose/profiles/fleet/docker-compose.yml /var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-630/.op/compose/services/debian-systemd/docker-compose.yml] - Local Docker compose exited abnormally whilst running docker-compose: [exec -T debian-systemd elastic-agent uninstall -f]. exit status 126" service=debian-systemd
[2021-01-20T23:31:11.266Z] time="2021-01-20T23:31:11Z" level=error msg="Could not run agent command in the box" command="[elastic-agent uninstall -f]" error="Could not run compose file: [/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-630/.op/compose/profiles/fleet/docker-compose.yml /var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-630/.op/compose/services/debian-systemd/docker-compose.yml] - Local Docker compose exited abnormally whilst running docker-compose: [exec -T debian-systemd elastic-agent uninstall -f]. exit status 126" profile=fleet service=debian-systemd
[2021-01-20T23:31:11.266Z] time="2021-01-20T23:31:11Z" level=error msg="Could not uninstall the agent"
[2021-01-20T23:31:11.266Z] time="2021-01-20T23:31:11Z" level=debug msg="Un-enrolling agent in Fleet" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a hostname=52ac682fd7cf
[2021-01-20T23:31:12.201Z] time="2021-01-20T23:31:12Z" level=debug msg="Fleet agent was unenrolled" agentID=476f89a0-5b77-11eb-80c0-4b90fef12d0a
[2021-01-20T23:31:13.134Z] Stopping fleet_debian-systemd_elastic-agent_1 ... 
[2021-01-20T23:31:13.649Z] 
Stopping fleet_debian-systemd_elastic-agent_1 ... done
Removing fleet_debian-systemd_elastic-agent_1 ... 
[2021-01-20T23:31:13.649Z] 
Removing fleet_debian-systemd_elastic-agent_1 ... done
Going to remove fleet_debian-systemd_elastic-agent_1
[2021-01-20T23:31:13.649Z] time="2021-01-20T23:31:13Z" level=debug msg="Docker compose executed." cmd="[rm -fvs debian-systemd]" composeFilePaths="[/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-630/.op/compose/profiles/fleet/docker-compose.yml /var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-630/.op/compose/services/debian-systemd/docker-compose.yml]" env="map[centos_systemdAgentBinarySrcPath:/tmp/elastic-agent-8.0.0-SNAPSHOT-linux-x86_64.tar.gz476617134 centos_systemdAgentBinaryTargetPath:/elastic-agent-8.0.0-SNAPSHOT-x86_64.tar.gz centos_systemdContainerName:fleet_centos-systemd_elastic-agent_1 centos_systemdTag:latest debian_systemdAgentBinarySrcPath:/tmp/elastic-agent-8.0.0-SNAPSHOT-linux-x86_64.tar.gz476617134 debian_systemdAgentBinaryTargetPath:/elastic-agent-8.0.0-SNAPSHOT-x86_64.tar.gz debian_systemdContainerName:fleet_debian-systemd_elastic-agent_1 debian_systemdTag:stretch kibanaConfigPath:/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-630/src/github.com/elastic/e2e-testing/e2e/_suites/fleet/configurations/kibana.config.yml stackVersion:8.0.0-SNAPSHOT]" profile=fleet
[2021-01-20T23:31:13.649Z] time="2021-01-20T23:31:13Z" level=debug msg="Service removed from compose" profile=fleet service=debian-systemd
[2021-01-20T23:31:14.216Z] time="2021-01-20T23:31:14Z" level=debug msg="The token was deleted" tokenID=437f1590-5b77-11eb-80c0-4b90fef12d0a
[2021-01-20T23:31:14.216Z] time="2021-01-20T23:31:14Z" level=info msg="Integration deleted from the configuration" integration= packageConfigId= policyID=6b4a3a20-5b75-11eb-80c0-4b90fef12d0a version=
[2021-01-20T23:31:14.216Z] time="2021-01-20T23:31:14Z" level=debug msg="Destroying Fleet runtime dependencies"
[2021-01-20T23:31:15.157Z] Stopping fleet_kibana_1           ... 
[2021-01-20T23:31:15.157Z] Stopping fleet_elasticsearch_1    ... 
[2021-01-20T23:31:15.157Z] Stopping fleet_package-registry_1 ... 
[2021-01-20T23:31:16.240Z] 
Stopping fleet_kibana_1           ... done

Stopping fleet_package-registry_1 ... done

Stopping fleet_elasticsearch_1    ... done
Removing fleet_kibana_1           ... 
[2021-01-20T23:31:16.240Z] Removing fleet_elasticsearch_1    ... 
[2021-01-20T23:31:16.240Z] Removing fleet_package-registry_1 ... 
[2021-01-20T23:31:16.240Z] 
Removing fleet_package-registry_1 ... done

Removing fleet_kibana_1           ... done

Removing fleet_elasticsearch_1    ... done
Removing network fleet_default
[2021-01-20T23:31:16.499Z] time="2021-01-20T23:31:16Z" level=debug msg="Docker compose executed." cmd="[down --remove-orphans]" composeFilePaths="[/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-630/.op/compose/profiles/fleet/docker-compose.yml]" env="map[centos_systemdAgentBinarySrcPath:/tmp/elastic-agent-8.0.0-SNAPSHOT-linux-x86_64.tar.gz476617134 centos_systemdAgentBinaryTargetPath:/elastic-agent-8.0.0-SNAPSHOT-x86_64.tar.gz centos_systemdContainerName:fleet_centos-systemd_elastic-agent_1 centos_systemdTag:latest debian_systemdAgentBinarySrcPath:/tmp/elastic-agent-8.0.0-SNAPSHOT-linux-x86_64.tar.gz476617134 debian_systemdAgentBinaryTargetPath:/elastic-agent-8.0.0-SNAPSHOT-x86_64.tar.gz debian_systemdContainerName:fleet_debian-systemd_elastic-agent_1 debian_systemdTag:stretch kibanaConfigPath:/var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-630/src/github.com/elastic/e2e-testing/e2e/_suites/fleet/configurations/kibana.config.yml stackVersion:8.0.0-SNAPSHOT]" profile=fleet
[2021-01-20T23:31:16.499Z] time="2021-01-20T23:31:16Z" level=debug msg="Elastic Agent binary was removed." installer=centos-tar path=/tmp/elastic-agent-8.0.0-SNAPSHOT-linux-x86_64.tar.gz476617134
[2021-01-20T23:31:16.499Z] time="2021-01-20T23:31:16Z" level=debug msg="Elastic Agent binary was removed." installer=debian-systemd path=/tmp/elastic-agent-8.0.0-SNAPSHOT-amd64.deb933796149
[2021-01-20T23:31:16.499Z] time="2021-01-20T23:31:16Z" level=debug msg="Elastic Agent binary was removed." installer=centos-systemd path=/tmp/elastic-agent-8.0.0-SNAPSHOT-x86_64.rpm129149131
[2021-01-20T23:31:16.550Z] [INFO] Stopping Filebeat Docker container
[2021-01-20T23:31:16.831Z] + docker exec -t 94dee18de7a976e26b0a3dcb00397ccdd4185039f6184b968c990c472440b06c chmod -R ugo+rw /output
[2021-01-20T23:31:17.089Z] + docker stop --time 30 94dee18de7a976e26b0a3dcb00397ccdd4185039f6184b968c990c472440b06c
[2021-01-20T23:31:17.347Z] 94dee18de7a976e26b0a3dcb00397ccdd4185039f6184b968c990c472440b06c
[2021-01-20T23:31:17.362Z] Archiving artifacts
[2021-01-20T23:31:17.922Z] Recording test results
[2021-01-20T23:31:18.178Z] [Checks API] No suitable checks publisher found.
[2021-01-20T23:31:18.192Z] Archiving artifacts
[2021-01-20T23:31:19.274Z] Stage "Release" skipped due to earlier failure(s)
[2021-01-20T23:31:20.272Z] Running on worker-1244230 in /var/lib/jenkins/workspace/e2e-tests_e2e-testing-mbp_PR-630
[2021-01-20T23:31:20.309Z] [INFO] getVaultSecret: Getting secrets
[2021-01-20T23:31:20.390Z] Masking supported pattern matches of $VAULT_ADDR or $VAULT_ROLE_ID or $VAULT_SECRET_ID
[2021-01-20T23:31:22.281Z] + chmod 755 generate-build-data.sh
[2021-01-20T23:31:22.281Z] + ./generate-build-data.sh https://beats-ci.elastic.co/blue/rest/organizations/jenkins/pipelines/e2e-tests/e2e-testing-mbp/PR-630/ https://beats-ci.elastic.co/blue/rest/organizations/jenkins/pipelines/e2e-tests/e2e-testing-mbp/PR-630/runs/3 FAILURE 1362117
[2021-01-20T23:31:22.281Z] INFO: curl https://beats-ci.elastic.co/blue/rest/organizations/jenkins/pipelines/e2e-tests/e2e-testing-mbp/PR-630/runs/3/steps/?limit=10000 -o steps-info.json
[2021-01-20T23:31:26.380Z] INFO: curl https://beats-ci.elastic.co/blue/rest/organizations/jenkins/pipelines/e2e-tests/e2e-testing-mbp/PR-630/runs/3/tests/?status=FAILED -o tests-errors.json
[2021-01-20T23:31:27.079Z] INFO: curl https://beats-ci.elastic.co/blue/rest/organizations/jenkins/pipelines/e2e-tests/e2e-testing-mbp/PR-630/runs/3/log/ -o pipeline-log.txt

🐛 Flaky test report

❕ There are test failures but not known flaky tests.


Test stats 🧪

Test Results: 1 failed, 96 passed, 19 skipped, 116 total

Genuine test errors 1

💔 There are test failures but not known flaky tests, most likely a genuine test failure.

  • Name: Initializing / End-To-End Tests / metricbeat_integrations && ceph / ceph-master-97985eb-nautilus-centos-7-x86_64 sends metrics to Elasticsearch without errors – Integrations

@mdelapenya mdelapenya marked this pull request as ready for review January 22, 2021 01:09
@mdelapenya mdelapenya mentioned this pull request Jan 22, 2021
@mdelapenya
Contributor Author

Merging, as we are skipping ceph. See #635

@mdelapenya mdelapenya merged commit 4fd0c23 into elastic:master Jan 22, 2021
mdelapenya added a commit to mdelapenya/e2e-testing that referenced this pull request Jan 25, 2021
…lastic#630)

* fix: initialise variable within the inner scope

This caused the last element in a compose file with multiple services to
override the first one

* chore: support passing target file when writing it

* chore: write unit tests for the very basic behavior of sanitising compose files

* chore: enrich unit test to cover processing multiple child items under a service
mdelapenya added a commit that referenced this pull request Jan 25, 2021
…630) (#647)

* fix: initialise variable within the inner scope

This caused the last element in a compose file with multiple services to
override the first one

* chore: support passing target file when writing it

* chore: write unit tests for the very basic behavior of sanitising compose files

* chore: enrich unit test to cover processing multiple child items under a service
mdelapenya added a commit that referenced this pull request Jan 25, 2021
…630) (#648)

* fix: initialise variable within the inner scope

This caused the last element in a compose file with multiple services to
override the first one

* chore: support passing target file when writing it

* chore: write unit tests for the very basic behavior of sanitising compose files

* chore: enrich unit test to cover processing multiple child items under a service
mdelapenya added a commit that referenced this pull request Jan 25, 2021
…630) (#649)

* fix: initialise variable within the inner scope

This caused the last element in a compose file with multiple services to
override the first one

* chore: support passing target file when writing it

* chore: write unit tests for the very basic behavior of sanitising compose files

* chore: enrich unit test to cover processing multiple child items under a service
mdelapenya added a commit that referenced this pull request Jan 25, 2021
…630) (#646)

* fix: initialise variable within the inner scope

This caused the last element in a compose file with multiple services to
override the first one

* chore: support passing target file when writing it

* chore: write unit tests for the very basic behavior of sanitising compose files

* chore: enrich unit test to cover processing multiple child items under a service
@mdelapenya mdelapenya deleted the fix-ceph-sync branch January 26, 2021 16:04