
[load testing] add env vars to pass simulations and repo rootPath #89544

Merged
merged 6 commits into elastic:master from dmlemeshko:load-testing-custom-args
Feb 1, 2021

Conversation

dmlemeshko
Member

@dmlemeshko dmlemeshko commented Jan 28, 2021

Summary

Closes elastic/kibana-load-testing/issues/40

It is possible to run custom simulations via FTR:

# in the same parent dir where <kibana_repo> is cloned:
git clone git@github.com:elastic/kibana-load-testing.git
git pull

# add dima's remote within <kibana_repo>:
git remote add dima git@github.com:dmlemeshko/kibana.git
git checkout -b dima-load-testing-custom-args master
git pull https://github.com/dmlemeshko/kibana.git load-testing-custom-args

# run the FTR from the x-pack directory
pushd <kibana_repo>/x-pack 
GATLING_SIMULATIONS=DemoJourney node scripts/functional_tests --config test/load/config.ts

To run multiple simulations:

GATLING_SIMULATIONS="DemoJourney,CloudAtOnceJourney" node scripts/functional_tests --config test/load/config.ts

The simulation class must exist in the simulation package.
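
For reference, here is a minimal sketch of how the runner could consume this variable (a hedged illustration, not the PR's actual code; parseSimulations and the DemoJourney default are assumed names):

// TypeScript sketch: parse the comma-separated GATLING_SIMULATIONS value,
// falling back to a single demo journey when the variable is not set.
const DEFAULT_SIMULATION = 'DemoJourney'; // assumed default for this sketch

function parseSimulations(raw: string | undefined): string[] {
  return (raw ?? DEFAULT_SIMULATION)
    .split(',')
    .map((s) => s.trim())
    .filter((s) => s.length > 0);
}

const simulationClasses = parseSimulations(process.env.GATLING_SIMULATIONS);
// GATLING_SIMULATIONS="DemoJourney,CloudAtOnceJourney" -> ['DemoJourney', 'CloudAtOnceJourney']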


@dmlemeshko dmlemeshko added the release_note:skip (Skip the PR/issue when compiling release notes), v7.12.0, and v8.0.0 labels Jan 28, 2021
@dmlemeshko dmlemeshko marked this pull request as ready for review January 28, 2021 16:39
@wayneseymour
Member

Ok, the code looks good. I'll try to run it now.

],
cwd: gatlingProjectRootPath,
env: {
...process.env,
},
wait: true,
});
for (const simulationClass of simulations.split(',').filter((i) => i.length > 0)) {

nit: This could be a named fn:

const dropEmptyLines = x => x.split(',').filter(i => i.length > 0);
for (const simulationClass of dropEmptyLines(simulations)) {
...

Not a big deal at all, just a "nit pick" lol.

@wayneseymour
Member

@dmlemeshko I tried running it from x-pack and I got errors again.

Ideas?

@dmlemeshko
Member Author

Ok, the code looks good. I'll try to run it now.

@wayneseymour did you check out the load-testing repo with the latest master next to the kibana repo?

@dmlemeshko
Member Author

@elasticmachine merge upstream

@dmlemeshko dmlemeshko force-pushed the load-testing-custom-args branch from dfaaa20 to 586d9c6 Compare February 1, 2021 16:41
className.replace('.', '/') + simulationFIleExtension
);
if (!Fs.existsSync(simulationClassPath)) {
throw createFlagError(`Simulation class is not found: '${simulationClassPath}'`);

very nice!
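
Putting the snippet above together, the validation could look roughly like this (a sketch only; the helper name assertSimulationExists, the simulationPackagePath parameter, the .scala extension, and the @kbn/dev-utils import path are assumptions, not the PR's exact code):

import Fs from 'fs';
import Path from 'path';
import { createFlagError } from '@kbn/dev-utils'; // import path assumed

const simulationFIleExtension = '.scala'; // assumed value

// Hypothetical helper: map a simulation class name to its source file inside the
// kibana-load-testing simulation package and fail fast if the file does not exist.
function assertSimulationExists(simulationPackagePath: string, className: string) {
  const simulationClassPath = Path.resolve(
    simulationPackagePath,
    className.replace('.', '/') + simulationFIleExtension
  );
  if (!Fs.existsSync(simulationClassPath)) {
    throw createFlagError(`Simulation class is not found: '${simulationClassPath}'`);
  }
}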

@wayneseymour
Member

Ok, the code looks good. I'll try to run it now.

@wayneseymour did you check out the load-testing repo with the latest master next to the kibana repo?

I do believe so, but I'll try again.

@wayneseymour wayneseymour left a comment

     │ info [o.e.x.w.WatcherLifeCycleService] [les.local] watcher has stopped and shutdown
     │ info [o.e.n.Node] [les.local] stopped
     │ info [o.e.n.Node] [les.local] closing ...
     │ info [o.e.n.Node] [les.local] closed
     │ info [es] stopped
     │ info [es] cleanup complete

LGTM!

@kibanamachine
Contributor

💛 Build succeeded, but was flaky


Test Failures

Kibana Pipeline / general / X-Pack Reporting API Integration Tests.x-pack/test/reporting_api_integration/reporting_and_security/csv_job_params·ts.Reporting APIs Generation from Job Params "before all" hook for "Rejects bogus jobParams"

Link to Jenkins

Standard Out

Failed Tests Reporter:
  - Test has not failed recently on tracked branches

[00:00:00]       │
[00:00:00]         └-: Reporting APIs
[00:00:00]           └-> "before all" hook
[00:00:00]           └-: Generation from Job Params
[00:00:00]             └-> "before all" hook
[00:00:00]             └-> "before all" hook
[00:00:00]               │ info [reporting/logs] Loading "mappings.json"
[00:00:00]               │ info [reporting/logs] Loading "data.json.gz"
[00:00:00]               │ info [o.e.c.m.MetadataDeleteIndexService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] [.kibana_8.0.0_001/UbrS3i7eTa-4rAPlGYt5qg] deleting index
[00:00:00]               │ info [o.e.c.m.MetadataDeleteIndexService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] [.kibana_task_manager_8.0.0_001/FgLfRffeQXmvf8vJHip9AQ] deleting index
[00:00:00]               │ info [reporting/logs] Deleted existing index [".kibana_8.0.0_001",".kibana_task_manager_8.0.0_001"]
[00:00:00]               │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] [.kibana] creating index, cause [api], templates [], shards [1]/[1]
[00:00:00]               │ info [reporting/logs] Created index ".kibana"
[00:00:00]               │ debg [reporting/logs] ".kibana" settings {"index":{"number_of_replicas":"1","number_of_shards":"1"}}
[00:00:00]               │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] [.kibana_task_manager_8.0.0_001] creating index, cause [auto(bulk api)], templates [], shards [1]/[1]
[00:00:00]               │ info [o.e.c.m.MetadataMappingService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] [.kibana/V1_Xde_TSGOPI_F5MuixOg] update_mapping [_doc]
[00:00:00]               │ info [o.e.c.m.MetadataMappingService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] [.kibana_task_manager_8.0.0_001/5UfI8TFAS4mbxVQ6lAAhWw] create_mapping
[00:00:00]               │ info [reporting/logs] Indexed 2 docs into ".kibana"
[00:00:00]               │ info [o.e.c.m.MetadataMappingService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] [.kibana_task_manager_8.0.0_001/5UfI8TFAS4mbxVQ6lAAhWw] update_mapping [_doc]
[00:00:00]               │ info [o.e.c.m.MetadataMappingService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] [.kibana/V1_Xde_TSGOPI_F5MuixOg] update_mapping [_doc]
[00:00:00]               │ debg Migrating saved objects
[00:00:00]               │ info [o.e.c.m.MetadataMappingService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] [.kibana_task_manager_8.0.0_001/5UfI8TFAS4mbxVQ6lAAhWw] update_mapping [_doc]
[00:00:00]               │ proc [kibana]   log   [17:52:41.992] [error][plugins][taskManager] Failed to poll for work: ResponseError: search_phase_execution_exception
[00:00:00]               │ proc [kibana]   log   [17:52:42.012] [info][savedobjects-service] [.kibana_task_manager] INIT -> CREATE_NEW_TARGET
[00:00:00]               │ proc [kibana]   log   [17:52:42.017] [info][savedobjects-service] [.kibana] INIT -> LEGACY_SET_WRITE_BLOCK
[00:00:00]               │ info [o.e.c.m.MetadataIndexStateService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] adding block write to indices [[.kibana/V1_Xde_TSGOPI_F5MuixOg]]
[00:00:00]               │ info [o.e.c.m.MetadataIndexStateService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] completed adding block write to indices [.kibana]
[00:00:00]               │ proc [kibana]   log   [17:52:42.082] [info][savedobjects-service] [.kibana] LEGACY_SET_WRITE_BLOCK -> LEGACY_CREATE_REINDEX_TARGET
[00:00:00]               │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] [.kibana_pre6.5.0_001] creating index, cause [api], templates [], shards [1]/[1]
[00:00:00]               │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] updating number_of_replicas to [0] for indices [.kibana_pre6.5.0_001]
[00:00:00]               │ proc [kibana]   log   [17:52:42.169] [info][savedobjects-service] [.kibana] LEGACY_CREATE_REINDEX_TARGET -> LEGACY_REINDEX
[00:00:00]               │ proc [kibana]   log   [17:52:42.189] [info][savedobjects-service] [.kibana] LEGACY_REINDEX -> LEGACY_REINDEX_WAIT_FOR_TASK
[00:00:00]               │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] [.tasks] creating index, cause [auto(task api)], templates [], shards [1]/[1]
[00:00:00]               │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] updating number_of_replicas to [0] for indices [.tasks]
[00:00:00]               │ info [o.e.t.LoggingTaskListener] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] 820 finished with response BulkByScrollResponse[took=47.6ms,timed_out=false,sliceId=null,updated=0,created=2,deleted=0,batches=1,versionConflicts=0,noops=0,retries=0,throttledUntil=0s,bulk_failures=[],search_failures=[]]
[00:00:01]               │ proc [kibana]   log   [17:52:42.417] [info][savedobjects-service] [.kibana] LEGACY_REINDEX_WAIT_FOR_TASK -> LEGACY_DELETE
[00:00:01]               │ info [o.e.c.m.MetadataDeleteIndexService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] [.kibana/V1_Xde_TSGOPI_F5MuixOg] deleting index
[00:00:01]               │ proc [kibana]   log   [17:52:42.473] [info][savedobjects-service] [.kibana] LEGACY_DELETE -> SET_SOURCE_WRITE_BLOCK
[00:00:01]               │ info [o.e.c.m.MetadataIndexStateService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] adding block write to indices [[.kibana_pre6.5.0_001/o7-xH8cKSamGQSB0_Vl8cQ]]
[00:00:01]               │ info [o.e.c.m.MetadataIndexStateService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] completed adding block write to indices [.kibana_pre6.5.0_001]
[00:00:01]               │ proc [kibana]   log   [17:52:42.526] [info][savedobjects-service] [.kibana] SET_SOURCE_WRITE_BLOCK -> CREATE_REINDEX_TEMP
[00:00:01]               │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] [.kibana_8.0.0_reindex_temp] creating index, cause [api], templates [], shards [1]/[1]
[00:00:01]               │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] updating number_of_replicas to [0] for indices [.kibana_8.0.0_reindex_temp]
[00:00:01]               │ proc [kibana]   log   [17:52:42.604] [info][savedobjects-service] [.kibana] CREATE_REINDEX_TEMP -> REINDEX_SOURCE_TO_TEMP
[00:00:01]               │ proc [kibana]   log   [17:52:42.617] [info][savedobjects-service] [.kibana] REINDEX_SOURCE_TO_TEMP -> REINDEX_SOURCE_TO_TEMP_WAIT_FOR_TASK
[00:00:01]               │ info [o.e.c.m.MetadataMappingService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] [.kibana_8.0.0_reindex_temp/A2VTWEO6R2m6PwFZTgbamg] update_mapping [_doc]
[00:00:01]               │ info [o.e.t.LoggingTaskListener] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] 865 finished with response BulkByScrollResponse[took=54ms,timed_out=false,sliceId=null,updated=0,created=2,deleted=0,batches=1,versionConflicts=0,noops=0,retries=0,throttledUntil=0s,bulk_failures=[],search_failures=[]]
[00:00:01]               │ proc [kibana]   log   [17:52:42.732] [info][savedobjects-service] [.kibana] REINDEX_SOURCE_TO_TEMP_WAIT_FOR_TASK -> SET_TEMP_WRITE_BLOCK
[00:00:01]               │ info [o.e.c.m.MetadataIndexStateService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] adding block write to indices [[.kibana_8.0.0_reindex_temp/A2VTWEO6R2m6PwFZTgbamg]]
[00:00:01]               │ info [o.e.c.m.MetadataIndexStateService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] completed adding block write to indices [.kibana_8.0.0_reindex_temp]
[00:00:01]               │ proc [kibana]   log   [17:52:42.782] [info][savedobjects-service] [.kibana] SET_TEMP_WRITE_BLOCK -> CLONE_TEMP_TO_TARGET
[00:00:01]               │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] applying create index request using existing index [.kibana_8.0.0_reindex_temp] metadata
[00:00:01]               │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] [.kibana_8.0.0_001] creating index, cause [clone_index], templates [], shards [1]/[1]
[00:00:01]               │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] updating number_of_replicas to [0] for indices [.kibana_8.0.0_001]
[00:00:01]               │ info [o.e.c.m.MetadataMappingService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] [.kibana_8.0.0_001/yJ0tcM7yQjmfgJHUtgiaUA] create_mapping
[00:00:01]               │ proc [kibana]   log   [17:52:42.922] [info][savedobjects-service] [.kibana] CLONE_TEMP_TO_TARGET -> OUTDATED_DOCUMENTS_SEARCH
[00:00:01]               │ proc [kibana]   log   [17:52:42.958] [info][savedobjects-service] [.kibana] OUTDATED_DOCUMENTS_SEARCH -> OUTDATED_DOCUMENTS_TRANSFORM
[00:00:01]               │ info [o.e.c.m.MetadataMappingService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] [.kibana_8.0.0_001/yJ0tcM7yQjmfgJHUtgiaUA] update_mapping [_doc]
[00:00:02]               │ proc [kibana]   log   [17:52:43.857] [info][savedobjects-service] [.kibana] OUTDATED_DOCUMENTS_TRANSFORM -> OUTDATED_DOCUMENTS_SEARCH
[00:00:02]               │ proc [kibana]   log   [17:52:43.879] [info][savedobjects-service] [.kibana] OUTDATED_DOCUMENTS_SEARCH -> UPDATE_TARGET_MAPPINGS
[00:00:02]               │ info [o.e.c.m.MetadataMappingService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] [.kibana_8.0.0_001/yJ0tcM7yQjmfgJHUtgiaUA] update_mapping [_doc]
[00:00:02]               │ proc [kibana]   log   [17:52:43.970] [info][savedobjects-service] [.kibana] UPDATE_TARGET_MAPPINGS -> UPDATE_TARGET_MAPPINGS_WAIT_FOR_TASK
[00:00:02]               │ info [o.e.t.LoggingTaskListener] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] 926 finished with response BulkByScrollResponse[took=44ms,timed_out=false,sliceId=null,updated=2,created=0,deleted=0,batches=1,versionConflicts=0,noops=0,retries=0,throttledUntil=0s,bulk_failures=[],search_failures=[]]
[00:00:02]               │ proc [kibana]   log   [17:52:44.084] [info][savedobjects-service] [.kibana] UPDATE_TARGET_MAPPINGS_WAIT_FOR_TASK -> MARK_VERSION_INDEX_READY
[00:00:02]               │ info [o.e.c.m.MetadataDeleteIndexService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] [.kibana_8.0.0_reindex_temp/A2VTWEO6R2m6PwFZTgbamg] deleting index
[00:00:02]               │ proc [kibana]   log   [17:52:44.126] [info][savedobjects-service] [.kibana] MARK_VERSION_INDEX_READY -> DONE
[00:00:02]               │ proc [kibana]   log   [17:52:44.126] [info][savedobjects-service] [.kibana] Migration completed after 2123ms
[00:00:04]               │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] [.ds-ilm-history-5-2021.02.01-000001] creating index, cause [initialize_data_stream], templates [ilm-history], shards [1]/[0]
[00:00:04]               │ info [o.e.c.m.MetadataCreateDataStreamService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] adding data stream [ilm-history-5] with write index [.ds-ilm-history-5-2021.02.01-000001] and backing indices []
[00:00:04]               │ info [o.e.x.i.IndexLifecycleTransition] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] moving index [.ds-ilm-history-5-2021.02.01-000001] from [null] to [{"phase":"new","action":"complete","name":"complete"}] in policy [ilm-history-ilm-policy]
[00:00:04]               │ info [o.e.x.i.IndexLifecycleTransition] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] moving index [.ds-ilm-history-5-2021.02.01-000001] from [{"phase":"new","action":"complete","name":"complete"}] to [{"phase":"hot","action":"unfollow","name":"wait-for-indexing-complete"}] in policy [ilm-history-ilm-policy]
[00:00:04]               │ info [o.e.x.i.IndexLifecycleTransition] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] moving index [.ds-ilm-history-5-2021.02.01-000001] from [{"phase":"hot","action":"unfollow","name":"wait-for-indexing-complete"}] to [{"phase":"hot","action":"unfollow","name":"wait-for-follow-shard-tasks"}] in policy [ilm-history-ilm-policy]
[00:00:57]               │ proc [kibana]   log   [17:53:38.431] [error][plugins][taskManager] [WorkloadAggregator]: Error: Invalid workload: {"took":0,"timed_out":false,"_shards":{"total":0,"successful":0,"skipped":0,"failed":0},"hits":{"total":{"value":0,"relation":"eq"},"max_score":0,"hits":[]}}
[00:01:57]               │ proc [kibana]   log   [17:54:38.443] [error][plugins][taskManager] [WorkloadAggregator]: Error: Invalid workload: {"took":0,"timed_out":false,"_shards":{"total":0,"successful":0,"skipped":0,"failed":0},"hits":{"total":{"value":0,"relation":"eq"},"max_score":0,"hits":[]}}
[00:02:00]               │ERROR [migrate saved objects] request failed (attempt=1/5): socket hang up
[00:02:00]               │ proc [kibana]   log   [17:54:42.029] [error][savedobjects-service] [.kibana_task_manager] Action failed with 'Request timed out'. Retrying attempt 1 out of 10 in 2 seconds.
[00:02:00]               │ proc [kibana]   log   [17:54:42.029] [info][savedobjects-service] [.kibana_task_manager] CREATE_NEW_TARGET -> CREATE_NEW_TARGET
[00:02:01]               │ proc [kibana]   log   [17:54:42.961] [info][savedobjects-service] [.kibana_task_manager] INIT -> CREATE_NEW_TARGET
[00:02:01]               │ proc [kibana]   log   [17:54:42.970] [info][savedobjects-service] [.kibana] INIT -> OUTDATED_DOCUMENTS_SEARCH
[00:02:01]               │ proc [kibana]   log   [17:54:42.984] [info][savedobjects-service] [.kibana] OUTDATED_DOCUMENTS_SEARCH -> UPDATE_TARGET_MAPPINGS
[00:02:01]               │ proc [kibana]   log   [17:54:43.053] [info][savedobjects-service] [.kibana] UPDATE_TARGET_MAPPINGS -> UPDATE_TARGET_MAPPINGS_WAIT_FOR_TASK
[00:02:01]               │ info [o.e.t.LoggingTaskListener] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] 1521 finished with response BulkByScrollResponse[took=38.4ms,timed_out=false,sliceId=null,updated=2,created=0,deleted=0,batches=1,versionConflicts=0,noops=0,retries=0,throttledUntil=0s,bulk_failures=[],search_failures=[]]
[00:02:01]               │ proc [kibana]   log   [17:54:43.162] [info][savedobjects-service] [.kibana] UPDATE_TARGET_MAPPINGS_WAIT_FOR_TASK -> DONE
[00:02:01]               │ proc [kibana]   log   [17:54:43.162] [info][savedobjects-service] [.kibana] Migration completed after 207ms
[00:02:57]               │ proc [kibana]   log   [17:55:38.449] [error][plugins][taskManager] [WorkloadAggregator]: Error: Invalid workload: {"took":0,"timed_out":false,"_shards":{"total":0,"successful":0,"skipped":0,"failed":0},"hits":{"total":{"value":0,"relation":"eq"},"max_score":0,"hits":[]}}
[00:03:57]               │ proc [kibana]   log   [17:56:38.457] [error][plugins][taskManager] [WorkloadAggregator]: Error: Invalid workload: {"took":0,"timed_out":false,"_shards":{"total":0,"successful":0,"skipped":0,"failed":0},"hits":{"total":{"value":0,"relation":"eq"},"max_score":0,"hits":[]}}
[00:04:01]               │ERROR [migrate saved objects] request failed (attempt=2/5): socket hang up
[00:04:01]               │ proc [kibana]   log   [17:56:42.978] [error][savedobjects-service] [.kibana_task_manager] Action failed with 'Request timed out'. Retrying attempt 1 out of 10 in 2 seconds.
[00:04:01]               │ proc [kibana]   log   [17:56:42.979] [info][savedobjects-service] [.kibana_task_manager] CREATE_NEW_TARGET -> CREATE_NEW_TARGET
[00:04:02]               │ proc [kibana]   log   [17:56:44.045] [error][savedobjects-service] [.kibana_task_manager] Action failed with 'Request timed out'. Retrying attempt 2 out of 10 in 4 seconds.
[00:04:02]               │ proc [kibana]   log   [17:56:44.045] [info][savedobjects-service] [.kibana_task_manager] CREATE_NEW_TARGET -> CREATE_NEW_TARGET
[00:04:03]               │ proc [kibana]   log   [17:56:44.983] [info][savedobjects-service] [.kibana_task_manager] INIT -> CREATE_NEW_TARGET
[00:04:03]               │ proc [kibana]   log   [17:56:44.994] [info][savedobjects-service] [.kibana] INIT -> OUTDATED_DOCUMENTS_SEARCH
[00:04:03]               │ proc [kibana]   log   [17:56:45.010] [info][savedobjects-service] [.kibana] OUTDATED_DOCUMENTS_SEARCH -> UPDATE_TARGET_MAPPINGS
[00:04:03]               │ proc [kibana]   log   [17:56:45.075] [info][savedobjects-service] [.kibana] UPDATE_TARGET_MAPPINGS -> UPDATE_TARGET_MAPPINGS_WAIT_FOR_TASK
[00:04:03]               │ info [o.e.t.LoggingTaskListener] [kibana-ci-immutable-ubuntu-18-tests-xxl-1612197749758112052] 2100 finished with response BulkByScrollResponse[took=32.6ms,timed_out=false,sliceId=null,updated=2,created=0,deleted=0,batches=1,versionConflicts=0,noops=0,retries=0,throttledUntil=0s,bulk_failures=[],search_failures=[]]
[00:04:03]               │ proc [kibana]   log   [17:56:45.183] [info][savedobjects-service] [.kibana] UPDATE_TARGET_MAPPINGS_WAIT_FOR_TASK -> DONE
[00:04:03]               │ proc [kibana]   log   [17:56:45.184] [info][savedobjects-service] [.kibana] Migration completed after 206ms
[00:04:57]               │ proc [kibana]   log   [17:57:38.465] [error][plugins][taskManager] [WorkloadAggregator]: Error: Invalid workload: {"took":0,"timed_out":false,"_shards":{"total":0,"successful":0,"skipped":0,"failed":0},"hits":{"total":{"value":0,"relation":"eq"},"max_score":0,"hits":[]}}
[00:05:57]               │ proc [kibana]   log   [17:58:38.470] [error][plugins][taskManager] [WorkloadAggregator]: Error: Invalid workload: {"took":0,"timed_out":false,"_shards":{"total":0,"successful":0,"skipped":0,"failed":0},"hits":{"total":{"value":0,"relation":"eq"},"max_score":0,"hits":[]}}
[00:06:00]               └- ✖ fail: Reporting APIs Generation from Job Params "before all" hook for "Rejects bogus jobParams"
[00:06:00]               │      Error: Timeout of 360000ms exceeded. For async tests and hooks, ensure "done()" is called; if returning a Promise, ensure it resolves. (/dev/shm/workspace/parallel/7/kibana/x-pack/test/reporting_api_integration/reporting_and_security/csv_job_params.ts)
[00:06:00]               │       at listOnTimeout (internal/timers.js:554:17)
[00:06:00]               │       at processTimers (internal/timers.js:497:7)
[00:06:00]               │ 
[00:06:00]               │ 

Stack Trace

Error: Timeout of 360000ms exceeded. For async tests and hooks, ensure "done()" is called; if returning a Promise, ensure it resolves. (/dev/shm/workspace/parallel/7/kibana/x-pack/test/reporting_api_integration/reporting_and_security/csv_job_params.ts)
    at listOnTimeout (internal/timers.js:554:17)
    at processTimers (internal/timers.js:497:7)

Metrics [docs]

✅ unchanged

History

To update your PR or re-run it, just comment with:
@elasticmachine merge upstream

@dmlemeshko dmlemeshko merged commit 2498f57 into elastic:master Feb 1, 2021
@kibanamachine
Contributor

Friendly reminder: Looks like this PR hasn’t been backported yet.
To create backports run node scripts/backport --pr 89544 or prevent reminders by adding the backport:skip label.

@kibanamachine kibanamachine added the backport missing (Added to PRs automatically when they are determined to be missing a backport) label Feb 3, 2021
@kibanamachine
Contributor

Friendly reminder: Looks like this PR hasn’t been backported yet.
To create backports run node scripts/backport --pr 89544 or prevent reminders by adding the backport:skip label.

3 similar comments
@kibanamachine
Contributor

Friendly reminder: Looks like this PR hasn’t been backported yet.
To create backports run node scripts/backport --pr 89544 or prevent reminders by adding the backport:skip label.

6 similar comments
@LeeDr LeeDr added the backport:skip (This commit does not require backporting) label and removed the backport missing (Added to PRs automatically when they are determined to be missing a backport) label Feb 19, 2021
dmlemeshko added a commit to dmlemeshko/kibana that referenced this pull request Feb 19, 2021
…astic#89544)

* [load testing] add env vars to pass simulations and repo rootPath

* pass simulation to sript as argument

* export GATLING_SIMULATIONS

* fix export

* add validation
@dmlemeshko dmlemeshko removed the backport:skip This commit does not require backporting label Feb 19, 2021
dmlemeshko added a commit that referenced this pull request Feb 20, 2021
…9544) (#92049)

* [load testing] add env vars to pass simulations and repo rootPath

* pass simulation to sript as argument

* export GATLING_SIMULATIONS

* fix export

* add validation
@dmlemeshko dmlemeshko deleted the load-testing-custom-args branch January 31, 2022 12:33
Labels
release_note:skip (Skip the PR/issue when compiling release notes), v7.12.0, v8.0.0
Projects
None yet
Development

Successfully merging this pull request may close these issues.

GatlingTestRunner: add cli to run with custom parameters
4 participants