
[Connectors] ServiceNow ITSM & SIR Application #105440

Merged (101 commits) on Oct 12, 2021

Conversation

cnasikas
Member

@cnasikas cnasikas commented Jul 13, 2021

Summary

This PR is the implementation of https://github.com/elastic/security-team/issues/1454 and https://github.com/elastic/security-team/issues/1769.

doc link: https://kibana_105440.docs-preview.app.elstc.co/guide/en/kibana/master/action-types.html

Deferred items

  • ServiceNow button UI changes: already in progress in another PR (cnasikas/kibana@sn_import_set...semd:1769_connectors_ui_changes, branched off this PR), along with some changes the Design team asked for.
  • Rename isLegacy to isDeprecated and make it global. I think we need to discuss it offline in more detail to understand your needs.
  • Add, if possible, the version of the ServiceNow application in the connector's "state" at the creation of the connector.
  • Create generics for the services to avoid casting the params.
  • Add an integration check that simulates the HTML page that appears when an SN instance is in hibernation mode.
  • Check the content-type to see if the response is an HTML page and, if so, return a better message.

Features:

New connectors

  • Users can create only new connectors.
  • A new connector uses the new ServiceNow application. Users without the application installed in their ServiceNow (SN) instances cannot create a connector.
  • The new connectors use the Import Set API and interact with our SN application.
  • ITSM connectors use the "Elastic for ITSM" application.
  • SecOps connectors use the "Elastic for Security Operations" application.
  • Connector's flyout has a warning callout informing the user to install the SN application.

Old connectors

  • Connectors created prior to this PR are marked as legacy.
  • Old connectors function as before.
  • An old connector uses the SN Table API.
  • An old connector can be updated to a new connector.
  • Connector's flyout has a warning callout informing the user that the connector is deprecated.

Alerts

  • SN SecOps will be enabled for Alerts. No observables can be added from within the alerting framework.
  • New connectors can update incidents at the user's will.
  • An alert icon is added to the right of the connector's name in the connectors list to indicate deprecated connectors.

Cases

  • Cases created prior to this PR will work as expected.
  • A user cannot create a new case and assign it to an old connector.
  • A user cannot edit a case and change the connector to an old one.
  • An alert icon is added to the right of the connector's name on the configuration page to indicate deprecated connectors.
  • New connectors will use the new API to bulk add observables from within cases.

Screenshots:

Connector:

Screenshot 2021-09-27 at 7 16 01 PM


Screenshot 2021-09-27 at 7 15 54 PM


Screenshot 2021-09-27 at 7 15 42 PM


Screenshot 2021-09-27 at 7 15 28 PM


Screenshot 2021-09-27 at 7 15 20 PM


Screenshot 2021-09-29 at 6 02 04 PM

Cases:

Screenshot 2021-10-01 at 3 21 13 PM


Screenshot 2021-10-01 at 3 21 21 PM


Screenshot 2021-09-27 at 7 16 34 PM


Screenshot 2021-09-27 at 7 16 20 PM


Screenshot 2021-09-27 at 7 16 12 PM

Checklist


For maintainers

@cnasikas cnasikas added v8.0.0 Team:ResponseOps Label for the ResponseOps team (formerly the Cases and Alerting teams) Team:Threat Hunting Security Solution Threat Hunting Team Team: SecuritySolution Security Solutions Team working on SIEM, Endpoint, Timeline, Resolver, etc. labels Jul 13, 2021
@cnasikas cnasikas self-assigned this Jul 13, 2021
@cnasikas cnasikas changed the title [Actions] ServiceNow: Import Set Web Service [Actions][skip-ci] ServiceNow: Import Set Web Service Jul 13, 2021
@cnasikas cnasikas force-pushed the sn_import_set branch 2 times, most recently from 37737f4 to a7a09e5 Compare July 25, 2021 11:15
@cnasikas cnasikas force-pushed the sn_import_set branch 3 times, most recently from 42327a5 to 9fef005 Compare August 5, 2021 15:24
@elastic elastic deleted a comment from kibanamachine Aug 30, 2021
@cnasikas cnasikas force-pushed the sn_import_set branch 2 times, most recently from 439421f to 836a3b2 Compare September 6, 2021 10:19
@alexfrancoeur

Linking to the design issue: https://github.com/elastic/stack-design-team/issues/109

@pmuellr
Member

pmuellr commented Oct 11, 2021

I just opened Allow connector UX to display warnings about connectors #114507 to discuss making the warning icon a framework-level capability.

Member

@pmuellr pmuellr left a comment

I've marked this as approved. Per a separate conversation, the issues I'd brought up in the previous review can be deferred so that we can get this one merged, but I would like them listed in a section of the first comment as a list of items to finish.

@cnasikas
Member Author

I've marked this as approved. Per a separate conversation, the issues I'd brought up in the previous review can be deferred so that we can get this one merged, but I would like them listed in a section of the first comment as a list of items to finish.

Thank you Patrick! I added the deferred items to the description.

const showLegacyTooltip =
itemConfig?.isLegacy &&
// TODO: Remove when applications are certified
((ENABLE_NEW_SN_ITSM_CONNECTOR && item.actionTypeId === '.servicenow') ||
Contributor

For now this approach should be OK, but we might think about a more generic way of exposing this check. That said, it is on the alerting team to provide this ability.

Member Author

Thank you, Yuliia!

Contributor

@YulNaumenko YulNaumenko left a comment

LGTM

Contributor

@jonathan-buttner jonathan-buttner left a comment

I left a few nits; I think they can all be addressed in the next PR. I also tested the changes. Nice work!

* of our ServiceNow application
*/

describe('config', () => {
Contributor

nit: do we need these unit tests? The configurations seem to be only objects, not functions.

Member Author

The only reason I put these tests there is that the configuration is very important (a wrong table name will make the whole connector unusable). I wanted the developer to be notified somehow in case of a change and to think twice about it. Do you think I can do that otherwise? With TS, for example?
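One TS-based option, as a hedged sketch, is to pin the critical values as literal types so any accidental edit is a compile-time error. All names below (the table name, the interface, the config object) are hypothetical illustrations, not the actual Kibana or ServiceNow identifiers:

```typescript
// Hypothetical sketch: enforce an important config value at compile time
// instead of (or in addition to) unit tests.
// 'elastic_incident_import_set' and the surrounding names are illustrative,
// not the real ServiceNow table or Kibana identifiers.
type ImportSetTable = 'elastic_incident_import_set';

interface ServiceNowConnectorConfig {
  importSetTable: ImportSetTable;
  appScope: string;
}

const itsmConfig: ServiceNowConnectorConfig = {
  // Changing this string to anything else fails to compile,
  // forcing the developer to think twice about the change.
  importSetTable: 'elastic_incident_import_set',
  appScope: 'x_elastic_itsm',
};

console.log(itsmConfig.importSetTable);
```

This only guards against accidental edits in code review; unlike a unit test, it cannot catch a deliberate change to both the type and the value.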

Contributor

Gotcha, leaving them is fine.

* 2.0.
*/

// TODO: Remove when Elastic for ITSM is published.
Contributor

Christos, is the plan to set these to false before feature freeze?

* The user can not use a legacy connector
*/

export const connectorValidator = (
Contributor

Are we using the result message Deprecated connector anywhere? We should probably translate it, right?

If not, I think it'd be clearer if we renamed this to something like isLegacyConnector and returned true or false.

We can fix this in the follow up PR.

Member Author

Each connector validator is used by the Kibana form lib. To indicate an error you have to return an object with a message:

export interface ValidationError<T = string> {
  message: string;
  code?: T;
  validationType?: string;
  __isBlocking__?: boolean;
  [key: string]: any;
}

Normally this text is shown under the field as a form error. In a previous PR, the design team did not want the message shown because we already display a callout with a message. Do you want me to translate it either way?
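A minimal sketch of a validator following the ValidationError shape above. The connector shape and the isLegacy flag are simplified assumptions, not the exact Kibana types:

```typescript
// Simplified version of the form lib's error shape (see the interface above).
interface ValidationError<T = string> {
  message: string;
  code?: T;
  validationType?: string;
}

// Hypothetical, simplified connector shape for illustration only.
interface ConnectorLike {
  config?: { isLegacy?: boolean };
}

// Returns undefined when the connector is valid; the form lib treats a
// returned object as an error. The message is intentionally not translated
// because the UI shows a callout instead of this text.
const connectorValidator = (
  connector: ConnectorLike
): ValidationError | undefined => {
  if (connector.config?.isLegacy) {
    return { message: 'Deprecated connector' };
  }
  return undefined;
};

console.log(connectorValidator({ config: { isLegacy: true } }));
```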

Contributor

Gotcha, nah, I don't think we need to translate it if we're not going to show it. I would maybe add a comment explaining why it isn't translated.

const abortCtrl = new AbortController();
fetchMock.mockResolvedValueOnce(applicationInfoResponse);

try {
Contributor

If this test doesn't throw, it will still succeed. It might be better to wrap the await call in an expect if the test should always throw.

something like:

expect.assertions(1);
await expect(getAppInfo({...})).rejects.toThrow();

}
});

it('returns an error when parsing the json fails', async () => {
Contributor

Hmm I don't see any difference between this test and the one above. Do we need it? Should we add in a mockImplementation for a json parse failure?

Member Author

Bad copy paste from Swimlane 😄

});

test('should return true if there is failure status', async () => {
// @ts-expect-error
Contributor

super nit: it might be clearer to cast {status: 'failure'} as the type we need, since it takes me a second to figure out where the error is coming from.

Member Author

I think it is better to use @ts-expect-error than casting because if the interface changes and there is no longer an error, TS will report that @ts-expect-error is unnecessary. What do you think? Is there any other way I can make the test clearer?

Contributor

Yeah, good point. I guess the reason it wasn't clear to me is that when reviewing the code, if I only look at the tests, I don't immediately know why there will be a TS error. I had to look at the function's implementation to figure it out. It's in the next file though, so not a huge deal haha.
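As a hedged illustration of the trade-off discussed in this thread, with deliberately simplified types (not the actual connector interfaces):

```typescript
// Simplified stand-in for the real response type; for illustration only.
interface ExecutorResponse {
  status: 'ok';
  data: Record<string, unknown>;
}

// Simplified stand-in for the function under test: it inspects a status
// field that the strict type claims can never be 'failure'.
const isFailureStatus = (res: ExecutorResponse): boolean =>
  (res as { status: string }).status === 'failure';

// Style 1: @ts-expect-error. If the interface later widens to allow
// 'failure', TS flags the directive as unnecessary, so the test self-updates.
// The downside: the reader has to find the implementation to see why
// an error is expected here.
// @ts-expect-error 'failure' is not assignable to status: 'ok'
const resViaDirective: ExecutorResponse = { status: 'failure', data: {} };

// Style 2: explicit cast. Clearer at the call site about what is being
// forced, but TS stays silent if the interface changes later.
const resViaCast = { status: 'failure', data: {} } as unknown as ExecutorResponse;

console.log(isFailureStatus(resViaDirective), isFailureStatus(resViaCast));
```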


export const choicesToEuiOptions = (choices: Choice[]): EuiSelectOption[] =>
choices.map((choice) => ({ value: choice.value, text: choice.label }));

export const isRESTApiError = (res: AppInfo | RESTApiError): res is RESTApiError =>
Contributor

@jonathan-buttner jonathan-buttner Oct 11, 2021

We might be able to avoid the expect ts error in the tests by making res: unknown here and doing additional checks for error.message and error.details

Member Author

@cnasikas cnasikas Oct 12, 2021

I am not against it. What is the benefit of not having strict types for what we expect? Or what is the benefit of having them in the first place 😋?

Contributor

Yeah, strict types are probably better haha.
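For reference, the `res: unknown` alternative suggested above could look like the following sketch. The RESTApiError shape is a simplified assumption based on the fields mentioned in the thread (error.message, error.details), not the exact Kibana type:

```typescript
// Hypothetical, simplified error shape; not the exact Kibana interface.
interface RESTApiError {
  error?: { message: string; details?: string };
  status?: string;
}

// Type guard over `unknown`: structural checks replace the casts that the
// strict AppInfo | RESTApiError union requires, at the cost of losing the
// compile-time guarantee about what callers may pass in.
const isRESTApiError = (res: unknown): res is RESTApiError => {
  if (typeof res !== 'object' || res === null) {
    return false;
  }
  const candidate = res as RESTApiError;
  return candidate.error != null || candidate.status === 'failure';
};

console.log(isRESTApiError({ status: 'failure' }));
```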

(key: string, value: string) => editActionSecrets(key, value),
[editActionSecrets]
const afterActionConnectorSave = useCallback(async () => {
// TODO: Implement
Contributor

Do we need to implement this?

Member Author

@cnasikas cnasikas Oct 12, 2021

Nope 🙂 ! Fixed!

export const isRESTApiError = (res: AppInfo | RESTApiError): res is RESTApiError =>
(res as RESTApiError).error != null || (res as RESTApiError).status === 'failure';

export const isFieldInvalid = (
Contributor

Will a field always be invalid if it is undefined? Or is that only true for required fields?

Member Author

Before the component mounts, validation runs. The fields are undefined, and for that reason the errors array is not empty. We do not want to show an error in this scenario. If the user "touches" the text field, the value becomes an empty string (i.e., not undefined) or whatever the user typed. In that case, we want to show an error because the user has interacted with the form.
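The behavior described above can be sketched roughly as follows. The signature is a simplified assumption for illustration, not the exact Kibana helper:

```typescript
// Hypothetical, simplified sketch of the "touched" logic described above:
// a field counts as invalid only after the user has interacted with it
// (value is no longer undefined) AND validation produced errors.
const isFieldInvalid = (
  value: string | undefined,
  errors: string[]
): boolean => value !== undefined && errors.length > 0;

// Pre-mount: value is undefined, so validation errors are suppressed.
console.log(isFieldInvalid(undefined, ['Required']));
// Touched and emptied: value is '', so the error is shown.
console.log(isFieldInvalid('', ['Required']));
```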

})
);

try {
Contributor

nit: might be clearer if we switched this to use expect.assertions(#) and await expect(...).rejects.toThrow(...).

Member Author

You are right! A much better pattern.

@cnasikas
Member Author

@jonathan-buttner

Christos, is the plan to set these to false before feature freeze?

For some reason, I cannot reply below your comment. This will remain true after FF so that QA can test the new functionality. If for some reason we are not certified by the last BC, we will set it to false.

@cnasikas
Member Author

@elasticmachine merge upstream

@cnasikas
Member Author

@elasticmachine merge upstream

@kibanamachine
Contributor

💛 Build succeeded, but was flaky


Test Failures

Kibana Pipeline / general / Chrome X-Pack UI Functional Tests.x-pack/test/functional/apps/ml/data_frame_analytics/feature_importance·ts.machine learning data frame analytics total feature importance panel and decision path popover binary classification job should display the total feature importance in the results view

Link to Jenkins

Standard Out

Failed Tests Reporter:
  - Test has not failed recently on tracked branches

[00:53:41]                 │ info [o.e.x.m.d.p.AnalyticsProcessManager] [node-01] [ihp_fi_binary_1634052120798] Started analyzing
[00:53:41]                 │ info [o.e.x.m.d.p.AnalyticsProcessManager] [node-01] [ihp_fi_binary_1634052120798] Waiting for result processor to complete
[00:53:41]                 │ debg Fetching data frame analytics job stats for job ihp_fi_binary_1634052120798...
[00:53:41]                 │ debg > DFA job stats fetched.
[00:53:41]                 │ debg Waiting up to 120000ms for analytics state to be stopped...
[00:53:41]                 │ debg Fetching analytics state for job ihp_fi_binary_1634052120798
[00:53:41]                 │ debg Fetching data frame analytics job stats for job ihp_fi_binary_1634052120798...
[00:53:41]                 │ debg > DFA job stats fetched.
[00:53:41]                 │ debg --- retry.waitForWithTimeout error: expected analytics state to be stopped but got started
[00:53:41]                 │ debg Fetching analytics state for job ihp_fi_binary_1634052120798
[00:53:41]                 │ debg Fetching data frame analytics job stats for job ihp_fi_binary_1634052120798...
[00:53:41]                 │ debg > DFA job stats fetched.
[00:53:41]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:53:42]                 │ info [o.e.x.m.p.l.CppLogMessageHandler] [node-01] [ihp_fi_binary_1634052120798] [data_frame_analyzer/256884] [CBoostedTreeImpl.cc@243] Exiting hyperparameter optimisation loop on round 7 out of 18.
[00:53:42]                 │ debg Fetching analytics state for job ihp_fi_binary_1634052120798
[00:53:42]                 │ debg Fetching data frame analytics job stats for job ihp_fi_binary_1634052120798...
[00:53:42]                 │ debg > DFA job stats fetched.
[00:53:42]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:53:42]                 │ debg Fetching analytics state for job ihp_fi_binary_1634052120798
[00:53:42]                 │ debg Fetching data frame analytics job stats for job ihp_fi_binary_1634052120798...
[00:53:42]                 │ debg > DFA job stats fetched.
[00:53:42]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:53:43]                 │ info [o.e.x.m.d.p.ChunkedTrainedModelPersister] [node-01] [ihp_fi_binary_1634052120798] finished storing trained model with id [ihp_fi_binary_1634052120798-1634055344395]
[00:53:43]                 │ info [o.e.x.m.d.p.AnalyticsResultProcessor] [node-01] [ihp_fi_binary_1634052120798] Started writing results
[00:53:43]                 │ info [o.e.x.m.d.p.AnalyticsProcessManager] [node-01] [ihp_fi_binary_1634052120798] Result processor has completed
[00:53:43]                 │ info [o.e.x.m.d.p.AnalyticsProcessManager] [node-01] [ihp_fi_binary_1634052120798] Closing process
[00:53:43]                 │ info [o.e.x.m.p.l.CppLogMessageHandler] [node-01] [ihp_fi_binary_1634052120798] [data_frame_analyzer/256884] [Main.cc@253] [{"name":"E_DFTPMEstimatedPeakMemoryUsage","description":"The upfront estimate of the peak memory training the predictive model would use","value":10150717}
[00:53:43]                 │      ,{"name":"E_DFTPMPeakMemoryUsage","description":"The peak memory training the predictive model used","value":384019}
[00:53:43]                 │      ,{"name":"E_DFTPMTimeToTrain","description":"The time it took to train the predictive model","value":1228}
[00:53:43]                 │      ,{"name":"E_DFTPMTrainedForestNumberTrees","description":"The total number of trees in the trained forest","value":1}
[00:53:43]                 │      ]
[00:53:43]                 │ info [o.e.x.m.p.AbstractNativeProcess] [node-01] [ihp_fi_binary_1634052120798] State output finished
[00:53:43]                 │ info [o.e.x.m.d.p.AnalyticsProcessManager] [node-01] [ihp_fi_binary_1634052120798] Closed process
[00:53:43]                 │ info [o.e.x.m.d.i.InferenceRunner] [node-01] [ihp_fi_binary_1634052120798] Started inference on test data against model [ihp_fi_binary_1634052120798-1634055344395]
[00:53:43]                 │ debg Fetching analytics state for job ihp_fi_binary_1634052120798
[00:53:43]                 │ debg Fetching data frame analytics job stats for job ihp_fi_binary_1634052120798...
[00:53:43]                 │ debg > DFA job stats fetched.
[00:53:43]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:53:43]                 │ info [o.e.x.m.d.DataFrameAnalyticsManager] [node-01] [ihp_fi_binary_1634052120798] Marking task completed
[00:53:43]                 │ debg Fetching analytics state for job ihp_fi_binary_1634052120798
[00:53:43]                 │ debg Fetching data frame analytics job stats for job ihp_fi_binary_1634052120798...
[00:53:43]                 │ debg > DFA job stats fetched.
[00:53:43]                 │ info [x-pack/test/functional/es_archives/ml/ihp_outlier] Loading "mappings.json"
[00:53:43]                 │ info [x-pack/test/functional/es_archives/ml/ihp_outlier] Loading "data.json.gz"
[00:53:43]                 │ info [x-pack/test/functional/es_archives/ml/ihp_outlier] Skipped restore for existing index "ft_ihp_outlier"
[00:53:43]                 │ debg Searching for 'index-pattern' with title 'ft_ihp_outlier'...
[00:53:43]                 │ debg  > Found '4cbbf280-2b70-11ec-8289-d524444b36f9'
[00:53:43]                 │ debg Index pattern with title 'ft_ihp_outlier' already exists. Nothing to create.
[00:53:43]                 │ debg Creating data frame analytic job with id 'ihp_fi_multi_1634052120798' ...
[00:53:44]                 │ debg Waiting up to 5000ms for 'ihp_fi_multi_1634052120798' to exist...
[00:53:44]                 │ debg Fetching data frame analytics job 'ihp_fi_multi_1634052120798'...
[00:53:44]                 │ debg > DFA job fetched.
[00:53:44]                 │ debg > DFA job created.
[00:53:44]                 │ debg Starting data frame analytics job 'ihp_fi_multi_1634052120798'...
[00:53:45]                 │ info [o.e.x.m.a.TransportStartDataFrameAnalyticsAction] [node-01] [ihp_fi_multi_1634052120798] Starting data frame analytics from state [stopped]
[00:53:45]                 │ debg > DFA job started.
[00:53:45]                 │ debg Waiting up to 60000ms for 'ihp_fi_multi_1634052120798' to have training_docs_count > 0...
[00:53:45]                 │ debg Fetching data frame analytics job stats for job ihp_fi_multi_1634052120798...
[00:53:45]                 │ info [o.e.x.m.d.s.ReindexingStep] [node-01] [ihp_fi_multi_1634052120798] Creating destination index [user-ihp_fi_multi_1634052120798]
[00:53:45]                 │ debg > DFA job stats fetched.
[00:53:45]                 │ debg --- retry.waitForWithTimeout error: expected data frame analytics job 'ihp_fi_multi_1634052120798' to have training_docs_count > 0 (got 0)
[00:53:45]                 │ info [o.e.c.m.MetadataCreateIndexService] [node-01] [user-ihp_fi_multi_1634052120798] creating index, cause [api], templates [], shards [1]/[1]
[00:53:45]                 │ info [o.e.x.m.d.s.ReindexingStep] [node-01] [ihp_fi_multi_1634052120798] Started reindexing
[00:53:45]                 │ debg Fetching data frame analytics job stats for job ihp_fi_multi_1634052120798...
[00:53:45]                 │ debg > DFA job stats fetched.
[00:53:45]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:53:45]                 │ info [o.e.x.m.d.p.AnalyticsProcessManager] [node-01] [ihp_fi_multi_1634052120798] Started loading data
[00:53:45]                 │ info [o.e.x.m.d.p.AnalyticsProcessManager] [node-01] [ihp_fi_multi_1634052120798] Started analyzing
[00:53:45]                 │ info [o.e.x.m.d.p.AnalyticsProcessManager] [node-01] [ihp_fi_multi_1634052120798] Waiting for result processor to complete
[00:53:46]                 │ debg Fetching data frame analytics job stats for job ihp_fi_multi_1634052120798...
[00:53:46]                 │ debg > DFA job stats fetched.
[00:53:46]                 │ debg Waiting up to 120000ms for analytics state to be stopped...
[00:53:46]                 │ debg Fetching analytics state for job ihp_fi_multi_1634052120798
[00:53:46]                 │ debg Fetching data frame analytics job stats for job ihp_fi_multi_1634052120798...
[00:53:46]                 │ debg > DFA job stats fetched.
[00:53:46]                 │ debg --- retry.waitForWithTimeout error: expected analytics state to be stopped but got started
[00:53:46]                 │ debg Fetching analytics state for job ihp_fi_multi_1634052120798
[00:53:46]                 │ debg Fetching data frame analytics job stats for job ihp_fi_multi_1634052120798...
[00:53:46]                 │ debg > DFA job stats fetched.
[00:53:46]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:53:47]                 │ debg Fetching analytics state for job ihp_fi_multi_1634052120798
[00:53:47]                 │ debg Fetching data frame analytics job stats for job ihp_fi_multi_1634052120798...
[00:53:47]                 │ debg > DFA job stats fetched.
[00:53:47]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:53:47]                 │ info [o.e.x.m.p.l.CppLogMessageHandler] [node-01] [ihp_fi_multi_1634052120798] [data_frame_analyzer/257195] [CBoostedTreeImpl.cc@243] Exiting hyperparameter optimisation loop on round 7 out of 18.
[00:53:47]                 │ debg Fetching analytics state for job ihp_fi_multi_1634052120798
[00:53:47]                 │ debg Fetching data frame analytics job stats for job ihp_fi_multi_1634052120798...
[00:53:47]                 │ debg > DFA job stats fetched.
[00:53:47]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:53:47]                 │ info [o.e.x.m.d.p.ChunkedTrainedModelPersister] [node-01] [ihp_fi_multi_1634052120798] finished storing trained model with id [ihp_fi_multi_1634052120798-1634055349247]
[00:53:47]                 │ info [o.e.x.m.d.p.AnalyticsResultProcessor] [node-01] [ihp_fi_multi_1634052120798] Started writing results
[00:53:48]                 │ info [o.e.x.m.d.p.AnalyticsProcessManager] [node-01] [ihp_fi_multi_1634052120798] Result processor has completed
[00:53:48]                 │ info [o.e.x.m.d.p.AnalyticsProcessManager] [node-01] [ihp_fi_multi_1634052120798] Closing process
[00:53:48]                 │ info [o.e.x.m.p.l.CppLogMessageHandler] [node-01] [ihp_fi_multi_1634052120798] [data_frame_analyzer/257195] [Main.cc@253] [{"name":"E_DFTPMEstimatedPeakMemoryUsage","description":"The upfront estimate of the peak memory training the predictive model would use","value":22722629}
[00:53:48]                 │      ,{"name":"E_DFTPMPeakMemoryUsage","description":"The peak memory training the predictive model used","value":2173706}
[00:53:48]                 │      ,{"name":"E_DFTPMTimeToTrain","description":"The time it took to train the predictive model","value":1960}
[00:53:48]                 │      ,{"name":"E_DFTPMTrainedForestNumberTrees","description":"The total number of trees in the trained forest","value":5}
[00:53:48]                 │      ]
[00:53:48]                 │ info [o.e.x.m.p.AbstractNativeProcess] [node-01] [ihp_fi_multi_1634052120798] State output finished
[00:53:48]                 │ info [o.e.x.m.d.p.AnalyticsProcessManager] [node-01] [ihp_fi_multi_1634052120798] Closed process
[00:53:48]                 │ info [o.e.x.m.d.i.InferenceRunner] [node-01] [ihp_fi_multi_1634052120798] Started inference on test data against model [ihp_fi_multi_1634052120798-1634055349247]
[00:53:48]                 │ debg Fetching analytics state for job ihp_fi_multi_1634052120798
[00:53:48]                 │ debg Fetching data frame analytics job stats for job ihp_fi_multi_1634052120798...
[00:53:48]                 │ debg > DFA job stats fetched.
[00:53:48]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:53:48]                 │ info [o.e.x.m.d.DataFrameAnalyticsManager] [node-01] [ihp_fi_multi_1634052120798] Marking task completed
[00:53:48]                 │ debg Fetching analytics state for job ihp_fi_multi_1634052120798
[00:53:48]                 │ debg Fetching data frame analytics job stats for job ihp_fi_multi_1634052120798...
[00:53:48]                 │ debg > DFA job stats fetched.
[00:53:48]                 │ info [x-pack/test/functional/es_archives/ml/egs_regression] Loading "mappings.json"
[00:53:48]                 │ info [x-pack/test/functional/es_archives/ml/egs_regression] Loading "data.json.gz"
[00:53:48]                 │ info [x-pack/test/functional/es_archives/ml/egs_regression] Skipped restore for existing index "ft_egs_regression"
[00:53:48]                 │ debg Searching for 'index-pattern' with title 'ft_egs_regression'...
[00:53:48]                 │ debg  > Found 'e5ab0b10-2b76-11ec-8289-d524444b36f9'
[00:53:48]                 │ debg Index pattern with title 'ft_egs_regression' already exists. Nothing to create.
[00:53:48]                 │ debg Creating data frame analytic job with id 'egs_fi_reg_1634052120798' ...
[00:53:49]                 │ debg Waiting up to 5000ms for 'egs_fi_reg_1634052120798' to exist...
[00:53:49]                 │ debg Fetching data frame analytics job 'egs_fi_reg_1634052120798'...
[00:53:49]                 │ debg > DFA job fetched.
[00:53:49]                 │ debg > DFA job created.
[00:53:49]                 │ debg Starting data frame analytics job 'egs_fi_reg_1634052120798'...
[00:53:50]                 │ info [o.e.x.m.a.TransportStartDataFrameAnalyticsAction] [node-01] [egs_fi_reg_1634052120798] Starting data frame analytics from state [stopped]
[00:53:50]                 │ debg > DFA job started.
[00:53:50]                 │ debg Waiting up to 60000ms for 'egs_fi_reg_1634052120798' to have training_docs_count > 0...
[00:53:50]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:53:50]                 │ info [o.e.x.m.d.s.ReindexingStep] [node-01] [egs_fi_reg_1634052120798] Creating destination index [user-egs_fi_reg_1634052120798]
[00:53:50]                 │ debg > DFA job stats fetched.
[00:53:50]                 │ debg --- retry.waitForWithTimeout error: expected data frame analytics job 'egs_fi_reg_1634052120798' to have training_docs_count > 0 (got 0)
[00:53:50]                 │ info [o.e.c.m.MetadataCreateIndexService] [node-01] [user-egs_fi_reg_1634052120798] creating index, cause [api], templates [], shards [1]/[1]
[00:53:50]                 │ info [o.e.x.m.d.s.ReindexingStep] [node-01] [egs_fi_reg_1634052120798] Started reindexing
[00:53:50]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:53:50]                 │ debg > DFA job stats fetched.
[00:53:50]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:53:50]                 │ info [o.e.x.m.d.p.AnalyticsProcessManager] [node-01] [egs_fi_reg_1634052120798] Started loading data
[00:53:50]                 │ info [o.e.x.m.d.p.AnalyticsProcessManager] [node-01] [egs_fi_reg_1634052120798] Started analyzing
[00:53:50]                 │ info [o.e.x.m.d.p.AnalyticsProcessManager] [node-01] [egs_fi_reg_1634052120798] Waiting for result processor to complete
[00:53:51]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:53:51]                 │ debg > DFA job stats fetched.
[00:53:51]                 │ debg Waiting up to 120000ms for analytics state to be stopped...
[00:53:51]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:53:51]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:53:51]                 │ debg > DFA job stats fetched.
[00:53:51]                 │ debg --- retry.waitForWithTimeout error: expected analytics state to be stopped but got started
[00:53:51]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:53:51]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:53:51]                 │ debg > DFA job stats fetched.
[00:53:51]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:53:52]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:53:52]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:53:52]                 │ debg > DFA job stats fetched.
[00:53:52]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:53:52]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:53:52]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:53:52]                 │ debg > DFA job stats fetched.
[00:53:52]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:53:53]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:53:53]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:53:53]                 │ debg > DFA job stats fetched.
[00:53:53]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:53:53]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:53:53]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:53:53]                 │ debg > DFA job stats fetched.
[00:53:53]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:53:54]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:53:54]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:53:54]                 │ debg > DFA job stats fetched.
[00:53:54]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:53:54]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:53:54]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:53:54]                 │ debg > DFA job stats fetched.
[00:53:54]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:53:55]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:53:55]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:53:55]                 │ debg > DFA job stats fetched.
[00:53:55]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:53:55]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:53:55]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:53:55]                 │ debg > DFA job stats fetched.
[00:53:55]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:53:56]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:53:56]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:53:56]                 │ debg > DFA job stats fetched.
[00:53:56]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:53:56]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:53:56]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:53:56]                 │ debg > DFA job stats fetched.
[00:53:56]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:53:57]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:53:57]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:53:57]                 │ debg > DFA job stats fetched.
[00:53:57]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:53:57]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:53:57]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:53:57]                 │ debg > DFA job stats fetched.
[00:53:57]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:53:58]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:53:58]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:53:58]                 │ debg > DFA job stats fetched.
[00:53:58]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:53:58]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:53:58]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:53:58]                 │ debg > DFA job stats fetched.
[00:53:58]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:53:59]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:53:59]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:53:59]                 │ debg > DFA job stats fetched.
[00:53:59]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:53:59]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:53:59]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:53:59]                 │ debg > DFA job stats fetched.
[00:53:59]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:54:00]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:54:00]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:54:00]                 │ debg > DFA job stats fetched.
[00:54:00]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:54:00]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:54:00]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:54:00]                 │ debg > DFA job stats fetched.
[00:54:00]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:54:01]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:54:01]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:54:01]                 │ debg > DFA job stats fetched.
[00:54:01]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:54:01]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:54:01]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:54:01]                 │ debg > DFA job stats fetched.
[00:54:01]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:54:02]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:54:02]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:54:02]                 │ debg > DFA job stats fetched.
[00:54:02]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:54:03]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:54:03]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:54:03]                 │ debg > DFA job stats fetched.
[00:54:03]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:54:03]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:54:03]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:54:03]                 │ debg > DFA job stats fetched.
[00:54:03]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:54:04]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:54:04]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:54:04]                 │ debg > DFA job stats fetched.
[00:54:04]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:54:04]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:54:04]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:54:04]                 │ debg > DFA job stats fetched.
[00:54:04]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:54:05]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:54:05]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:54:05]                 │ debg > DFA job stats fetched.
[00:54:05]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:54:05]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:54:05]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:54:05]                 │ debg > DFA job stats fetched.
[00:54:05]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:54:06]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:54:06]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:54:06]                 │ debg > DFA job stats fetched.
[00:54:06]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:54:06]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:54:06]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:54:06]                 │ debg > DFA job stats fetched.
[00:54:06]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:54:07]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:54:07]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:54:07]                 │ debg > DFA job stats fetched.
[00:54:07]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:54:07]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:54:07]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:54:07]                 │ debg > DFA job stats fetched.
[00:54:07]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:54:07]                 │ info [o.e.x.m.p.l.CppLogMessageHandler] [node-01] [egs_fi_reg_1634052120798] [data_frame_analyzer/257507] [CBoostedTreeImpl.cc@243] Exiting hyperparameter optimisation loop on round 6 out of 16.
[00:54:08]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:54:08]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:54:08]                 │ debg > DFA job stats fetched.
[00:54:08]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:54:08]                 │ info [o.e.x.m.d.p.ChunkedTrainedModelPersister] [node-01] [egs_fi_reg_1634052120798] finished storing trained model with id [egs_fi_reg_1634052120798-1634055369670]
[00:54:08]                 │ info [o.e.x.m.d.p.AnalyticsResultProcessor] [node-01] [egs_fi_reg_1634052120798] Started writing results
[00:54:08]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:54:08]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:54:08]                 │ debg > DFA job stats fetched.
[00:54:08]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:54:08]                 │ info [o.e.x.m.d.p.AnalyticsProcessManager] [node-01] [egs_fi_reg_1634052120798] Result processor has completed
[00:54:08]                 │ info [o.e.x.m.d.p.AnalyticsProcessManager] [node-01] [egs_fi_reg_1634052120798] Closing process
[00:54:08]                 │ info [o.e.x.m.p.l.CppLogMessageHandler] [node-01] [egs_fi_reg_1634052120798] [data_frame_analyzer/257507] [Main.cc@253] [{"name":"E_DFTPMEstimatedPeakMemoryUsage","description":"The upfront estimate of the peak memory training the predictive model would use","value":14796184}
[00:54:08]                 │      ,{"name":"E_DFTPMPeakMemoryUsage","description":"The peak memory training the predictive model used","value":7628042}
[00:54:08]                 │      ,{"name":"E_DFTPMTimeToTrain","description":"The time it took to train the predictive model","value":17398}
[00:54:08]                 │      ,{"name":"E_DFTPMTrainedForestNumberTrees","description":"The total number of trees in the trained forest","value":499}
[00:54:08]                 │      ]
[00:54:08]                 │ info [o.e.x.m.p.AbstractNativeProcess] [node-01] [egs_fi_reg_1634052120798] State output finished
[00:54:08]                 │ info [o.e.x.m.d.p.AnalyticsProcessManager] [node-01] [egs_fi_reg_1634052120798] Closed process
[00:54:08]                 │ info [o.e.x.m.d.i.InferenceRunner] [node-01] [egs_fi_reg_1634052120798] Started inference on test data against model [egs_fi_reg_1634052120798-1634055369670]
[00:54:09]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:54:09]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:54:09]                 │ debg > DFA job stats fetched.
[00:54:09]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:54:09]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:54:09]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:54:09]                 │ debg > DFA job stats fetched.
[00:54:09]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:54:10]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:54:10]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:54:10]                 │ debg > DFA job stats fetched.
[00:54:10]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:54:10]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:54:10]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:54:10]                 │ debg > DFA job stats fetched.
[00:54:10]                 │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:54:11]                 │ info [o.e.x.m.d.DataFrameAnalyticsManager] [node-01] [egs_fi_reg_1634052120798] Marking task completed
[00:54:11]                 │ debg Fetching analytics state for job egs_fi_reg_1634052120798
[00:54:11]                 │ debg Fetching data frame analytics job stats for job egs_fi_reg_1634052120798...
[00:54:11]                 │ debg > DFA job stats fetched.
[00:54:11]               └-: binary classification job
[00:54:11]                 └-> "before all" hook for "should display the total feature importance in the results view"
[00:54:11]                 └-> "before all" hook for "should display the total feature importance in the results view"
[00:54:11]                   │ debg navigating to ml url: http://localhost:61191/app/ml
[00:54:11]                   │ debg navigate to: http://localhost:61191/app/ml
[00:54:11]                   │ debg browser[INFO] http://localhost:61191/app/ml?_t=1634055372486 281 Refused to execute inline script because it violates the following Content Security Policy directive: "script-src 'unsafe-eval' 'self'". Either the 'unsafe-inline' keyword, a hash ('sha256-P5polb1UreUSOe5V/Pv7tc+yeZuJXiOi/3fqhGsU7BE='), or a nonce ('nonce-...') is required to enable inline execution.
[00:54:11]                   │
[00:54:11]                   │ debg browser[INFO] http://localhost:61191/bootstrap.js 41:19 "^ A single error about an inline script not firing due to content security policy is expected!"
[00:54:11]                   │ debg ... sleep(700) start
[00:54:12]                   │ debg ... sleep(700) end
[00:54:12]                   │ debg returned from get, calling refresh
[00:54:13]                   │ debg browser[INFO] http://localhost:61191/app/ml?_t=1634055372486 281 Refused to execute inline script because it violates the following Content Security Policy directive: "script-src 'unsafe-eval' 'self'". Either the 'unsafe-inline' keyword, a hash ('sha256-P5polb1UreUSOe5V/Pv7tc+yeZuJXiOi/3fqhGsU7BE='), or a nonce ('nonce-...') is required to enable inline execution.
[00:54:13]                   │
[00:54:13]                   │ debg browser[INFO] http://localhost:61191/bootstrap.js 41:19 "^ A single error about an inline script not firing due to content security policy is expected!"
[00:54:13]                   │ debg currentUrl = http://localhost:61191/app/ml
[00:54:13]                   │          appUrl = http://localhost:61191/app/ml
[00:54:13]                   │ debg TestSubjects.find(kibanaChrome)
[00:54:13]                   │ debg Find.findByCssSelector('[data-test-subj="kibanaChrome"]') with timeout=60000
[00:54:13]                   │ debg ... sleep(501) start
[00:54:14]                   │ debg ... sleep(501) end
[00:54:14]                   │ debg in navigateTo url = http://localhost:61191/app/ml/overview
[00:54:14]                   │ debg --- retry.tryForTime error: URL changed, waiting for it to settle
[00:54:14]                   │ debg ... sleep(501) start
[00:54:15]                   │ debg ... sleep(501) end
[00:54:15]                   │ debg in navigateTo url = http://localhost:61191/app/ml/overview
[00:54:15]                   │ debg TestSubjects.exists(mlApp)
[00:54:15]                   │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="mlApp"]') with timeout=2000
[00:54:15]                   │ debg TestSubjects.click(~mlMainTab & ~dataFrameAnalytics)
[00:54:15]                   │ debg Find.clickByCssSelector('[data-test-subj~="mlMainTab"][data-test-subj~="dataFrameAnalytics"]') with timeout=10000
[00:54:15]                   │ debg Find.findByCssSelector('[data-test-subj~="mlMainTab"][data-test-subj~="dataFrameAnalytics"]') with timeout=10000
[00:54:15]                   │ debg TestSubjects.exists(~mlMainTab & ~dataFrameAnalytics & ~selected)
[00:54:15]                   │ debg Find.existsByDisplayedByCssSelector('[data-test-subj~="mlMainTab"][data-test-subj~="dataFrameAnalytics"][data-test-subj~="selected"]') with timeout=120000
[00:54:15]                   │ debg TestSubjects.exists(mlPageDataFrameAnalytics)
[00:54:15]                   │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="mlPageDataFrameAnalytics"]') with timeout=120000
[00:54:15]                   │ debg TestSubjects.exists(~mlAnalyticsTable)
[00:54:15]                   │ debg Find.existsByDisplayedByCssSelector('[data-test-subj~="mlAnalyticsTable"]') with timeout=60000
[00:54:15]                   │ debg TestSubjects.exists(mlAnalyticsTable loaded)
[00:54:15]                   │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="mlAnalyticsTable loaded"]') with timeout=30000
[00:54:15]                   │ debg Searching for 'index-pattern' with title 'user-ihp_fi_binary_1634052120798'...
[00:54:15]                   │ debg  > Not found
[00:54:15]                   │ debg Creating index pattern with title 'user-ihp_fi_binary_1634052120798'
[00:54:15]                   │ debg Waiting up to 5000ms for index-pattern with title 'user-ihp_fi_binary_1634052120798' to exist...
[00:54:15]                   │ debg Searching for 'index-pattern' with title 'user-ihp_fi_binary_1634052120798'...
[00:54:15]                   │ debg  > Found 'ba80d270-2b77-11ec-8289-d524444b36f9'
[00:54:15]                   │ debg  > Created with id 'ba80d270-2b77-11ec-8289-d524444b36f9'
[00:54:15]                   │ debg TestSubjects.exists(~mlAnalyticsTable > ~row-ihp_fi_binary_1634052120798 > mlAnalyticsJobViewButton)
[00:54:15]                   │ debg Find.existsByDisplayedByCssSelector('[data-test-subj~="mlAnalyticsTable"] [data-test-subj~="row-ihp_fi_binary_1634052120798"] [data-test-subj="mlAnalyticsJobViewButton"]') with timeout=120000
[00:54:15]                   │ debg TestSubjects.click(~mlAnalyticsTable > ~row-ihp_fi_binary_1634052120798 > mlAnalyticsJobViewButton)
[00:54:15]                   │ debg Find.clickByCssSelector('[data-test-subj~="mlAnalyticsTable"] [data-test-subj~="row-ihp_fi_binary_1634052120798"] [data-test-subj="mlAnalyticsJobViewButton"]') with timeout=10000
[00:54:15]                   │ debg Find.findByCssSelector('[data-test-subj~="mlAnalyticsTable"] [data-test-subj~="row-ihp_fi_binary_1634052120798"] [data-test-subj="mlAnalyticsJobViewButton"]') with timeout=10000
[00:54:16]                   │ debg TestSubjects.exists(mlPageDataFrameAnalyticsExploration)
[00:54:16]                   │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="mlPageDataFrameAnalyticsExploration"]') with timeout=20000
[00:54:16]                 └-> should display the total feature importance in the results view
[00:54:16]                   └-> "before each" hook: global before each for "should display the total feature importance in the results view"
[00:54:16]                   │ debg TestSubjects.exists(mlDFExpandableSection-FeatureImportanceSummary)
[00:54:16]                   │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="mlDFExpandableSection-FeatureImportanceSummary"]') with timeout=120000
[00:54:16]                   │ debg TestSubjects.exists(mlTotalFeatureImportanceChart)
[00:54:16]                   │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="mlTotalFeatureImportanceChart"]') with timeout=5000
[00:54:19]                   │ debg --- retry.tryForTime error: [data-test-subj="mlTotalFeatureImportanceChart"] is not displayed
[00:54:22]                   │ debg --- retry.tryForTime failed again with the same message...
[00:54:23]                   │ info Taking screenshot "/dev/shm/workspace/parallel/19/kibana/x-pack/test/functional/screenshots/failure/machine learning  data frame analytics total feature importance panel and decision path popover binary classification job should display the total feature importance in the results view.png"
[00:54:23]                   │ info Current URL is: http://localhost:61191/app/ml/data_frame_analytics/exploration?_g=(ml%3A(analysisType%3Aclassification%2CjobId%3Aihp_fi_binary_1634052120798))
[00:54:23]                   │ info Saving page source to: /dev/shm/workspace/parallel/19/kibana/x-pack/test/functional/failure_debug/html/machine learning  data frame analytics total feature importance panel and decision path popover binary classification job should display the total feature importance in the results view.html
[00:54:23]                   └- ✖ fail: machine learning  data frame analytics total feature importance panel and decision path popover binary classification job should display the total feature importance in the results view
[00:54:23]                   │      Error: expected testSubject(mlTotalFeatureImportanceChart) to exist
[00:54:23]                   │       at TestSubjects.existOrFail (/dev/shm/workspace/parallel/19/kibana/test/functional/services/common/test_subjects.ts:45:13)
[00:54:23]                   │       at Object.assertTotalFeatureImportanceEvaluatePanelExists (test/functional/services/ml/data_frame_analytics_results.ts:76:7)
[00:54:23]                   │       at Context.<anonymous> (test/functional/apps/ml/data_frame_analytics/feature_importance.ts:201:11)
[00:54:23]                   │       at Object.apply (/dev/shm/workspace/parallel/19/kibana/node_modules/@kbn/test/target_node/functional_test_runner/lib/mocha/wrap_function.js:87:16)
[00:54:23]                   │ 
[00:54:23]                   │ 

Stack Trace

Error: expected testSubject(mlTotalFeatureImportanceChart) to exist
    at TestSubjects.existOrFail (/dev/shm/workspace/parallel/19/kibana/test/functional/services/common/test_subjects.ts:45:13)
    at Object.assertTotalFeatureImportanceEvaluatePanelExists (test/functional/services/ml/data_frame_analytics_results.ts:76:7)
    at Context.<anonymous> (test/functional/apps/ml/data_frame_analytics/feature_importance.ts:201:11)
    at Object.apply (/dev/shm/workspace/parallel/19/kibana/node_modules/@kbn/test/target_node/functional_test_runner/lib/mocha/wrap_function.js:87:16)
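The failure comes from `TestSubjects.existOrFail`, which polls for an element and throws once a timeout expires — the repeated `retry.tryForTime` lines above show each failed poll. A minimal self-contained sketch of that poll-until-timeout pattern; the function shape and the `checkFn` parameter are illustrative assumptions, not Kibana's actual service API:

```typescript
// Sketch of a poll-until-timeout assertion, loosely modeled on the
// retry.tryForTime / existOrFail behavior in the log above. The names
// and signature here are illustrative, not Kibana's real API.
async function existOrFail(
  subject: string,
  checkFn: () => Promise<boolean>,
  timeoutMs = 5000,
  intervalMs = 500
): Promise<void> {
  const deadline = Date.now() + timeoutMs;
  let lastError: Error | undefined;
  while (Date.now() < deadline) {
    try {
      if (await checkFn()) return; // element found: assertion passes
      lastError = new Error(`expected testSubject(${subject}) to exist`);
    } catch (e) {
      lastError = e as Error; // remember the failure, retry until deadline
    }
    await new Promise((r) => setTimeout(r, intervalMs));
  }
  throw lastError ?? new Error(`expected testSubject(${subject}) to exist`);
}
```

In the run above, every poll for `mlTotalFeatureImportanceChart` returned "not displayed" until the 5000ms budget was spent, so the final error is the last remembered failure.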

Metrics [docs]

Module Count

Fewer modules lead to a faster build time.

| id | before | after | diff |
| --- | ---: | ---: | ---: |
| cases | 335 | 338 | +3 |
| triggersActionsUi | 366 | 376 | +10 |
| total | | | +13 |

Public APIs missing comments

Total count of every public API that lacks a comment. Target amount is 0. Run `node scripts/build_api_docs --plugin [yourplugin] --stats comments` for more detailed information.

| id | before | after | diff |
| --- | ---: | ---: | ---: |
| cases | 431 | 432 | +1 |
| triggersActionsUi | 230 | 229 | -1 |
| total | | | 0 |
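The "Public APIs missing comments" metric counts exported plugin APIs that lack a doc comment; the remedy is a TSDoc block on each public export. A hedged illustration — the function name and logic below are hypothetical, written for example purposes rather than taken from this PR:

```typescript
/**
 * Returns true when the given connector type targets a ServiceNow instance.
 *
 * (Illustrative example of a TSDoc-commented public export; the name and
 * body are hypothetical, not part of the actual plugin API.)
 */
export function isServiceNowConnector(actionTypeId: string): boolean {
  return actionTypeId === '.servicenow' || actionTypeId === '.servicenow-sir';
}
```

Documented exports like this are what `build_api_docs --stats comments` counts toward the target of 0 missing comments.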

Async chunks

Total size of all lazy-loaded chunks that will be downloaded as the user navigates the app.

| id | before | after | diff |
| --- | ---: | ---: | ---: |
| cases | 310.4KB | 316.6KB | +6.2KB |
| triggersActionsUi | 762.7KB | 787.5KB | +24.8KB |
| uptime | 568.6KB | 568.6KB | +45.0B |
| total | | | +31.0KB |
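Async chunks are produced by dynamic `import()`: the bundler splits the imported module into a separate file that is only fetched when the importing code path actually runs, which is why the chunk sizes above grow as new lazily loaded UI is added. A minimal sketch, using a Node builtin as a stand-in for a lazily loaded plugin module (the function name is illustrative):

```typescript
// Dynamic import() defers loading the target module until this function
// is called; a bundler emits the target as a separate async chunk.
// `node:util` here is a stand-in for a lazily loaded plugin module.
export async function loadLazyFormatter(): Promise<(n: number) => string> {
  const { format } = await import('node:util'); // loaded on demand
  return (n: number) => format('%d KB', n);
}
```

Because the module is resolved only on first call, its bytes count against the "Async chunks" budget rather than the page load bundle.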

Page load bundle

Size of the bundles that are downloaded on every page load. Target size is below 100 KB.

| id | before | after | diff |
| --- | ---: | ---: | ---: |
| cases | 80.5KB | 80.5KB | +5.0B |
| securitySolution | 104.0KB | 104.0KB | +18.0B |
| triggersActionsUi | 51.0KB | 50.6KB | -314.0B |
| total | | | -291.0B |
Unknown metric groups

API count

| id | before | after | diff |
| --- | ---: | ---: | ---: |
| cases | 475 | 476 | +1 |
| triggersActionsUi | 239 | 238 | -1 |
| total | | | -0 |

History

To update your PR or re-run it, just comment with:
@elasticmachine merge upstream

cc @cnasikas

@cnasikas cnasikas added the auto-backport Deprecated - use backport:version if exact versions are needed label Oct 12, 2021
@cnasikas cnasikas merged commit 7ffebf1 into elastic:master Oct 12, 2021
@cnasikas cnasikas deleted the sn_import_set branch October 12, 2021 17:58
@kibanamachine (Contributor)

💔 Backport failed

| Status | Branch | Result |
| --- | --- | --- |
| | 7.x | Commit could not be cherry-picked due to conflicts |

To backport manually, run:

`node scripts/backport --pr 105440`

Labels

- auto-backport (Deprecated - use backport:version if exact versions are needed)
- docs
- release_note:enhancement
- Team:ResponseOps (Label for the ResponseOps team, formerly the Cases and Alerting teams)
- Team: SecuritySolution (Security Solutions Team working on SIEM, Endpoint, Timeline, Resolver, etc.)
- Team:Threat Hunting (Security Solution Threat Hunting Team)
- Team:Uptime - DEPRECATED (Synthetics & RUM sub-team of Application Observability)
- v7.16.0
- v8.0.0