[ML] Explain log rate spikes: Move API stream demos to Kibana examples. #132590
Conversation
Pinging @elastic/ml-ui (:ml)
The changes related to a new package creation LGTM
@qn895 addressed your comments, ready for another look!
💚 Build Succeeded

cc @walterra
Tested locally and LGTM. Just one suggestion for the title.
developerExamples.register({
  appId: 'response-stream',
  title: 'response stream',
I'd go with `Response stream` as the title for the example.
Latest changes LGTM 🎉 Amazing work!!
…amples. (#182690)

## Summary

Follow up to #132590. Part of #181111.

This updates the developer examples for `@kbn/ml-response-stream` to include a variant with a full Redux Toolkit setup. For this case, `@kbn/ml-response-stream` now includes a generic slice `streamSlice` that can be used. This allows the actions that get streamed via NDJSON to be shared across server and client.

Functional tests for the examples were added too. To run these tests you can use the following commands:

```
# Start the test server (can continue running)
node scripts/functional_tests_server.js --config test/examples/config.js

# Start a test run
node scripts/functional_test_runner.js --config test/examples/config.js
```

### Checklist

- [x] [Unit or functional tests](https://www.elastic.co/guide/en/kibana/master/development-tests.html) were updated or added to match the most common scenarios
- [x] This was checked for breaking API changes and was [labeled appropriately](https://www.elastic.co/guide/en/kibana/master/contributing.html#kibana-release-notes-process)
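To make the generic slice idea above concrete, here is a rough illustration only, not the actual `streamSlice` exported by `@kbn/ml-response-stream` (its name, state shape, and reducers are assumptions): a plain Redux Toolkit slice whose action objects could be serialized to NDJSON on the server and dispatched unchanged on the client.

```ts
// Hypothetical sketch of a stream-oriented Redux Toolkit slice; the real
// streamSlice in @kbn/ml-response-stream may look different.
import { createSlice, type PayloadAction } from '@reduxjs/toolkit';

interface StreamState {
  progress: number;
  entries: string[];
}

const initialState: StreamState = { progress: 0, entries: [] };

export const exampleStreamSlice = createSlice({
  name: 'exampleStream',
  initialState,
  reducers: {
    // Each dispatched action is a plain serializable object, so the server can
    // push it as one NDJSON line and the client can dispatch it as-is.
    updateProgress(state, action: PayloadAction<number>) {
      state.progress = action.payload;
    },
    addEntry(state, action: PayloadAction<string>) {
      state.entries.push(action.payload);
    },
    reset() {
      return initialState;
    },
  },
});

export const { updateProgress, addEntry, reset } = exampleStreamSlice.actions;
```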
Summary
Part of #136265.
Follow up to #132121.
To run Kibana with the described examples, use `yarn start --run-examples`. This creates a `response_stream` plugin in the Kibana `/examples` section. The plugin demonstrates API endpoints that can stream data chunks with a single request, with gzip compression support; gzip streams get decompressed natively by browsers. The plugin demonstrates two use cases to get started: streaming a raw string as well as a more complex example that streams Redux-like actions to the client, which update React state via `useReducer()`.

Code in `@kbn/aiops-utils` contains helpers to set up a stream on the server side (`streamFactory()`) and consume it on the client side via a custom hook (`useFetchStream()`). The utilities make use of TS generics in a way that allows type safety for both the request related options and the returned data. No additional third party libraries are used in the helpers. On the server, they integrate with Hapi and use Node's own `gzip`. On the client, the custom hook abstracts away the necessary logic to consume the stream; internally it makes use of a generator function and `useReducer()` to update React state.

On the server, the simpler stream to send a string is set up like this:
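A minimal sketch of such a route handler follows; it assumes `streamFactory()` returns `push`, `end`, and `responseWithHeaders`, and the route path, import paths, and sample text are illustrative rather than verbatim plugin code.

```ts
// Sketch: a Kibana server route that streams a string in small chunks.
import type { IRouter } from '@kbn/core/server';
import { streamFactory } from '@kbn/aiops-utils';

export const defineSimpleStringStreamRoute = (router: IRouter) => {
  router.post(
    { path: '/internal/response_stream/simple_string_stream', validate: false },
    async (context, request, response) => {
      // Pass the request headers on so the helper can decide whether gzip is supported.
      const { push, end, responseWithHeaders } = streamFactory(request.headers);

      const tokens = 'Streaming diplodocus walked the earth ...'.split(' ');

      // Push one token at a time with a small delay, then close the stream.
      const pushStreamUpdate = () => {
        setTimeout(() => {
          const token = tokens.shift();
          if (token !== undefined) {
            push(`${token} `);
            pushStreamUpdate();
          } else {
            end();
          }
        }, 100);
      };
      pushStreamUpdate();

      return response.ok(responseWithHeaders);
    }
  );
};
```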
The request's headers get passed on to automatically identify if compression is supported by the client.
On the client, the custom hook is used like this:
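Again a hedged sketch rather than the exact example component: the component name and the fields returned by `useFetchStream()` (`data`, `error`, `isRunning`, `start`, `cancel`) are assumptions derived from the description above.

```tsx
// Sketch: consuming the string stream in a React component via the custom hook.
import React from 'react';
import { useFetchStream } from '@kbn/aiops-utils';

export const SimpleStringStreamExample = ({ basePath }: { basePath: string }) => {
  // The hook handles fetching, decompression and incremental state updates internally.
  const { data, error, isRunning, start, cancel } = useFetchStream(
    `${basePath}/internal/response_stream/simple_string_stream`
  );

  return (
    <div>
      <button onClick={() => (isRunning ? cancel() : start())}>
        {isRunning ? 'Stop' : 'Start'} stream
      </button>
      <p>{data}</p>
      {error && <p>Error: {error}</p>}
    </div>
  );
};
```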
Checklist
For maintainers