[8.13](backport #38634) Remove references to min_events in bulk_max_size docs. #38637

Merged 1 commit on Mar 27, 2024.
6 changes: 2 additions & 4 deletions libbeat/outputs/elasticsearch/docs/elasticsearch.asciidoc
@@ -666,10 +666,8 @@ endif::[]

The maximum number of events to bulk in a single Elasticsearch bulk API index request. The default is 1600.

-Events can be collected into batches. When using the memory queue with `queue.mem.flush.min_events`
-set to a value greater than `1`, the maximum batch is the value of `queue.mem.flush.min_events`.
-{beatname_uc} will split batches read from the queue which are larger than `bulk_max_size` into
-multiple batches.
+Events can be collected into batches. {beatname_uc} will split batches read from the queue which are
+larger than `bulk_max_size` into multiple batches.

Specifying a larger batch size can improve performance by lowering the overhead of sending events.
However, big batch sizes can also increase processing times, which might result in
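The splitting behavior described in the new wording can be sketched with a minimal, illustrative Beats configuration. The values below are examples only, not recommendations, and the comment arithmetic assumes the queue hands the output a full batch:

```yaml
# Illustrative sketch: how bulk_max_size interacts with the memory queue.
output.elasticsearch:
  hosts: ["localhost:9200"]
  bulk_max_size: 1600   # at most 1600 events per bulk API request (the default)

queue.mem:
  events: 4096          # queue capacity; a 4096-event batch read from the
                        # queue would be split into ceil(4096 / 1600) = 3
                        # bulk requests
```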
6 changes: 2 additions & 4 deletions libbeat/outputs/logstash/docs/logstash.asciidoc
@@ -381,10 +381,8 @@ endif::[]

The maximum number of events to bulk in a single {ls} request. The default is 2048.

-Events can be collected into batches. When using the memory queue with `queue.mem.flush.min_events`
-set to a value greater than `1`, the maximum batch is the value of `queue.mem.flush.min_events`.
-{beatname_uc} will split batches read from the queue which are larger than `bulk_max_size` into
-multiple batches.
+Events can be collected into batches. {beatname_uc} will split batches read from the queue which are
+larger than `bulk_max_size` into multiple batches.

Specifying a larger batch size can improve performance by lowering the overhead of sending events.
However, big batch sizes can also increase processing times, which might result in
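For the {ls} output, the same splitting applies against its own default. A hedged sketch, with an illustrative host value:

```yaml
output.logstash:
  hosts: ["localhost:5044"]
  bulk_max_size: 2048   # at most 2048 events per {ls} request (the default);
                        # larger batches read from the queue are split
```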
6 changes: 2 additions & 4 deletions libbeat/outputs/redis/docs/redis.asciidoc
@@ -216,10 +216,8 @@ endif::[]

The maximum number of events to bulk in a single Redis request or pipeline. The default is 2048.

-Events can be collected into batches. When using the memory queue with `queue.mem.flush.min_events`
-set to a value greater than `1`, the maximum batch is the value of `queue.mem.flush.min_events`.
-{beatname_uc} will split batches read from the queue which are larger than `bulk_max_size` into
-multiple batches.
+Events can be collected into batches. {beatname_uc} will split batches read from the queue which are
+larger than `bulk_max_size` into multiple batches.

Specifying a larger batch size can improve performance by lowering the overhead
of sending events. However, big batch sizes can also increase processing times,
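The Redis output follows the same pattern. A minimal, illustrative sketch (example host only):

```yaml
output.redis:
  hosts: ["localhost:6379"]
  bulk_max_size: 2048   # at most 2048 events per Redis request or pipeline
                        # (the default); larger queue batches are split
```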