[8.1](backport #30529) Add parsers examples to filestream reference configuration #30537

Merged: 1 commit, Feb 22, 2022
95 changes: 95 additions & 0 deletions filebeat/_meta/config/filebeat.inputs.reference.yml.tmpl
@@ -293,6 +293,101 @@ filebeat.inputs:
# original for harvesting but will report the symlink name as source.
#prospector.scanner.symlinks: false

### Parsers configuration

#### JSON configuration

#parsers:
#- ndjson:
# Decode JSON options. Enable this if your logs are structured in JSON.
# JSON key on which to apply the line filtering and multiline settings. This key
# must be top level and its value must be a string, otherwise it is ignored. If
# no text key is defined, the line filtering and multiline features cannot be used.
#message_key:

# By default, the decoded JSON is placed under a "json" key in the output document.
# If you enable this setting, the keys are copied to the top level of the output document.
#keys_under_root: false

# If keys_under_root and this setting are enabled, then the values from the decoded
# JSON object overwrite the fields that Filebeat normally adds (type, source, offset, etc.)
# in case of conflicts.
#overwrite_keys: false

# If this setting is enabled, then keys in the decoded JSON object will be recursively
# de-dotted, and expanded into a hierarchical object structure.
# For example, `{"a.b.c": 123}` would be expanded into `{"a":{"b":{"c":123}}}`.
#expand_keys: false

# If this setting is enabled, Filebeat adds "error.message" and "error.key: json" keys in case of JSON
# unmarshaling errors or when a text key is defined in the configuration but cannot
# be used.
#add_error_key: false
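
# As an illustrative sketch only (the "message" key is an assumption about
# your log format, not a shipped default), an enabled ndjson parser could
# look like this:
#parsers:
#- ndjson:
#    message_key: message
#    keys_under_root: true
#    overwrite_keys: true
#    add_error_key: true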

#### Multiline options

# Multiline can be used for log messages spanning multiple lines. This is common
# for Java stack traces or C line continuations.

#parsers:
#- multiline:
#type: pattern
# The regexp pattern that has to be matched. The example pattern matches all lines starting with "[".
#pattern: ^\[

# Defines if the pattern set under the pattern setting should be negated or not. Default is false.
#negate: false

# Match can be set to "after" or "before". It is used to define if lines should be appended to a pattern
# that was (not) matched before or after, or as long as a pattern is not matched, based on negate.
# Note: "after" is the equivalent of "previous" and "before" is the equivalent of "next" in Logstash.
#match: after

# The maximum number of lines that are combined into one event.
# In case there are more than max_lines, the additional lines are discarded.
# Default is 500.
#max_lines: 500

# After the defined timeout, a multiline event is sent even if no new pattern was found to start a new event.
# Default is 5s.
#timeout: 5s

# Do not add a newline character when concatenating lines.
#skip_newline: false
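
# For example, building on the "^\[" pattern above, the following sketch treats
# every line that does not start with "[" as a continuation of the previous
# event, which suits bracketed timestamps followed by stack trace lines:
#parsers:
#- multiline:
#    type: pattern
#    pattern: '^\['
#    negate: true
#    match: after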

# To aggregate a constant number of lines into a single event, use the count mode of multiline.

#parsers:
#- multiline:
#type: count

# The number of lines to aggregate into a single event.
#count_lines: 3

# The maximum number of lines that are combined into one event.
# In case there are more than max_lines, the additional lines are discarded.
# Default is 500.
#max_lines: 500

# After the defined timeout, a multiline event is sent even if no new pattern was found to start a new event.
# Default is 5s.
#timeout: 5s

# Do not add a newline character when concatenating lines.
#skip_newline: false
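
# For example, if every record is known to span exactly three physical lines,
# a count-mode sketch could look like this:
#parsers:
#- multiline:
#    type: count
#    count_lines: 3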

#### Parsing container events

# You can parse container events with different formats from all streams.

#parsers:
#- container:
# Source of container events. Available options: all, stdout, stderr.
#stream: all

# Format of the container events. Available options: auto, cri, docker, json-file.
#format: auto
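
# For example, a sketch that keeps only stdout from CRI-formatted container
# logs (the stream and format values are assumptions about your runtime):
#parsers:
#- container:
#    stream: stdout
#    format: cri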

### Log rotation

# When an external tool rotates the input files with copytruncate strategy
95 changes: 95 additions & 0 deletions filebeat/filebeat.reference.yml
@@ -700,6 +700,101 @@ filebeat.inputs:
# original for harvesting but will report the symlink name as source.
#prospector.scanner.symlinks: false

### Parsers configuration

#### JSON configuration

#parsers:
#- ndjson:
# Decode JSON options. Enable this if your logs are structured in JSON.
# JSON key on which to apply the line filtering and multiline settings. This key
# must be top level and its value must be a string, otherwise it is ignored. If
# no text key is defined, the line filtering and multiline features cannot be used.
#message_key:

# By default, the decoded JSON is placed under a "json" key in the output document.
# If you enable this setting, the keys are copied to the top level of the output document.
#keys_under_root: false

# If keys_under_root and this setting are enabled, then the values from the decoded
# JSON object overwrite the fields that Filebeat normally adds (type, source, offset, etc.)
# in case of conflicts.
#overwrite_keys: false

# If this setting is enabled, then keys in the decoded JSON object will be recursively
# de-dotted, and expanded into a hierarchical object structure.
# For example, `{"a.b.c": 123}` would be expanded into `{"a":{"b":{"c":123}}}`.
#expand_keys: false

# If this setting is enabled, Filebeat adds "error.message" and "error.key: json" keys in case of JSON
# unmarshaling errors or when a text key is defined in the configuration but cannot
# be used.
#add_error_key: false

#### Multiline options

# Multiline can be used for log messages spanning multiple lines. This is common
# for Java stack traces or C line continuations.

#parsers:
#- multiline:
#type: pattern
# The regexp pattern that has to be matched. The example pattern matches all lines starting with "[".
#pattern: ^\[

# Defines if the pattern set under the pattern setting should be negated or not. Default is false.
#negate: false

# Match can be set to "after" or "before". It is used to define if lines should be appended to a pattern
# that was (not) matched before or after, or as long as a pattern is not matched, based on negate.
# Note: "after" is the equivalent of "previous" and "before" is the equivalent of "next" in Logstash.
#match: after

# The maximum number of lines that are combined into one event.
# In case there are more than max_lines, the additional lines are discarded.
# Default is 500.
#max_lines: 500

# After the defined timeout, a multiline event is sent even if no new pattern was found to start a new event.
# Default is 5s.
#timeout: 5s

# Do not add a newline character when concatenating lines.
#skip_newline: false

# To aggregate a constant number of lines into a single event, use the count mode of multiline.

#parsers:
#- multiline:
#type: count

# The number of lines to aggregate into a single event.
#count_lines: 3

# The maximum number of lines that are combined into one event.
# In case there are more than max_lines, the additional lines are discarded.
# Default is 500.
#max_lines: 500

# After the defined timeout, a multiline event is sent even if no new pattern was found to start a new event.
# Default is 5s.
#timeout: 5s

# Do not add a newline character when concatenating lines.
#skip_newline: false

#### Parsing container events

# You can parse container events with different formats from all streams.

#parsers:
#- container:
# Source of container events. Available options: all, stdout, stderr.
#stream: all

# Format of the container events. Available options: auto, cri, docker, json-file.
#format: auto

### Log rotation

# When an external tool rotates the input files with copytruncate strategy
95 changes: 95 additions & 0 deletions x-pack/filebeat/filebeat.reference.yml
@@ -2759,6 +2759,101 @@ filebeat.inputs:
# original for harvesting but will report the symlink name as source.
#prospector.scanner.symlinks: false

### Parsers configuration

#### JSON configuration

#parsers:
#- ndjson:
# Decode JSON options. Enable this if your logs are structured in JSON.
# JSON key on which to apply the line filtering and multiline settings. This key
# must be top level and its value must be a string, otherwise it is ignored. If
# no text key is defined, the line filtering and multiline features cannot be used.
#message_key:

# By default, the decoded JSON is placed under a "json" key in the output document.
# If you enable this setting, the keys are copied to the top level of the output document.
#keys_under_root: false

# If keys_under_root and this setting are enabled, then the values from the decoded
# JSON object overwrite the fields that Filebeat normally adds (type, source, offset, etc.)
# in case of conflicts.
#overwrite_keys: false

# If this setting is enabled, then keys in the decoded JSON object will be recursively
# de-dotted, and expanded into a hierarchical object structure.
# For example, `{"a.b.c": 123}` would be expanded into `{"a":{"b":{"c":123}}}`.
#expand_keys: false

# If this setting is enabled, Filebeat adds "error.message" and "error.key: json" keys in case of JSON
# unmarshaling errors or when a text key is defined in the configuration but cannot
# be used.
#add_error_key: false

#### Multiline options

# Multiline can be used for log messages spanning multiple lines. This is common
# for Java stack traces or C line continuations.

#parsers:
#- multiline:
#type: pattern
# The regexp pattern that has to be matched. The example pattern matches all lines starting with "[".
#pattern: ^\[

# Defines if the pattern set under the pattern setting should be negated or not. Default is false.
#negate: false

# Match can be set to "after" or "before". It is used to define if lines should be appended to a pattern
# that was (not) matched before or after, or as long as a pattern is not matched, based on negate.
# Note: "after" is the equivalent of "previous" and "before" is the equivalent of "next" in Logstash.
#match: after

# The maximum number of lines that are combined into one event.
# In case there are more than max_lines, the additional lines are discarded.
# Default is 500.
#max_lines: 500

# After the defined timeout, a multiline event is sent even if no new pattern was found to start a new event.
# Default is 5s.
#timeout: 5s

# Do not add a newline character when concatenating lines.
#skip_newline: false

# To aggregate a constant number of lines into a single event, use the count mode of multiline.

#parsers:
#- multiline:
#type: count

# The number of lines to aggregate into a single event.
#count_lines: 3

# The maximum number of lines that are combined into one event.
# In case there are more than max_lines, the additional lines are discarded.
# Default is 500.
#max_lines: 500

# After the defined timeout, a multiline event is sent even if no new pattern was found to start a new event.
# Default is 5s.
#timeout: 5s

# Do not add a newline character when concatenating lines.
#skip_newline: false

#### Parsing container events

# You can parse container events with different formats from all streams.

#parsers:
#- container:
# Source of container events. Available options: all, stdout, stderr.
#stream: all

# Format of the container events. Available options: auto, cri, docker, json-file.
#format: auto

### Log rotation

# When an external tool rotates the input files with copytruncate strategy