Filebeat: Add support for ISO8601 dates to system.auth (elastic#12579)
Follow-up of elastic#12568 for system/auth.

(cherry picked from commit 0b559ff)
adriansr committed Jun 18, 2019
1 parent 5366e66 commit 820c132
Showing 5 changed files with 82 additions and 25 deletions.
23 changes: 23 additions & 0 deletions CHANGELOG.next.asciidoc
@@ -59,6 +59,29 @@ https://github.com/elastic/beats/compare/v7.2.0...7.2[Check the HEAD diff]
*Filebeat*

- Add full ISO8601 date parsing support for system/syslog module. {pull}12568[12568]
- Add more info to message logged when a duplicated symlink file is found {pull}10845[10845]
- Add option to configure docker input with paths {pull}10687[10687]
- Add Netflow module to enrich flow events with geoip data. {pull}10877[10877]
- Set `event.category: network_traffic` for Suricata. {pull}10882[10882]
- Allow custom default settings with autodiscover (for example, use of CRI paths for logs). {pull}12193[12193]
- Allow to disable hints based autodiscover default behavior (fetching all logs). {pull}12193[12193]
- Change Suricata module pipeline to handle `destination.domain` being set if a reverse DNS processor is used. {issue}10510[10510]
- Add the `network.community_id` flow identifier to field to the IPTables, Suricata, and Zeek modules. {pull}11005[11005]
- New Filebeat coredns module to ingest coredns logs. It supports both native coredns deployment and coredns deployment in kubernetes. {pull}11200[11200]
- New module for Cisco ASA logs. {issue}9200[9200] {pull}11171[11171]
- Added support for Cisco ASA fields to the netflow input. {pull}11201[11201]
- Configurable line terminator. {pull}11015[11015]
- Add Filebeat envoyproxy module. {pull}11700[11700]
- Add apache2(httpd) log path (`/var/log/httpd`) to make apache2 module work out of the box on Redhat-family OSes. {issue}11887[11887] {pull}11888[11888]
- Add support to new MongoDB additional diagnostic information {pull}11952[11952]
- New module `panw` for Palo Alto Networks PAN-OS logs. {pull}11999[11999]
- Add RabbitMQ module. {pull}12032[12032]
- Add new `container` input. {pull}12162[12162]
- Add timeouts on communication with docker daemon. {pull}12310[12310]
- `container` and `docker` inputs now support reading of labels and env vars written by docker JSON file logging driver. {issue}8358[8358]
- Add specific date processor to convert timezones so same pipeline can be used when convert_timezone is enabled or disabled. {pull}12253[12253]
- Add MSSQL module {pull}12079[12079]
- Add ISO8601 date parsing support for system module. {pull}12568[12568] {pull}12579[12579]

*Heartbeat*

20 changes: 11 additions & 9 deletions filebeat/module/system/auth/ingest/pipeline.json
@@ -6,16 +6,17 @@
         "field": "message",
         "ignore_missing": true,
         "pattern_definitions" : {
-          "GREEDYMULTILINE" : "(.|\n)*"
+          "GREEDYMULTILINE" : "(.|\n)*",
+          "TIMESTAMP": "(?:%{TIMESTAMP_ISO8601}|%{SYSLOGTIMESTAMP})"
         },
         "patterns": [
-          "%{SYSLOGTIMESTAMP:system.auth.timestamp} %{SYSLOGHOST:host.hostname} %{DATA:process.name}(?:\\[%{POSINT:process.pid:long}\\])?: %{DATA:system.auth.ssh.event} %{DATA:system.auth.ssh.method} for (invalid user )?%{DATA:user.name} from %{IPORHOST:source.ip} port %{NUMBER:source.port:long} ssh2(: %{GREEDYDATA:system.auth.ssh.signature})?",
-          "%{SYSLOGTIMESTAMP:system.auth.timestamp} %{SYSLOGHOST:host.hostname} %{DATA:process.name}(?:\\[%{POSINT:process.pid:long}\\])?: %{DATA:system.auth.ssh.event} user %{DATA:user.name} from %{IPORHOST:source.ip}",
-          "%{SYSLOGTIMESTAMP:system.auth.timestamp} %{SYSLOGHOST:host.hostname} %{DATA:process.name}(?:\\[%{POSINT:process.pid:long}\\])?: Did not receive identification string from %{IPORHOST:system.auth.ssh.dropped_ip}",
-          "%{SYSLOGTIMESTAMP:system.auth.timestamp} %{SYSLOGHOST:host.hostname} %{DATA:process.name}(?:\\[%{POSINT:process.pid:long}\\])?: \\s*%{DATA:user.name} :( %{DATA:system.auth.sudo.error} ;)? TTY=%{DATA:system.auth.sudo.tty} ; PWD=%{DATA:system.auth.sudo.pwd} ; USER=%{DATA:system.auth.sudo.user} ; COMMAND=%{GREEDYDATA:system.auth.sudo.command}",
-          "%{SYSLOGTIMESTAMP:system.auth.timestamp} %{SYSLOGHOST:host.hostname} %{DATA:process.name}(?:\\[%{POSINT:process.pid:long}\\])?: new group: name=%{DATA:group.name}, GID=%{NUMBER:group.id}",
-          "%{SYSLOGTIMESTAMP:system.auth.timestamp} %{SYSLOGHOST:host.hostname} %{DATA:process.name}(?:\\[%{POSINT:process.pid:long}\\])?: new user: name=%{DATA:user.name}, UID=%{NUMBER:user.id}, GID=%{NUMBER:group.id}, home=%{DATA:system.auth.useradd.home}, shell=%{DATA:system.auth.useradd.shell}$",
-          "%{SYSLOGTIMESTAMP:system.auth.timestamp} %{SYSLOGHOST:host.hostname}? %{DATA:process.name}(?:\\[%{POSINT:process.pid:long}\\])?: %{GREEDYMULTILINE:system.auth.message}"
+          "%{TIMESTAMP:system.auth.timestamp} %{SYSLOGHOST:host.hostname} %{DATA:process.name}(?:\\[%{POSINT:process.pid:long}\\])?: %{DATA:system.auth.ssh.event} %{DATA:system.auth.ssh.method} for (invalid user )?%{DATA:user.name} from %{IPORHOST:source.ip} port %{NUMBER:source.port:long} ssh2(: %{GREEDYDATA:system.auth.ssh.signature})?",
+          "%{TIMESTAMP:system.auth.timestamp} %{SYSLOGHOST:host.hostname} %{DATA:process.name}(?:\\[%{POSINT:process.pid:long}\\])?: %{DATA:system.auth.ssh.event} user %{DATA:user.name} from %{IPORHOST:source.ip}",
+          "%{TIMESTAMP:system.auth.timestamp} %{SYSLOGHOST:host.hostname} %{DATA:process.name}(?:\\[%{POSINT:process.pid:long}\\])?: Did not receive identification string from %{IPORHOST:system.auth.ssh.dropped_ip}",
+          "%{TIMESTAMP:system.auth.timestamp} %{SYSLOGHOST:host.hostname} %{DATA:process.name}(?:\\[%{POSINT:process.pid:long}\\])?: \\s*%{DATA:user.name} :( %{DATA:system.auth.sudo.error} ;)? TTY=%{DATA:system.auth.sudo.tty} ; PWD=%{DATA:system.auth.sudo.pwd} ; USER=%{DATA:system.auth.sudo.user} ; COMMAND=%{GREEDYDATA:system.auth.sudo.command}",
+          "%{TIMESTAMP:system.auth.timestamp} %{SYSLOGHOST:host.hostname} %{DATA:process.name}(?:\\[%{POSINT:process.pid:long}\\])?: new group: name=%{DATA:group.name}, GID=%{NUMBER:group.id}",
+          "%{TIMESTAMP:system.auth.timestamp} %{SYSLOGHOST:host.hostname} %{DATA:process.name}(?:\\[%{POSINT:process.pid:long}\\])?: new user: name=%{DATA:user.name}, UID=%{NUMBER:user.id}, GID=%{NUMBER:group.id}, home=%{DATA:system.auth.useradd.home}, shell=%{DATA:system.auth.useradd.shell}$",
+          "%{TIMESTAMP:system.auth.timestamp} %{SYSLOGHOST:host.hostname}? %{DATA:process.name}(?:\\[%{POSINT:process.pid:long}\\])?: %{GREEDYMULTILINE:system.auth.message}"
         ]
       }
     },
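The new `TIMESTAMP` pattern definition simply alternates the stock `TIMESTAMP_ISO8601` and `SYSLOGTIMESTAMP` grok patterns, so each rewritten pattern accepts either timestamp style. A rough Python equivalent (the regexes below are simplified hand-expansions for illustration, not the exact grok definitions from logstash-patterns-core) shows the alternation against lines like the test fixtures:

```python
import re

# Simplified hand-expansions of the grok patterns; the real definitions
# are more permissive than these sketches.
TIMESTAMP_ISO8601 = r"\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}(?:\.\d+)?(?:Z|[+-]\d{2}:?\d{2})?"
SYSLOGTIMESTAMP = r"[A-Z][a-z]{2} {1,2}\d{1,2} \d{2}:\d{2}:\d{2}"

# Equivalent of: "TIMESTAMP": "(?:%{TIMESTAMP_ISO8601}|%{SYSLOGTIMESTAMP})"
TIMESTAMP = re.compile(rf"^(?:{TIMESTAMP_ISO8601}|{SYSLOGTIMESTAMP})")

for line in [
    "2019-06-14T10:40:20.912134 localhost sudo: ...",   # ISO8601, microseconds
    "2019-06-14T13:01:15.412+01:30 localhost pam: ...",  # ISO8601, UTC offset
    "Jun 14 10:40:20 localhost sudo: ...",               # classic syslog
]:
    m = TIMESTAMP.match(line)
    print(m.group(0) if m else None)
```

Both ISO8601 variants and the classic syslog prefix match, which is why a single pattern set can now serve all three log styles.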
@@ -44,7 +45,8 @@
       "target_field": "@timestamp",
       "formats": [
         "MMM d HH:mm:ss",
-        "MMM dd HH:mm:ss"
+        "MMM dd HH:mm:ss",
+        "ISO8601"
       ],
       {< if .convert_timezone >}"timezone": "{{ event.timezone }}",{< end >}
       "ignore_failure": true
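The `ISO8601` entry added to the date processor's `formats` array is tried after the two syslog formats; Elasticsearch walks the list in order until one format parses. A minimal Python sketch of that try-in-order behavior (using `strptime`/`fromisoformat` as stand-ins for the Java-time formats, with a hypothetical fixed year since the syslog formats carry none):

```python
from datetime import datetime

# Stand-in for "MMM d HH:mm:ss" / "MMM dd HH:mm:ss"; strptime's %d covers
# both. The assumed year is an illustration detail, not Beats behavior.
SYSLOG_FORMATS = ["%b %d %H:%M:%S"]
ASSUMED_YEAR = 2019

def parse_timestamp(value):
    """Try each configured format in order, like the ingest date processor."""
    for fmt in SYSLOG_FORMATS:
        try:
            return datetime.strptime(f"{ASSUMED_YEAR} {value}", f"%Y {fmt}")
        except ValueError:
            pass
    # Fall through to ISO8601, the format this commit adds.
    return datetime.fromisoformat(value)

print(parse_timestamp("Jun 14 10:40:20"))
print(parse_timestamp("2019-06-14T13:01:15.412+01:30"))
```

Because ISO8601 is last in the list, existing syslog-style logs keep parsing exactly as before; only lines the syslog formats reject reach the new branch.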
2 changes: 2 additions & 0 deletions filebeat/module/system/auth/test/timestamp.log
@@ -0,0 +1,2 @@
2019-06-14T10:40:20.912134 localhost sudo: pam_unix(sudo-i:session): session opened for user root by userauth3(uid=0)
2019-06-14T13:01:15.412+01:30 localhost pam: user nobody logged out.
30 changes: 30 additions & 0 deletions filebeat/module/system/auth/test/timestamp.log-expected.json
@@ -0,0 +1,30 @@
[
{
"@timestamp": "2019-06-14T10:40:20.912Z",
"event.dataset": "system.auth",
"event.module": "system",
"event.timezone": "+00:00",
"fileset.name": "auth",
"host.hostname": "localhost",
"input.type": "log",
"log.file.path": "timestamp.log",
"log.offset": 0,
"message": "pam_unix(sudo-i:session): session opened for user root by userauth3(uid=0)",
"process.name": "sudo",
"service.type": "system"
},
{
"@timestamp": "2019-06-14T11:31:15.412Z",
"event.dataset": "system.auth",
"event.module": "system",
"event.timezone": "+00:00",
"fileset.name": "auth",
"host.hostname": "localhost",
"input.type": "log",
"log.file.path": "timestamp.log",
"log.offset": 118,
"message": "user nobody logged out.",
"process.name": "pam",
"service.type": "system"
}
]
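The second expected event shows the point of the new fixture: the offset-aware timestamp `13:01:15.412+01:30` must be normalized to `11:31:15.412Z` in UTC. The arithmetic can be checked directly:

```python
from datetime import datetime, timezone

# Second line of timestamp.log carries an explicit +01:30 offset.
local = datetime.fromisoformat("2019-06-14T13:01:15.412+01:30")
utc = local.astimezone(timezone.utc)
print(utc.isoformat())  # → 2019-06-14T11:31:15.412000+00:00
```

Subtracting the 1 h 30 m offset yields exactly the `@timestamp` value recorded in the expected JSON.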
32 changes: 16 additions & 16 deletions filebeat/tests/system/test_modules.py
@@ -216,6 +216,13 @@ def clean_keys(obj):
     time_keys = ["event.created"]
     # source path and agent.version can be different for each run
     other_keys = ["log.file.path", "agent.version"]
+    # datasets for which @timestamp is removed due to date missing
+    remove_timestamp = {"icinga.startup", "redis.log", "haproxy.log", "system.auth", "system.syslog"}
+    # dataset + log file pairs for which @timestamp is kept as an exception from above
+    remove_timestamp_exception = {
+        ('system.syslog', 'tz-offset.log'),
+        ('system.auth', 'timestamp.log')
+    }
 
     # Keep source log filename for exceptions
     filename = None
@@ -225,23 +232,16 @@ def clean_keys(obj):
     for key in host_keys + time_keys + other_keys:
         delete_key(obj, key)
 
-    # Remove timestamp for comparison where timestamp is not part of the log line
-    if (obj["event.dataset"] in ["icinga.startup", "redis.log", "haproxy.log", "system.auth"]):
-        delete_key(obj, "@timestamp")
-
-    # HACK: This keeps @timestamp for the tz-offset.log in system.syslog.
-    #
-    # This can't be done for all syslog logs because most of them lack the year
-    # in their timestamp, so Elasticsearch will set it to the current year and
-    # that will cause the tests to fail every new year.
-    #
-    # The log.file.path key needs to be kept so that it is stored in the golden
-    # data, to prevent @timestamp to be removed from it before comparison.
-    if obj["event.dataset"] == "system.syslog":
-        if filename == "tz-offset.log":
-            obj["log.file.path"] = filename
-        else:
-            delete_key(obj, "@timestamp")
+    # Most logs from syslog need their timestamp removed because it doesn't
+    # include a year.
+    if obj["event.dataset"] in remove_timestamp:
+        if not (obj['event.dataset'], filename) in remove_timestamp_exception:
+            delete_key(obj, "@timestamp")
+        else:
+            # excluded events need to have their filename saved to the expected.json
+            # so that the exception mechanism can be triggered when the json is
+            # loaded.
+            obj["log.file.path"] = filename
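Reduced to its decision core, the rewritten cleanup rule is a set lookup plus an exception set. A standalone sketch of that rule (hypothetical helper, not the actual test harness):

```python
# Datasets whose logs usually lack a year, so @timestamp is unstable.
remove_timestamp = {"icinga.startup", "redis.log", "haproxy.log",
                    "system.auth", "system.syslog"}
# (dataset, filename) pairs whose fixtures carry full dates, so
# @timestamp can be compared against the golden data.
remove_timestamp_exception = {
    ("system.syslog", "tz-offset.log"),
    ("system.auth", "timestamp.log"),
}

def should_drop_timestamp(dataset, filename):
    """@timestamp is dropped for year-less datasets unless the file is exempt."""
    return (dataset in remove_timestamp
            and (dataset, filename) not in remove_timestamp_exception)

print(should_drop_timestamp("system.auth", "secure.log"))     # dropped
print(should_drop_timestamp("system.auth", "timestamp.log"))  # kept
```

Adding a new full-date fixture for any module then only requires a new pair in the exception set, instead of another dataset-specific branch.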


def delete_key(obj, key):
