JSON parsing of dockerized logs is not working properly #617
Comments
This is how MongoDB logs look: {"log":"2018-06-05T07:35:08.730+0000 I CONTROL [initandlisten] ** WARNING: /sys/kernel/mm/transparent_hugepage/defrag is 'always'.\n","stream":"stdout","time":"2018-06-05T07:35:08.73067826Z"} This means the default parser you supplied won't work properly.
I tried this parser: But it's not working :(
I also tried to add a parser filter, but it's not working.
I had the same problem. I'm now able to use the following parser, but I needed to use 0.13.2. The escaped_utf8 decoder wasn't in 0.13.0.
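The commenter's actual parser config was not captured in this scrape, so it has to stay elided. For illustration only, a Fluent Bit docker parser of the kind its decoders support (the field names, time format, and decoder chain below are assumptions, not the commenter's exact config) could look like:

```
[PARSER]
    Name         docker
    Format       json
    Time_Key     time
    Time_Format  %Y-%m-%dT%H:%M:%S.%L
    Time_Keep    On
    # First undo the escaped UTF-8, then try to parse the log field as JSON
    Decode_Field_As  escaped_utf8  log  do_next
    Decode_Field_As  json          log
```

The `escaped_utf8` decoder referenced here is the one the commenter says was only added in 0.13.2.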
@kyleroot I'm using 0.13.2, but my problem isn't the UTF-8; it's that I have unescaped JSON inside the log field and I can't get this to work. I had to send it via HTTP to Logstash and extract the JSON from there.
You're right, some of my logs have JSON within the already-JSON Docker logs. The inner JSON is not being unescaped correctly. This is still a problem in 0.13.4.
As far as I understand, the issue is still present in 0.14.7?
Please check the following comment on #1278:
Issue already fixed, ref: #1278 (comment)
Hey,
I have an app that writes JSON to stdout. Docker saves the logs like this:
{"log":"{\"time\":\"14:09:26\",\"rel_time\":7197,\"loglevel\":\"INFO\",\"context\":\"http\",\"module\":\"io.ssl_ctx\",\"event_tag\":\"\",\"message\":\"loading cert cache dir\",\"dir\":\"\u002Fopt\u002Fvaultive\u002Fvar\u002Fssl\"}\n","stream":"stdout","time":"2018-06-05T14:09:26.572091825Z"}
I use the JSON parser on this input. I then tried to apply the parser filter to parse the log field as JSON, but it won't work since the data isn't proper JSON (Docker encoded my JSON as an escaped string inside its own JSON). How can I make this work and get my JSON fields back? Do you have some sort of pre-parser JSON decoding filter?
Thanks.
I'm using the default docker parser from the examples.
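The nesting described above can be unwrapped by parsing twice: once for Docker's json-file wrapper object, and once for the escaped JSON string held in its log field. A minimal Python sketch of that idea (the sample line is shortened from the one in this issue, with the inner quotes backslash-escaped the way the json-file driver writes them):

```python
import json

# A shortened sample line from Docker's json-file log driver.
# The application's own JSON sits inside "log" as an escaped string.
raw = ('{"log":"{\\"time\\":\\"14:09:26\\",\\"loglevel\\":\\"INFO\\",'
       '\\"message\\":\\"loading cert cache dir\\"}\\n",'
       '"stream":"stdout","time":"2018-06-05T14:09:26.572091825Z"}')

outer = json.loads(raw)           # first pass: Docker's wrapper object
inner = json.loads(outer["log"])  # second pass: the application's JSON

print(inner["message"])  # → loading cert cache dir
```

This is exactly what a "decode the log field as JSON" step in a log pipeline has to do internally; if the parser only runs the first pass, the inner fields stay buried in one escaped string.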