logstash repeatedly processes a corrupted compressed file in an endless loop #261
Comments
Hi @xo4n, I wonder which data Logstash loads from a corrupted file. I mean, if the gzip file is corrupted you aren't able to decompress it, so no content should be loaded from it. I think that using the tail mode could probably help to avoid the continual failing, but do you have a shareable corrupted gzip that manifests the problem?
Thanks @xo4n for sharing the file, I was able to reproduce the problem locally. When reading a line from the gzipped input stream fails, only the cleanup code in logstash-input-file/lib/filewatch/read_mode/handlers/read_zip_file.rb (lines 43 to 49 in e9ed605) is executed. That code essentially closes the input resources but doesn't delete the file, nor mark it as processed, so the next discovery loop finds it again, processes it, and creates the same events as before, and so on (a simplified sketch of this error path is shown below).
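For illustration, here is a minimal sketch of that error path in plain Ruby. The real handler works through Java streams via JRuby, so the class, method, and registry names below are hypothetical, not the plugin's actual API:

```ruby
require 'zlib'

# Hypothetical, simplified model of the read-mode gzip handling described
# above: on a decompression error it only closes its streams; it never
# marks the file as processed (and never deletes it), so the next
# discovery pass picks the same file up again.
class ReadGzipSketch
  def initialize(processed_registry)
    @processed = processed_registry # stands in for the sincedb
  end

  def handle(path)
    file_stream = File.open(path, 'rb')
    gzip_stream = Zlib::GzipReader.new(file_stream)
    gzip_stream.each_line { |line| yield line.chomp }
    # Only reached when the whole archive decompressed cleanly.
    @processed[path] = true
  rescue Zlib::Error, EOFError => e
    warn "Cannot decompress #{path}: #{e.message}"
    # No @processed[path] = true and no File.delete here -- which is
    # exactly why the same corrupted file keeps being re-read.
  ensure
    [gzip_stream, file_stream].each do |io|
      begin
        io.close if io && !io.closed?
      rescue IOError, Zlib::Error
        # ignore close failures; only resource cleanup happens here
      end
    end
  end
end

# Usage sketch (the path is an assumption): events emitted before the
# corruption point reach the block, then the error is swallowed and the
# file stays "unprocessed".
registry = {}
sample = '/tmp/broken.log.gz'
if File.exist?(sample)
  ReadGzipSketch.new(registry).handle(sample) { |line| puts line }
  puts "marked as processed? #{registry.key?(sample)}"
end
```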
There are two possible strategies to solve this:
…y check on archives, close logstash-plugins#261
solved with version
We had a case where several corrupted compressed files ended up in the input directory of the Logstash pipeline. Logstash then reads and processes some of the lines of each corrupted file, sends some corrupted data to the output, and throws an error before it finishes reading.
Because the read doesn't finish properly, the file is never marked as processed and is picked up again over and over. The result is a continuous stream of corrupted data being sent to the output.
The input configuration:
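As a point of reference, a minimal read-mode file input of the kind described might look like the sketch below; the paths and the file_completed_action choice are assumptions for illustration, not the reporter's actual settings:

```
input {
  file {
    # Read whole files once instead of tailing them; in this mode the
    # plugin decompresses gzip archives based on the .gz extension.
    mode => "read"
    path => "/data/incoming/*.gz"                       # assumed path
    # After a file is fully read, keep it on disk and record it here;
    # "delete" is the other common choice.
    file_completed_action => "log"
    file_completed_log_path => "/data/completed_files.log"
    sincedb_path => "/var/lib/logstash/plugins/inputs/file/sincedb_gz"
  }
}
```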
Tried with two different versions (7.5.2 and 7.1.1); the issue was reproducible in both cases.