Describe the bug
Logstash is pushing logs to OpenSearch with uneven batch sizes: in the debug logs below, action_count swings from a handful of events up to 125 within the same second.
I am using the default pipeline.batch.size and pipeline.workers settings.
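For reference, these are the stock values I believe I am running with (logstash.yml equivalents; worth double-checking against the docs for your Logstash version):

# logstash.yml defaults (assumption: stock values, not copied from my instance)
pipeline.batch.size: 125    # max events a worker buffers before handing a batch to the outputs
pipeline.batch.delay: 50    # ms to wait before flushing a partially filled batch
# pipeline.workers defaults to the number of host CPU cores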
Output configuration:
output {
  if "_jsonparsefailure" not in [tags] {
    opensearch {
      hosts => ["https://admin:[email protected]:9200"]
      index => "uat-app-log-%{+YYYY.MM.dd}"
      user => "xxx"
      password => "xxx"
      ssl => true
      ssl_certificate_verification => false
      sniffing => true
      sniffing_delay => 30
      sniffing_path => "/_nodes/data:true"
      target_bulk_bytes => 20000000
    }
  }
}
Debug Logs:
[2024-10-21T05:31:37,063][DEBUG][logstash.outputs.opensearch][main][e67881b8b7c301e7d63b09960cb80d7322ea5a5020e63ce9780f628394358289] Sending final bulk request for batch. {:action_count=>2, :payload_size=>2410, :content_length=>2410, :batch_offset=>0}
[2024-10-21T05:31:37,070][DEBUG][logstash.outputs.opensearch][main][b929b8d03a3787f0ae738eff4a4e20aa5b328a50c85b4b975d4f0449a0e44bdb] Sending final bulk request for batch. {:action_count=>98, :payload_size=>136636, :content_length=>136636, :batch_offset=>0}
[2024-10-21T05:31:37,057][DEBUG][logstash.outputs.opensearch][main][b929b8d03a3787f0ae738eff4a4e20aa5b328a50c85b4b975d4f0449a0e44bdb] Sending final bulk request for batch. {:action_count=>74, :payload_size=>103739, :content_length=>103739, :batch_offset=>0}
[2024-10-21T05:31:37,061][DEBUG][logstash.outputs.opensearch][main][b929b8d03a3787f0ae738eff4a4e20aa5b328a50c85b4b975d4f0449a0e44bdb] Sending final bulk request for batch. {:action_count=>67, :payload_size=>93933, :content_length=>93933, :batch_offset=>0}
[2024-10-21T05:31:37,098][DEBUG][logstash.outputs.opensearch][main][480ab6bfbcda240d793d62512345df39027250bde58c8527358f92c58c6a4afb] Sending final bulk request for batch. {:action_count=>14, :payload_size=>14581, :content_length=>14581, :batch_offset=>0}
[2024-10-21T05:31:37,074][DEBUG][logstash.outputs.opensearch][main][480ab6bfbcda240d793d62512345df39027250bde58c8527358f92c58c6a4afb] Sending final bulk request for batch. {:action_count=>93, :payload_size=>138313, :content_length=>138313, :batch_offset=>0}
[2024-10-21T05:31:37,071][DEBUG][logstash.outputs.opensearch][main][480ab6bfbcda240d793d62512345df39027250bde58c8527358f92c58c6a4afb] Sending final bulk request for batch. {:action_count=>7, :payload_size=>8302, :content_length=>8302, :batch_offset=>0}
[2024-10-21T05:31:37,071][DEBUG][logstash.outputs.opensearch][main][480ab6bfbcda240d793d62512345df39027250bde58c8527358f92c58c6a4afb] Sending final bulk request for batch. {:action_count=>8, :payload_size=>9847, :content_length=>9847, :batch_offset=>0}
[2024-10-21T05:31:37,074][DEBUG][logstash.outputs.opensearch][main][480ab6bfbcda240d793d62512345df39027250bde58c8527358f92c58c6a4afb] Sending final bulk request for batch. {:action_count=>125, :payload_size=>165378, :content_length=>165378, :batch_offset=>0}
[2024-10-21T05:31:37,093][DEBUG][logstash.outputs.opensearch][main][480ab6bfbcda240d793d62512345df39027250bde58c8527358f92c58c6a4afb] Sending final bulk request for batch. {:action_count=>51, :payload_size=>68384, :content_length=>68384, :batch_offset=>0}
[2024-10-21T05:31:37,098][DEBUG][logstash.outputs.opensearch][main][480ab6bfbcda240d793d62512345df39027250bde58c8527358f92c58c6a4afb] Sending final bulk request for batch. {:action_count=>58, :payload_size=>77009, :content_length=>77009, :batch_offset=>0}
[2024-10-21T05:31:37,103][DEBUG][logstash.outputs.opensearch][main][480ab6bfbcda240d793d62512345df39027250bde58c8527358f92c58c6a4afb] Sending final bulk request for batch. {:action_count=>125, :payload_size=>165540, :content_length=>165540, :batch_offset=>0}
[2024-10-21T05:31:37,116][DEBUG][logstash.outputs.opensearch][main][b929b8d03a3787f0ae738eff4a4e20aa5b328a50c85b4b975d4f0449a0e44bdb] Sending final bulk request for batch. {:action_count=>125, :payload_size=>175130, :content_length=>175130, :batch_offset=>0}
[2024-10-21T05:31:37,063][DEBUG][logstash.outputs.opensearch][main][b929b8d03a3787f0ae738eff4a4e20aa5b328a50c85b4b975d4f0449a0e44bdb] Sending final bulk request for batch. {:action_count=>44, :payload_size=>61732, :content_length=>61732, :batch_offset=>0}
[2024-10-21T05:31:37,113][DEBUG][logstash.outputs.opensearch][main][480ab6bfbcda240d793d62512345df39027250bde58c8527358f92c58c6a4afb] Sending final bulk request for batch. {:action_count=>125, :payload_size=>165220, :content_length=>165220, :batch_offset=>0}
Host/Environment (please complete the following information):
EKS version 1.30
I bet there's some time-based criteria in addition to batch size. Care to dig into the implementation to see how it's deciding to send the data? Documenting it would be a great start.
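For what it's worth, here is a rough Ruby sketch (an assumption about the general approach, not the plugin's actual source) of how a byte-capped slicer like target_bulk_bytes typically splits a pipeline batch into bulk requests:

# Hypothetical sketch of byte-capped bulk slicing -- not the plugin's real code.
TARGET_BULK_BYTES = 20_000_000  # mirrors target_bulk_bytes from the config above

def slice_into_bulks(serialized_actions, target_bytes = TARGET_BULK_BYTES)
  bulks = []
  current = []
  current_bytes = 0
  serialized_actions.each do |action|
    # start a new bulk request once adding this action would exceed the byte cap
    if !current.empty? && current_bytes + action.bytesize > target_bytes
      bulks << current
      current = []
      current_bytes = 0
    end
    current << action
    current_bytes += action.bytesize
  end
  bulks << current unless current.empty?
  bulks
end

# Example: 125 actions of roughly 1.3 KB each stay far below the 20 MB cap,
# so they leave as a single bulk request the same size as the pipeline batch.
batch = Array.new(125) { "{\"index\":{}}\n" + "{\"message\":\"" + "x" * 1300 + "\"}\n" }
puts slice_into_bulks(batch).map(&:length).inspect  # => [125]

If that matches the real implementation, the 20 MB cap never triggers for payloads of 100-175 KB, so the varying action_count would come from the pipeline itself: batches are capped at pipeline.batch.size (125) and flushed early after pipeline.batch.delay elapses, which would explain both the 125-event ceiling and the tiny batches in the logs above.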