[receiver/filelog] High memory consumption of collector and dropped logs #31129

Closed
marcinsiennicki95 opened this issue Feb 8, 2024 · 14 comments
Labels: bug (Something isn't working), needs triage (New item requiring triage), receiver/filelog

marcinsiennicki95 commented Feb 8, 2024

Component(s)

receiver/filelog

What happened?

Description

I am currently configuring the OpenTelemetry Collector to gather logs through the filelog receiver; these logs are then transmitted to Fluent Bit via the OpenTelemetry (otlphttp) exporter. Fluent Bit ingests the logs through its OpenTelemetry input and forwards them with its OpenTelemetry output. In this configuration, the OpenTelemetry Collector acts as a proxy for Fluent Bit, allowing it to collect and process data from various other telemetry sources.

Steps to Reproduce

  1. Set up the environment with Fluent Bit and the OpenTelemetry Collector installed
  2. Configure and start the OpenTelemetry Collector to gather logs from the test file using the filelog receiver and send them with the otlphttp exporter, without consuming more memory than the limit
  3. Configure and run Fluent Bit to receive logs using the OpenTelemetry input and forward them to the destination using the OpenTelemetry output
  4. Upload the prepared test file (~500 MB, ~1 million logs) to the configured folder to simulate high load
  5. Monitor the environment and validate the results in CloudPlatform

Expected Result

Every log should be processed, and no error logs should be visible on the console.

Actual Result

Without the memory_limiter processor, the number of logs displayed in the user interface does not add up to exactly 1 million, and the service consumes a lot of memory:

2024-02-08T11:58:49.791Z        error   exporterhelper/common.go:174    Exporting failed. Rejecting data.       {"kind": "exporter", "data_type": "logs", "name": "otlphttp", "error": "sending queue is full", "rejected_items": 100}
go.opentelemetry.io/collector/exporter/exporterhelper.(*baseExporter).send
        go.opentelemetry.io/collector/[email protected]/exporterhelper/common.go:174
go.opentelemetry.io/collector/exporter/exporterhelper.NewLogsExporter.func1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/logs.go:98
go.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs
        go.opentelemetry.io/collector/[email protected]/logs.go:25
go.opentelemetry.io/collector/internal/fanoutconsumer.(*logsConsumer).ConsumeLogs
        go.opentelemetry.io/[email protected]/internal/fanoutconsumer/logs.go:73
go.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs
        go.opentelemetry.io/collector/[email protected]/logs.go:25
github.com/open-telemetry/opentelemetry-collector-contrib/internal/coreinternal/consumerretry.(*logsConsumer).ConsumeLogs
        github.com/open-telemetry/opentelemetry-collector-contrib/internal/[email protected]/consumerretry/logs.go:66
github.com/open-telemetry/opentelemetry-collector-contrib/pkg/stanza/adapter.(*receiver).consumerLoop
        github.com/open-telemetry/opentelemetry-collector-contrib/pkg/[email protected]/adapter/receiver.go:125

With the memory limiter processor, see the Additional context section below.

Collector version

v0.93.0

Environment information

WindowsServer:2019-Datacenter:latest

OpenTelemetry Collector configuration

receivers:
  filelog:
    include: [ C:/test ]
    encoding: utf-16le
    include_file_path: true
    max_log_size: 1MiB
    storage: file_storage
    poll_interval: 10s
    retry_on_failure:
        enabled: true

exporters:
  logging:

  otlphttp:
    endpoint: http://127.0.0.1:4318
    tls:
      insecure: true

extensions:
  file_storage:
    directory: C:/collector
    fsync: true

service:
  extensions: [file_storage]
  pipelines:
    logs:
      receivers: [filelog]
      exporters: [logging, otlphttp]

or, alternatively, with the memory_limiter processor added:

processors:
  memory_limiter:
    check_interval: 1s
    limit_mib: 120
    spike_limit_mib: 80
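
Note that the memory_limiter only takes effect if it is also referenced in the pipeline; a minimal sketch of the service section with that wiring, assuming the same component names as above:

service:
  extensions: [file_storage]
  pipelines:
    logs:
      receivers: [filelog]
      processors: [memory_limiter]   # the limiter must be listed here to be active
      exporters: [logging, otlphttp]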

Log output

No response

Additional context

Log output with memory limiter: (screenshot attached)

Actual
The OpenTelemetry Collector sends only a limited number of logs before reaching the memory limit, which triggers garbage collection (GC) and stops further log processing. As a result, about 950,000 logs are never sent; only newly generated logs are sent. By comparison, Fluent Bit pauses log ingestion when its memory limit is reached and, after a short pause, resumes processing without losing logs.

FluentBit Configuration.

[INPUT]
    name opentelemetry
    tag_from_uri false
    tag test
    listen 127.0.0.1
    port 4318

    Mem_Buf_Limit 50MB

Does the OpenTelemetry Collector offer a parameter analogous to Fluent Bit's mem_buf_limit for managing memory buffer constraints?

marcinsiennicki95 added the bug (Something isn't working) and needs triage (New item requiring triage) labels on Feb 8, 2024

github-actions bot (Contributor) commented Feb 8, 2024

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

djaglowski (Member) commented:

At a glance this appears to be a back pressure problem but it's unclear where. Have you tried using the exporter's retry & queueing functionality to manage this? Example
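
For reference, a rough sketch of what that could look like on the otlphttp exporter used above; retry_on_failure and sending_queue are the exporterhelper settings being referred to, and the values below are illustrative assumptions rather than recommendations:

exporters:
  otlphttp:
    endpoint: http://127.0.0.1:4318
    tls:
      insecure: true
    retry_on_failure:
      enabled: true
      initial_interval: 5s      # illustrative backoff values
      max_interval: 30s
      max_elapsed_time: 300s
    sending_queue:
      enabled: true
      num_consumers: 10         # illustrative sizing
      queue_size: 5000          # larger queue so bursts are less likely to hit "sending queue is full"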

djaglowski (Member) commented:

cc: @dmitryax

marcinsiennicki95 (Author) commented Feb 9, 2024

@djaglowski Thanks for your reply. I tested your suggestion and it did not help.

I don't think I described the test scenario in enough detail, so here it is:

OpenTelemetry Collector configuration

receivers:
  filelog:
    include: [ C:/test]
    encoding: utf-16le
    include_file_path: true
    max_log_size: 1MiB
    storage: file_storage
    poll_interval: 10s
    retry_on_failure:
        enabled: true

exporters:
  logging:

  otlphttp:
    endpoint: http://127.0.0.1:4318
    tls:
      insecure: true

extensions:
  file_storage:
    directory: C:/collector
    fsync: true

processors:
  batch:
    send_batch_size: 1000
  memory_limiter:
    check_interval: 1s
    limit_mib: 250
    spike_limit_mib: 100

service:
  extensions: [file_storage]
  pipelines:
    logs:
      receivers: [filelog]
      processors: [memory_limiter,batch]
      exporters: [logging, otlphttp]

Log output

PS C:\Users\kCuraCloudAdmin> C:\\collector\\otelcol-contrib_0.93.0_windows_amd64\\otelcol-contrib.exe --config C:\\collector\\otelcol-contrib_0.93.0_windows_amd64\\config.yaml

2024-02-09T13:04:50.085Z        info    memorylimiter/memorylimiter.go:77       Memory limiter configured       {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "limit_mib": 250, "spike_limit_mib": 150, "check_interval": 1}
2024-02-09T13:04:50.086Z        info    [email protected]/service.go:139  Starting otelcol-contrib...     {"Version": "0.93.0", "NumCPU": 2}
2024-02-09T13:04:50.118Z        warn    fileconsumer/file.go:51 finding files: no files match the configured criteria   {"kind": "receiver", "name": "filelog", "data_type": "logs", "component": "fileconsumer"}
2024-02-09T13:05:10.298Z        info    fileconsumer/file.go:268        Started watching file   {"kind": "receiver", "name": "filelog", "data_type": "logs", "component": "fileconsumer", "path": "C:\\test"}
2024-02-09T13:05:10.303Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-09T13:05:10.304Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-09T13:05:11.227Z        info    memorylimiter/memorylimiter.go:222      Memory usage is above soft limit. Forcing a GC. {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 112}
2024-02-09T13:05:11.310Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-09T13:05:11.361Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-09T13:05:12.093Z        info    memorylimiter/memorylimiter.go:192      Memory usage after GC.  {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 194}
2024-02-09T13:05:12.107Z        warn    memorylimiter/memorylimiter.go:229      Memory usage is above soft limit. Refusing data.        {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 194}
2024-02-09T13:05:12.456Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 9, "log records": 819}
2024-02-09T13:07:13.100Z        info    memorylimiter/memorylimiter.go:215      Memory usage back within limits. Resuming normal operation.     {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 36}
2024-02-09T13:07:34.312Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-09T13:07:34.312Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-09T13:07:35.222Z        info    memorylimiter/memorylimiter.go:222      Memory usage is above soft limit. Forcing a GC. {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 166}
2024-02-09T13:07:35.328Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}

2024-02-09T13:07:35.430Z        info    memorylimiter/memorylimiter.go:192      Memory usage after GC.  {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 99}
2024-02-09T13:07:36.168Z        warn    memorylimiter/memorylimiter.go:229      Memory usage is above soft limit. Refusing data.        {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 211}
2024-02-09T13:07:36.466Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 4, "log records": 400}
2024-02-09T13:09:37.102Z        info    memorylimiter/memorylimiter.go:215      Memory usage back within limits. Resuming normal operation.     {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 36}
2024-02-09T13:10:14.145Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-09T13:10:14.155Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-09T13:10:15.150Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-09T13:10:15.158Z        info    memorylimiter/memorylimiter.go:222      Memory usage is above soft limit. Forcing a GC. {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 152}
2024-02-09T13:10:15.177Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-09T13:10:15.433Z        info    memorylimiter/memorylimiter.go:192      Memory usage after GC.  {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 110}
2024-02-09T13:10:15.624Z        warn    memorylimiter/memorylimiter.go:229      Memory usage is above soft limit. Refusing data.        {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 110}
2024-02-09T13:12:16.100Z        info    memorylimiter/memorylimiter.go:215      Memory usage back within limits. Resuming normal operation.     {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 35}
2024-02-09T13:12:27.063Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-09T13:12:27.073Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-09T13:12:28.087Z        info    memorylimiter/memorylimiter.go:222      Memory usage is above soft limit. Forcing a GC. {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 152}
2024-02-09T13:12:28.929Z        info    memorylimiter/memorylimiter.go:192      Memory usage after GC.  {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 188}
2024-02-09T13:12:28.949Z        warn    memorylimiter/memorylimiter.go:229      Memory usage is above soft limit. Refusing data.        {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 188}
2024-02-09T13:12:29.144Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 6, "log records": 600}
2024-02-09T13:14:29.099Z        info    memorylimiter/memorylimiter.go:215      Memory usage back within limits. Resuming normal operation.     {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 35}
2024-02-09T13:14:35.501Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 2, "log records": 37}

Result
Only 889k logs were processed. Upon reaching the soft limit, the receiver pauses for approximately 120 seconds before resuming processing of the remainder of the file. It appears that some logs are not returned to the receiver but are instead dropped, although according to the documentation they should be returned to the receiver for reprocessing.
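
One way to tell refused-and-retried data apart from dropped data is to scrape the Collector's own telemetry while the test runs; a minimal sketch of the service.telemetry section (the address is only an assumption, any free port works). Counters such as otelcol_processor_refused_log_records, otelcol_exporter_sent_log_records and otelcol_exporter_send_failed_log_records should then be comparable against the 1 million input lines:

service:
  telemetry:
    metrics:
      address: "0.0.0.0:8888"   # Prometheus endpoint exposing the Collector's internal metrics (assumed free port)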

marcinsiennicki95 (Author) commented Feb 9, 2024

Here is a second scenario where the collector is now working as expected

OpenTelemetry Collector configuration

receivers:
  filelog:
    include: [ C:/test ]
    encoding: utf-16le
    include_file_path: true
    max_log_size: 1MiB
    storage: file_storage
    poll_interval: 10s
    retry_on_failure:
        enabled: true
  filelog/continues:
    include: [ C:/test1.txt]
    include_file_path: true
    max_log_size: 1MiB
    storage: file_storage
    poll_interval: 10s
    retry_on_failure:
        enabled: true

exporters:
  logging:

  otlphttp:
    endpoint: http://127.0.0.1:4318
    tls:
      insecure: true

extensions:
  file_storage:
    directory: C:/collector
    fsync: true

processors:
  batch:
    send_batch_size: 1000
  memory_limiter:
    check_interval: 1s
    limit_mib: 250
    spike_limit_mib: 100

service:
  extensions: [file_storage]
  pipelines:
    logs:
      receivers: [filelog, filelog/continues]
      processors: [memory_limiter,batch]
      exporters: [logging, otlphttp]

I used this script to generate a continuous flow of logs to a file

while ($true) {
    "Test message - $(Get-Date)" | Out-File -FilePath "C:\test1.txt" -Append
     Start-Sleep -Seconds 2
}

Log output

2024-02-09T14:08:39.208Z        info    adapter/receiver.go:45  Starting stanza receiver        {"kind": "receiver", "name": "filelog", "data_type": "logs"}
2024-02-09T14:08:39.284Z        warn    fileconsumer/file.go:51 finding files: no files match the configured criteria   {"kind": "receiver", "name": "filelog", "data_type": "logs", "component": "fileconsumer"}
2024-02-09T14:08:39.290Z        info    adapter/receiver.go:45  Starting stanza receiver        {"kind": "receiver", "name": "filelog/continues", "data_type": "logs"}
2024-02-09T14:08:39.315Z        info    [email protected]/service.go:165  Everything is ready. Begin running and processing data.
2024-02-09T14:08:49.330Z        info    fileconsumer/file.go:268        Started watching file   {"kind": "receiver", "name": "filelog/continues", "data_type": "logs", "component": "fileconsumer", "path": "C:\\test.txt"}
2024-02-09T14:08:59.495Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 1, "log records": 5}
2024-02-09T14:09:09.505Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 1, "log records": 5}
2024-02-09T14:14:09.463Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 1, "log records": 5}
2024-02-09T14:14:19.336Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 1, "log records": 5}
2024-02-09T14:14:19.420Z        info    fileconsumer/file.go:268        Started watching file   {"kind": "receiver", "name": "filelog", "data_type": "logs", "component": "fileconsumer", "path": "C:\\ERRORLOG"}
2024-02-09T14:14:19.426Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-09T14:14:20.337Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-09T14:14:20.368Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-09T14:14:21.211Z        info    memorylimiter/memorylimiter.go:222      Memory usage is above soft limit. Forcing a GC. {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 167}
2024-02-09T14:14:21.359Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-09T14:14:21.453Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-09T14:14:21.847Z        info    memorylimiter/memorylimiter.go:192      Memory usage after GC.  {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 178}
2024-02-09T14:14:21.953Z        warn    memorylimiter/memorylimiter.go:229      Memory usage is above soft limit. Refusing data.        {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 178}
2024-02-09T14:14:24.212Z        warn    memorylimiter/memorylimiter.go:203      Memory usage is above hard limit. Forcing a GC. {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 265}
2024-02-09T14:14:24.453Z        info    memorylimiter/memorylimiter.go:192      Memory usage after GC.  {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 76}
2024-02-09T14:14:24.463Z        info    memorylimiter/memorylimiter.go:215      Memory usage back within limits. Resuming normal operation.     {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 76}
2024-02-09T14:14:26.020Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-09T14:14:26.025Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-09T14:14:27.034Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-09T14:14:27.121Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-09T14:14:27.457Z        warn    memorylimiter/memorylimiter.go:229      Memory usage is above soft limit. Refusing data.        {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 205}
2024-02-09T14:14:29.214Z        info    memorylimiter/memorylimiter.go:215      Memory usage back within limits. Resuming normal operation.     {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 136}
2024-02-09T14:14:29.341Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 1, "log records": 5}
2024-02-09T14:14:30.215Z        warn    memorylimiter/memorylimiter.go:229      Memory usage is above soft limit. Refusing data.        {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 151}
2024-02-09T14:16:29.222Z        info    memorylimiter/memorylimiter.go:215      Memory usage back within limits. Resuming normal operation.     {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 35}
2024-02-09T14:16:29.779Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 6, "log records": 60}
2024-02-09T14:16:38.800Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 11, "log records": 1044}
2024-02-09T14:16:38.810Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-09T14:16:39.811Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-09T14:16:39.826Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-09T14:16:40.238Z        info    memorylimiter/memorylimiter.go:222      Memory usage is above soft limit. Forcing a GC. {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 182}
2024-02-09T14:16:40.816Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-09T14:16:40.863Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-09T14:16:40.869Z        info    memorylimiter/memorylimiter.go:192      Memory usage after GC.  {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 125}
2024-02-09T14:16:41.231Z        warn    memorylimiter/memorylimiter.go:229      Memory usage is above soft limit. Refusing data.        {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 193}
2024-02-09T14:18:42.212Z        info    memorylimiter/memorylimiter.go:215      Memory usage back within limits. Resuming normal operation.     {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 35}
2024-02-09T14:18:53.290Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-09T14:18:53.290Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-09T14:18:55.222Z        info    memorylimiter/memorylimiter.go:222      Memory usage is above soft limit. Forcing a GC. {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 156}
2024-02-09T14:18:55.282Z        info    memorylimiter/memorylimiter.go:192      Memory usage after GC.  {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 42}
2024-02-09T14:18:58.586Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 6, "log records": 65}
2024-02-09T14:19:29.419Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 1, "log records": 4}
2024-02-09T14:19:30.314Z        info    [email protected]/collector.go:258        Received signal from OS {"signal": "interrupt"}
2024-02-09T14:19:30.314Z        info    [email protected]/service.go:179  Starting shutdown...
2024-02-09T14:19:30.319Z        info    adapter/receiver.go:140 Stopping stanza receiver        {"kind": "receiver", "name": "filelog/continues", "data_type": "logs"}
2024-02-09T14:19:30.364Z        info    adapter/receiver.go:140 Stopping stanza receiver        {"kind": "receiver", "name": "filelog", "data_type": "logs"}
2024-02-09T14:19:30.383Z        info    extensions/extensions.go:59     Stopping extensions...
2024-02-09T14:19:30.383Z        info    [email protected]/service.go:193  Shutdown complete.

Result
Initially, the OpenTelemetry Collector successfully collected logs from the filelog/continues receiver. However, after it handled a file with a large number of logs, the collector reached the soft limit and both receivers experienced problems, so I suspect this affects the entire log pipeline. In my next test I checked whether other telemetry pipelines (traces/metrics) are also affected by reaching the soft limit: with the memory_limiter processor in the log pipeline and an additional metrics pipeline without the memory limiter, the processor in the log pipeline had no effect on the metrics pipeline. Metrics were still transmitted accurately and without a 120-second delay.

From my experience with Fluent Bit, a more effective approach to controlling memory consumption is to check it as early as possible to minimize backpressure. I assume that, with proper configuration, receivers in the Collector should not impact one another.

marcinsiennicki95 changed the title from "High memory consumption of collector and dropped logs" to "[receiver/filelog] High memory consumption of collector and dropped logs" on Feb 9, 2024
marcinsiennicki95 (Author) commented Feb 9, 2024

It is not clear to me why the pipeline is paused for so long, even when the following messages are displayed:

2024-02-09T13:10:15.177Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-09T13:10:15.433Z        info    memorylimiter/memorylimiter.go:192      Memory usage after GC.  {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 110}
2024-02-09T13:10:15.624Z        warn    memorylimiter/memorylimiter.go:229      Memory usage is above soft limit. Refusing data.        {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 110}
2024-02-09T13:12:16.100Z        info    memorylimiter/memorylimiter.go:215      Memory usage back within limits. Resuming normal operation.     {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 35}
2024-02-09T13:12:27.063Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}

I assumed it would resume immediately after memory recovers to below the soft limit.
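
If part of that pause comes from the receiver-side retry backoff rather than from the memory_limiter's check interval, tuning the filelog receiver's retry_on_failure backoff might be worth trying; a sketch using the documented retry_on_failure fields, with purely illustrative values:

receivers:
  filelog:
    retry_on_failure:
      enabled: true
      initial_interval: 1s   # wait before the first re-delivery of a refused batch (illustrative)
      max_interval: 10s      # cap on the backoff between retries (illustrative)
      max_elapsed_time: 0    # 0 is documented as "never stop retrying" rather than eventually discarding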

marcinsiennicki95 (Author) commented Feb 12, 2024

Another test: handling high volumes of logs and metrics with the memory_limiter processor in both pipelines

OpenTelemetry Collector configuration

receivers:
 filelog:
   include: [ C:/test ]
   encoding: utf-16le
   include_file_path: true
   max_log_size: 1MiB
   storage: file_storage
   poll_interval: 10s
   retry_on_failure:
       enabled: true
 
 hostmetrics:
   collection_interval: 2s
   scrapers:
     cpu:
     memory:

exporters:
 logging:

 otlphttp:
   endpoint: http://127.0.0.1:4318
   tls:
     insecure: true

extensions:
 file_storage:
   directory: C:/collector
   fsync: true

processors:
 batch:
   send_batch_size: 1000
 memory_limiter:
   check_interval: 1s
   limit_mib: 250
   spike_limit_mib: 100

service:
 extensions: [file_storage]
 pipelines:
   logs:
     receivers: [filelog]
     processors: [memory_limiter,batch]
     exporters: [logging, otlphttp]
   metrics:
     receivers: [hostmetrics]
     processors: [memory_limiter,batch]
     exporters: [logging, otlphttp]

Log output

PS C:\Users\kCuraCloudAdmin> C:\\collector\\otelcol-contrib_0.93.0_windows_amd64\\otelcol-contrib.exe --config C:\\collector\\otelcol-contrib_0.93.0_windows_amd64\\config.yaml
2024-02-12T10:10:54.921Z        info    [email protected]/telemetry.go:76 Setting up own telemetry...
2024-02-12T10:10:54.931Z        info    [email protected]/telemetry.go:146        Serving metrics {"address": ":8888", "level": "Basic"}
2024-02-12T10:10:54.931Z        info    [email protected]/exporter.go:275        Deprecated component. Will be removed in future releases.       {"kind": "exporter", "data_type": "logs", "name": "logging"}
2024-02-12T10:10:54.933Z        info    [email protected]/exporter.go:275        Deprecated component. Will be removed in future releases.       {"kind": "exporter", "data_type": "metrics", "name": "logging"}
2024-02-12T10:10:54.933Z        info    memorylimiter/memorylimiter.go:77       Memory limiter configured       {"kind": "processor", "name": "memory_limiter", "pipeline": "metrics", "limit_mib": 250, "spike_limit_mib": 100, "check_interval": 1}
2024-02-12T10:10:54.935Z        info    [email protected]/service.go:139  Starting otelcol-contrib...     {"Version": "0.93.0", "NumCPU": 2}
2024-02-12T10:10:54.937Z        info    extensions/extensions.go:34     Starting extensions...
2024-02-12T10:10:54.938Z        info    extensions/extensions.go:37     Extension is starting...        {"kind": "extension", "name": "file_storage"}
2024-02-12T10:10:54.938Z        info    extensions/extensions.go:52     Extension started.      {"kind": "extension", "name": "file_storage"}
2024-02-12T10:10:54.939Z        info    adapter/receiver.go:45  Starting stanza receiver        {"kind": "receiver", "name": "filelog/continues", "data_type": "logs"}
2024-02-12T10:10:54.976Z        warn    fileconsumer/file.go:51 finding files: no files match the configured criteria   {"kind": "receiver", "name": "filelog/continues", "data_type": "logs", "component": "fileconsumer"}
2024-02-12T10:10:54.987Z        info    adapter/receiver.go:45  Starting stanza receiver        {"kind": "receiver", "name": "filelog", "data_type": "logs"}
2024-02-12T10:10:55.018Z        warn    fileconsumer/file.go:51 finding files: no files match the configured criteria   {"kind": "receiver", "name": "filelog", "data_type": "logs", "component": "fileconsumer"}
2024-02-12T10:10:55.033Z        info    [email protected]/service.go:165  Everything is ready. Begin running and processing data.
2024-02-12T10:10:56.197Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T10:11:14.048Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T10:11:15.160Z        info    fileconsumer/file.go:268        Started watching file   {"kind": "receiver", "name": "filelog", "data_type": "logs", "component": "fileconsumer", "path": "C:\\ERRORLOG"}
2024-02-12T10:11:15.165Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T10:11:15.165Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T10:11:16.170Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T10:11:16.177Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T10:11:16.813Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T10:11:18.275Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T10:11:18.943Z        info    memorylimiter/memorylimiter.go:222      Memory usage is above soft limit. Forcing a GC. {"kind": "processor", "name": "memory_limiter", "pipeline": "metrics", "cur_mem_mib": 156}
2024-02-12T10:11:18.994Z        info    memorylimiter/memorylimiter.go:192      Memory usage after GC.  {"kind": "processor", "name": "memory_limiter", "pipeline": "metrics", "cur_mem_mib": 37}
2024-02-12T10:11:20.156Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T10:11:22.042Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T10:14:58.120Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T10:15:00.201Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T10:15:02.045Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}

Result

Initially, the OpenTelemetry Collector successfully collected the metrics. However, after processing a file containing a large number of logs (200,000), the collector hit the soft limit (reported for the metrics pipeline), and only 120,000 logs were collected. For logs the situation resembled a "hard limit" scenario: the log flow stopped, and even after the 120-second pause the remaining logs were not retried.

marcinsiennicki95 (Author) commented Feb 12, 2024

Another test
Communication between two instances of the OpenTelemetry Collector

First OpenTelemetry Collector configuration

receivers:
  filelog:
    include: [ C:/ERRORLOG ]
    encoding: utf-16le
    include_file_path: true
    max_log_size: 1MiB
    storage: file_storage
    poll_interval: 10s
    retry_on_failure:
        enabled: true
  
  hostmetrics:
    collection_interval: 2s
    scrapers:
      cpu:
      memory:

exporters:
  logging:

  otlphttp:
    endpoint: http://127.0.0.1:4320
    tls:
      insecure: true

extensions:
  file_storage:
    directory: C:/collector
    fsync: true

processors:
  batch:
    send_batch_size: 1000
  memory_limiter:
    check_interval: 1s
    limit_mib: 250
    spike_limit_mib: 100

service:
  extensions: [file_storage]
  pipelines:
    logs:
      receivers: [filelog]
      processors: [batch]
      exporters: [logging, otlphttp]
    metrics:
      receivers: [hostmetrics]
      processors: [batch]
      exporters: [logging, otlphttp]

Log output for first Collector

PS C:\Users\kCuraCloudAdmin> C:\\collector\\otelcol-contrib_0.93.0_windows_amd64\\otelcol-contrib.exe --config C:\\collector\\otelcol-contrib_0.93.0_windows_amd64\\config.yaml
2024-02-12T11:15:59.222Z        info    [email protected]/telemetry.go:76 Setting up own telemetry...
2024-02-12T11:15:59.222Z        info    [email protected]/telemetry.go:146        Serving metrics {"address": ":8888", "level": "Basic"}
2024-02-12T11:15:59.226Z        info    [email protected]/exporter.go:275        Deprecated component. Will be removed in future releases.       {"kind": "exporter", "data_type": "logs", "name": "logging"}
2024-02-12T11:15:59.227Z        info    [email protected]/exporter.go:275        Deprecated component. Will be removed in future releases.       {"kind": "exporter", "data_type": "metrics", "name": "logging"}
2024-02-12T11:15:59.228Z        info    [email protected]/service.go:139  Starting otelcol-contrib...     {"Version": "0.93.0", "NumCPU": 2}
2024-02-12T11:15:59.229Z        info    extensions/extensions.go:34     Starting extensions...
2024-02-12T11:15:59.229Z        info    extensions/extensions.go:37     Extension is starting...        {"kind": "extension", "name": "file_storage"}
2024-02-12T11:15:59.230Z        info    extensions/extensions.go:52     Extension started.      {"kind": "extension", "name": "file_storage"}
2024-02-12T11:15:59.231Z        info    adapter/receiver.go:45  Starting stanza receiver        {"kind": "receiver", "name": "filelog", "data_type": "logs"}
2024-02-12T11:15:59.290Z        warn    fileconsumer/file.go:51 finding files: no files match the configured criteria   {"kind": "receiver", "name": "filelog", "data_type": "logs", "component": "fileconsumer"}
2024-02-12T11:15:59.296Z        info    adapter/receiver.go:45  Starting stanza receiver        {"kind": "receiver", "name": "filelog/continues", "data_type": "logs"}
2024-02-12T11:15:59.316Z        warn    fileconsumer/file.go:51 finding files: no files match the configured criteria   {"kind": "receiver", "name": "filelog/continues", "data_type": "logs", "component": "fileconsumer"}
2024-02-12T11:15:59.327Z        info    [email protected]/service.go:165  Everything is ready. Begin running and processing data.
2024-02-12T11:16:00.437Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T11:16:02.304Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", 
2024-02-12T11:16:24.299Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T11:16:26.408Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T11:16:28.293Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T11:16:29.428Z        info    fileconsumer/file.go:268        Started watching file   {"kind": "receiver", "name": "filelog", "data_type": "logs", "component": "fileconsumer", "path": "C:\\ERRORLOG"}
2024-02-12T11:16:29.432Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T11:16:32.733Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 11, "log records": 1022}
2024-02-12T11:16:33.258Z        error   scraperhelper/scrapercontroller.go:200  Error scraping metrics  {"kind": "receiver", "name": "hostmetrics", "data_type": "metrics", "error": "context deadline exceeded", "scraper": "cpu"}
go.opentelemetry.io/collector/receiver/scraperhelper.(*controller).scrapeMetricsAndReport
        go.opentelemetry.io/collector/[email protected]/scraperhelper/scrapercontroller.go:200
go.opentelemetry.io/collector/receiver/scraperhelper.(*controller).startScraping.func1
        go.opentelemetry.io/collector/[email protected]/scraperhelper/scrapercontroller.go:176
2024-02-12T11:16:33.729Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T11:16:33.814Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T11:16:34.037Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 1, "metrics": 1, "data points": 2}
2024-02-12T11:16:36.803Z        error   scraperhelper/scrapercontroller.go:200  Error scraping metrics  {"kind": "receiver", "name": "hostmetrics", "data_type": "metrics", "error": "context deadline exceeded", "scraper": "cpu"}
go.opentelemetry.io/collector/receiver/scraperhelper.(*controller).scrapeMetricsAndReport
        go.opentelemetry.io/collector/[email protected]/scraperhelper/scrapercontroller.go:200
go.opentelemetry.io/collector/receiver/scraperhelper.(*controller).startScraping.func1
        go.opentelemetry.io/collector/[email protected]/scraperhelper/scrapercontroller.go:176
2024-02-12T11:16:36.936Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 1, "metrics": 1, "data points": 2}
2024-02-12T11:16:37.676Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 26}
2024-02-12T11:16:40.065Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 11, "log records": 1045}
2024-02-12T11:16:40.277Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T11:16:40.286Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T11:16:41.083Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T11:16:41.221Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T11:16:41.649Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T11:16:49.514Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 1, "log records": 1}
2024-02-12T11:16:50.380Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}

Second OpenTelemetry Collector Configuration

receivers:  
  otlp:
    protocols:
      http:
        endpoint: "127.0.0.1:4320"

exporters:
  logging:

  otlphttp:
    endpoint: http://127.0.0.1:4318
    tls:
      insecure: true

processors:
  batch:
    send_batch_size: 1000
  memory_limiter:
    check_interval: 1s
    limit_mib: 250
    spike_limit_mib: 100

service:
  telemetry:
    metrics:
      address: "0.0.0.0:9090" # Use a port that you know is free
  pipelines:
    logs:
      receivers: [otlp]
      processors: [batch]
      exporters: [logging, otlphttp]
    metrics:
      receivers: [otlp]
      processors: [batch]
      exporters: [logging, otlphttp]

Log output for second Collector

PS C:\Users\kCuraCloudAdmin> C:\\collector\\second\\otelcol-contrib.exe --config C:\\collector\\second\\config.yaml
2024-02-12T11:15:55.914Z        info    [email protected]/telemetry.go:76 Setting up own telemetry...
2024-02-12T11:15:55.914Z        info    [email protected]/telemetry.go:146        Serving metrics {"address": "0.0.0.0:9090", "level": "Basic"}
2024-02-12T11:15:55.917Z        info    [email protected]/exporter.go:275        Deprecated component. Will be removed in future releases.       {"kind": "exporter", "data_type": "metrics", "name": "logging"}
2024-02-12T11:15:55.923Z        info    [email protected]/exporter.go:275        Deprecated component. Will be removed in future releases.       {"kind": "exporter", "data_type": "logs", "name": "logging"}
2024-02-12T11:15:55.923Z        info    [email protected]/service.go:139  Starting otelcol-contrib...     {"Version": "0.93.0", "NumCPU": 2}
2024-02-12T11:15:55.923Z        info    extensions/extensions.go:34     Starting extensions...
2024-02-12T11:15:55.923Z        info    [email protected]/otlp.go:152        Starting HTTP server    {"kind": "receiver", "name": "otlp", "data_type": "metrics", "endpoint": "127.0.0.1:4320"}
2024-02-12T11:15:55.923Z        info    [email protected]/service.go:165  Everything is ready. Begin running and processing data.
2024-02-12T11:16:00.553Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T11:16:28.489Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T11:16:29.468Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T11:16:34.296Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 1, "metrics": 1, "data points": 2}
2024-02-12T11:16:34.474Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T11:16:34.526Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T11:16:37.106Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 1, "metrics": 1, "data points": 2}
2024-02-12T11:16:37.775Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 26}
2024-02-12T11:16:39.767Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T11:16:39.797Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T11:16:40.466Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T11:16:40.784Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T11:16:40.796Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T11:16:41.813Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 2, "log records": 166}
2024-02-12T11:16:41.858Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T11:16:48.891Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T11:16:49.683Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 1, "log records": 1}
2024-02-12T11:16:50.587Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}

Result
Out of 1 million logs, only 516,000 were collected. No log-related errors were reported on the console.
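
If the goal is to keep batches that cannot be delivered to the second Collector instead of losing them, the first Collector's otlphttp exporter queue can be persisted through the file_storage extension that is already configured; a sketch under that assumption (a suggestion to try, not a diagnosis of where the missing logs went):

exporters:
  otlphttp:
    endpoint: http://127.0.0.1:4320
    tls:
      insecure: true
    retry_on_failure:
      enabled: true
    sending_queue:
      enabled: true
      storage: file_storage   # persist the exporter queue via the file_storage extension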

marcinsiennicki95 (Author) commented Feb 12, 2024

Another test

Communication between two instances of the OpenTelemetry Collector with memory_limiter processors

First OpenTelemetry Collector configuration

receivers:
  filelog:
    include: [ C:/ERRORLOG ]
    encoding: utf-16le
    include_file_path: true
    max_log_size: 1MiB
    storage: file_storage
    poll_interval: 10s
    retry_on_failure:
        enabled: true
  
  hostmetrics:
    collection_interval: 2s
    scrapers:
      cpu:
      memory:

exporters:
  logging:

  otlphttp:
    endpoint: http://127.0.0.1:4320
    tls:
      insecure: true

extensions:
  file_storage:
    directory: C:/collector
    fsync: true

processors:
  batch:
    send_batch_size: 1000
  memory_limiter:
    check_interval: 1s
    limit_mib: 400
    spike_limit_mib: 250

service:
  extensions: [file_storage]
  pipelines:
    logs:
      receivers: [filelog]
      processors: [batch]
      exporters: [logging, otlphttp]
    metrics:
      receivers: [hostmetrics]
      processors: [batch]
      exporters: [logging, otlphttp]

Log output for first Collector

PS C:\Users\kCuraCloudAdmin> C:\\collector\\otelcol-contrib_0.93.0_windows_amd64\\otelcol-contrib.exe --config C:\\collector\\otelcol-contrib_0.93.0_windows_amd64\\config.yaml
2024-02-12T12:12:05.974Z        info    [email protected]/telemetry.go:76 Setting up own telemetry...
2024-02-12T12:12:05.980Z        info    [email protected]/telemetry.go:146        Serving metrics {"address": ":8888", "level": "Basic"}
2024-02-12T12:12:05.980Z        info    [email protected]/exporter.go:275        Deprecated component. Will be removed in future releases.       {"kind": "exporter", "data_type": "metrics", "name": "logging"}
2024-02-12T12:12:05.980Z        info    memorylimiter/memorylimiter.go:77       Memory limiter configured       {"kind": "processor", "name": "memory_limiter", "pipeline": "metrics", "limit_mib": 400, "spike_limit_mib": 250, "check_interval": 1}
2024-02-12T12:12:05.980Z        info    [email protected]/exporter.go:275        Deprecated component. Will be removed in future releases.       {"kind": "exporter", "data_type": "logs", "name": "logging"}
2024-02-12T12:12:05.980Z        info    [email protected]/service.go:139  Starting otelcol-contrib...     {"Version": "0.93.0", "NumCPU": 2}
2024-02-12T12:12:05.980Z        info    extensions/extensions.go:34     Starting extensions...
2024-02-12T12:12:05.980Z        info    extensions/extensions.go:37     Extension is starting...        {"kind": "extension", "name": "file_storage"}
2024-02-12T12:12:05.980Z        info    extensions/extensions.go:52     Extension started.      {"kind": "extension", "name": "file_storage"}
2024-02-12T12:12:05.995Z        info    adapter/receiver.go:45  Starting stanza receiver        {"kind": "receiver", "name": "filelog", "data_type": "logs"}
2024-02-12T12:12:06.033Z        info    fileconsumer/file.go:64 Resuming from previously known offset(s). 'start_at' setting is not applicable. {"kind": "receiver", "name": "filelog", "data_type": "logs", "component": "fileconsumer"}
2024-02-12T12:12:06.034Z        info    adapter/receiver.go:45  Starting stanza receiver        {"kind": "receiver", "name": "filelog/continues", "data_type": "logs"}
2024-02-12T12:12:06.072Z        warn    fileconsumer/file.go:51 finding files: no files match the configured criteria   {"kind": "receiver", "name": "filelog/continues", "data_type": "logs", "component": "fileconsumer"}
2024-02-12T12:12:06.088Z        info    [email protected]/service.go:165  Everything is ready. Begin running and processing data.
2024-02-12T12:12:07.229Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
"resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:12:25.077Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:12:25.078Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "5.5346304s"}
2024-02-12T12:12:27.150Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:12:27.322Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "6.706840488s"}
2024-02-12T12:12:27.475Z        info    [email protected]/collector.go:258        Received signal from OS {"signal": "interrupt"}
2024-02-12T12:12:27.475Z        info    [email protected]/service.go:179  Starting shutdown...
2024-02-12T12:12:27.479Z        info    adapter/receiver.go:140 Stopping stanza receiver        {"kind": "receiver", "name": "filelog/continues", "data_type": "logs"}
2024-02-12T12:12:27.496Z        info    adapter/receiver.go:140 Stopping stanza receiver        {"kind": "receiver", "name": "filelog", "data_type": "logs"}
2024-02-12T12:12:27.517Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "interrupted due to shutdown: failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "dropped_items": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
        go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/consumers.go:43
2024-02-12T12:12:27.523Z        info    extensions/extensions.go:59     Stopping extensions...
2024-02-12T12:12:27.523Z        info    [email protected]/service.go:193  Shutdown complete.
PS C:\Users\kCuraCloudAdmin> C:\\collector\\otelcol-contrib_0.93.0_windows_amd64\\otelcol-contrib.exe --config C:\\collector\\otelcol-contrib_0.93.0_windows_amd64\\config.yaml
2024-02-12T12:13:12.508Z        info    [email protected]/telemetry.go:76 Setting up own telemetry...
2024-02-12T12:13:12.508Z        info    [email protected]/telemetry.go:146        Serving metrics {"address": ":8888", "level": "Basic"}
2024-02-12T12:13:12.512Z        info    [email protected]/exporter.go:275        Deprecated component. Will be removed in future releases.       {"kind": "exporter", "data_type": "logs", "name": "logging"}
2024-02-12T12:13:12.513Z        info    [email protected]/exporter.go:275        Deprecated component. Will be removed in future releases.       {"kind": "exporter", "data_type": "metrics", "name": "logging"}
2024-02-12T12:13:12.514Z        info    memorylimiter/memorylimiter.go:77       Memory limiter configured       {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "limit_mib": 400, "spike_limit_mib": 250, "check_interval": 1}
2024-02-12T12:13:12.514Z        info    [email protected]/service.go:139  Starting otelcol-contrib...     {"Version": "0.93.0", "NumCPU": 2}
2024-02-12T12:13:12.515Z        info    extensions/extensions.go:34     Starting extensions...
2024-02-12T12:13:12.515Z        info    extensions/extensions.go:37     Extension is starting...        {"kind": "extension", "name": "file_storage"}
2024-02-12T12:13:12.515Z        info    extensions/extensions.go:52     Extension started.      {"kind": "extension", "name": "file_storage"}
2024-02-12T12:13:12.516Z        info    adapter/receiver.go:45  Starting stanza receiver        {"kind": "receiver", "name": "filelog", "data_type": "logs"}
2024-02-12T12:13:12.547Z        warn    fileconsumer/file.go:51 finding files: no files match the configured criteria   {"kind": "receiver", "name": "filelog", "data_type": "logs", "component": "fileconsumer"}
2024-02-12T12:13:12.567Z        info    adapter/receiver.go:45  Starting stanza receiver        {"kind": "receiver", "name": "filelog/continues", "data_type": "logs"}
2024-02-12T12:13:12.582Z        warn    fileconsumer/file.go:51 finding files: no files match the configured criteria   {"kind": "receiver", "name": "filelog/continues", "data_type": "logs", "component": "fileconsumer"}
2024-02-12T12:13:12.594Z        info    [email protected]/service.go:165  Everything is ready. Begin running and processing data.
2024-02-12T12:13:13.757Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:13.758Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "3.935162309s"}
2024-02-12T12:13:15.631Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:15.636Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "7.263222629s"}
2024-02-12T12:13:17.739Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:19.633Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:31.773Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:32.702Z        info    fileconsumer/file.go:268        Started watching file   {"kind": "receiver", "name": "filelog", "data_type": "logs", "component": "fileconsumer", "path": "C:\\ERRORLOG"}
2024-02-12T12:13:32.708Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:32.712Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:33.759Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:36.018Z        info    memorylimiter/memorylimiter.go:222      Memory usage is above soft limit. Forcing a GC. {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 155}
2024-02-12T12:13:36.087Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:36.708Z        error   scraperhelper/scrapercontroller.go:200  Error scraping metrics  {"kind": "receiver", "name": "hostmetrics", "data_type": "metrics", "error": "context deadline exceeded", "scraper": "cpu"}
go.opentelemetry.io/collector/receiver/scraperhelper.(*controller).scrapeMetricsAndReport
        go.opentelemetry.io/collector/[email protected]/scraperhelper/scrapercontroller.go:200
go.opentelemetry.io/collector/receiver/scraperhelper.(*controller).startScraping.func1
        go.opentelemetry.io/collector/[email protected]/scraperhelper/scrapercontroller.go:176
2024-02-12T12:13:36.714Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 1, "metrics": 1, "data points": 2}
2024-02-12T12:13:36.839Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:36.922Z        info    memorylimiter/memorylimiter.go:192      Memory usage after GC.  {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 130}
2024-02-12T12:13:36.923Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:37.532Z        warn    memorylimiter/memorylimiter.go:229      Memory usage is above soft limit. Refusing data.        {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 210}
2024-02-12T12:13:38.519Z        info    memorylimiter/memorylimiter.go:215      Memory usage back within limits. Resuming normal operation.     {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 101}
2024-02-12T12:13:39.664Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:41.769Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "logs", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/logs responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 1000}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createLogsExporter.WithQueue.func5.1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
        go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/consumers.go:43
2024-02-12T12:13:42.239Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:42.505Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "logs", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/logs responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 1000}
2024-02-12T12:13:43.294Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:43.308Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:43.695Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:44.224Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "logs", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/logs responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 1000}
2024-02-12T12:13:44.297Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:44.305Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:44.964Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "logs", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/logs responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 1000}
2024-02-12T12:13:45.313Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:45.331Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:45.606Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "logs", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/logs responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 1000}
2024-02-12T12:13:45.767Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:47.641Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:49.727Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:51.612Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:52.831Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 1, "log records": 1}
2024-02-12T12:13:52.832Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "logs", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/logs responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 1}
2024-02-12T12:13:53.720Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:01.684Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
        go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/consumers.go:43
2024-02-12T12:14:03.768Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:03.770Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
2024-02-12T12:14:05.664Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:05.665Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
2024-02-12T12:14:07.741Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:07.743Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
2024-02-12T12:14:09.612Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:09.614Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
2024-02-12T12:14:11.702Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:11.704Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
2024-02-12T12:14:13.578Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:13.581Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
2024-02-12T12:14:15.669Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:15.671Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
2024-02-12T12:14:17.758Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:17.760Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
2024-02-12T12:14:19.633Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:19.637Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
2024-02-12T12:14:21.735Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:21.737Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
2024-02-12T12:14:23.598Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:23.600Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
2024-02-12T12:14:25.682Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:25.687Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
2024-02-12T12:14:27.767Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:27.768Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
2024-02-12T12:14:29.636Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:29.637Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
2024-02-12T12:14:31.714Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:31.716Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
2024-02-12T12:14:33.598Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:33.600Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
2024-02-12T12:14:35.669Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:35.670Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
2024-02-12T12:14:37.743Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:37.745Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
2024-02-12T12:14:39.619Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:39.625Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
2024-02-12T12:14:41.708Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:41.710Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
2024-02-12T12:14:43.580Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:43.583Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
2024-02-12T12:14:45.642Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:45.643Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
2024-02-12T12:14:47.729Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:47.731Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
2024-02-12T12:14:49.605Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:49.606Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
2024-02-12T12:14:51.686Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:51.688Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
2024-02-12T12:14:53.763Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:53.764Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "2.528889573s"}
2024-02-12T12:14:55.635Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:55.637Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "4.469254533s"}
2024-02-12T12:14:56.312Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "3.781748409s"}
2024-02-12T12:14:57.723Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:57.724Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "5.922314888s"}
2024-02-12T12:14:59.590Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:59.591Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "5.101666595s"}
2024-02-12T12:15:00.106Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "7.260620386s"}
2024-02-12T12:15:00.124Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "10.609948295s"}
2024-02-12T12:15:01.673Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:15:01.673Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "6.600585342s"}
2024-02-12T12:15:03.661Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "3.893131851s"}
2024-02-12T12:15:03.758Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:15:03.761Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "5.631966261s"}
2024-02-12T12:15:04.714Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "7.862112453s"}
2024-02-12T12:15:05.622Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:15:05.622Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "6.964815392s"}
2024-02-12T12:15:07.381Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "12.618459834s"}
2024-02-12T12:15:07.558Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "11.67278761s"}
2024-02-12T12:15:07.698Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:15:07.699Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "4.90095434s"}
2024-02-12T12:15:08.280Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "7.703637679s"}
2024-02-12T12:15:09.409Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "7.030750379s"}
2024-02-12T12:15:09.771Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:15:09.773Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "6.588990315s"}
2024-02-12T12:15:10.748Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "16.49570526s"}
2024-02-12T12:15:10.864Z        info    [email protected]/collector.go:258        Received signal from OS {"signal": "interrupt"}
2024-02-12T12:15:10.864Z        info    [email protected]/service.go:179  Starting shutdown...
2024-02-12T12:15:10.869Z        info    adapter/receiver.go:140 Stopping stanza receiver        {"kind": "receiver", "name": "filelog/continues", "data_type": "logs"}
2024-02-12T12:15:10.884Z        info    adapter/receiver.go:140 Stopping stanza receiver        {"kind": "receiver", "name": "filelog", "data_type": "logs"}
2024-02-12T12:15:10.897Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "interrupted due to shutdown: failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "dropped_items": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
        go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/consumers.go:43
2024-02-12T12:15:10.898Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "interrupted due to shutdown: failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "dropped_items": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
        go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/consumers.go:43
2024-02-12T12:15:10.898Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "interrupted due to shutdown: failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "dropped_items": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
        go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/consumers.go:43
2024-02-12T12:15:10.898Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "interrupted due to shutdown: failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "dropped_items": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
        go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/consumers.go:43
2024-02-12T12:15:10.898Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "interrupted due to shutdown: failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "dropped_items": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
        go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/consumers.go:43
2024-02-12T12:15:10.898Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "interrupted due to shutdown: failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "dropped_items": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
        go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/consumers.go:43
2024-02-12T12:15:10.898Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "interrupted due to shutdown: failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "dropped_items": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
        go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/consumers.go:43
2024-02-12T12:15:10.898Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "interrupted due to shutdown: failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "dropped_items": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
        go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/consumers.go:43
2024-02-12T12:15:10.898Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "interrupted due to shutdown: failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "dropped_items": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
        go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/consumers.go:43
2024-02-12T12:15:10.905Z        info    extensions/extensions.go:59     Stopping extensions...
2024-02-12T12:15:10.906Z        info    [email protected]/service.go:193  Shutdown complete.

Second OpenTelemetry Collector Configuration

receivers:  
  otlp:
    protocols:
      http:
        endpoint: "127.0.0.1:4320"

exporters:
  logging:

  otlphttp:
    endpoint: http://127.0.0.1:4318
    tls:
      insecure: true

processors:
  batch:
    send_batch_size: 1000
  memory_limiter:
    check_interval: 1s
    limit_mib: 400
    spike_limit_mib: 250

service:
  telemetry:
    metrics:
      address: "0.0.0.0:9090" # Use a port that you know is free
  pipelines:
    logs:
      receivers: [otlp]
      processors: [memory_limiter,batch]
      exporters: [logging, otlphttp]
    metrics:
      receivers: [otlp]
      processors: [memory_limiter,batch]
      exporters: [logging, otlphttp]
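
With these settings, and assuming the standard memory_limiter semantics (soft limit = limit_mib - spike_limit_mib), the soft limit works out to 400 - 250 = 150 MiB. That matches the log below, where the processor forces a GC at cur_mem_mib 164 and starts refusing data once usage stays around or above 150 MiB.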

Log output for second Collector

PS C:\Users\kCuraCloudAdmin> C:\\collector\\second\\otelcol-contrib.exe --config C:\\collector\\second\\config.yaml
2024-02-12T12:13:15.717Z        info    [email protected]/telemetry.go:76 Setting up own telemetry...
2024-02-12T12:13:15.718Z        info    [email protected]/telemetry.go:146        Serving metrics {"address": "0.0.0.0:9090", "level": "Basic"}
2024-02-12T12:13:15.723Z        info    [email protected]/exporter.go:275        Deprecated component. Will be removed in future releases.       {"kind": "exporter", "data_type": "metrics", "name": "logging"}
2024-02-12T12:13:15.730Z        info    memorylimiter/memorylimiter.go:77       Memory limiter configured       {"kind": "processor", "name": "memory_limiter", "pipeline": "metrics", "limit_mib": 400, "spike_limit_mib": 250, "check_interval": 1}
2024-02-12T12:13:15.730Z        info    [email protected]/exporter.go:275        Deprecated component. Will be removed in future releases.       {"kind": "exporter", "data_type": "logs", "name": "logging"}
2024-02-12T12:13:15.730Z        info    [email protected]/service.go:139  Starting otelcol-contrib...     {"Version": "0.93.0", "NumCPU": 2}
2024-02-12T12:13:15.730Z        info    extensions/extensions.go:34     Starting extensions...
2024-02-12T12:13:15.734Z        info    [email protected]/otlp.go:152        Starting HTTP server    {"kind": "receiver", "name": "otlp", "data_type": "metrics", "endpoint": "127.0.0.1:4320"}
2024-02-12T12:13:15.739Z        info    [email protected]/service.go:165  Everything is ready. Begin running and processing data.
2024-02-12T12:13:17.806Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 4, "metrics": 4, "data points": 20}
2024-02-12T12:13:19.698Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:21.796Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:23.056Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:23.680Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:25.786Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:27.677Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:29.754Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:31.838Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:32.720Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:32.732Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:33.759Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:33.887Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:34.838Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 11, "log records": 1010}
2024-02-12T12:13:34.916Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:36.044Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:36.091Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:36.922Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 1, "metrics": 1, "data points": 2}
2024-02-12T12:13:37.077Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:37.098Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:38.733Z        info    memorylimiter/memorylimiter.go:222      Memory usage is above soft limit. Forcing a GC. {"kind": "processor", "name": "memory_limiter", "pipeline": "metrics", "cur_mem_mib": 164}
2024-02-12T12:13:39.139Z        info    memorylimiter/memorylimiter.go:192      Memory usage after GC.  {"kind": "processor", "name": "memory_limiter", "pipeline": "metrics", "cur_mem_mib": 107}
2024-02-12T12:13:39.806Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:41.734Z        warn    memorylimiter/memorylimiter.go:229      Memory usage is above soft limit. Refusing data.        {"kind": "processor", "name": "memory_limiter", "pipeline": "metrics", "cur_mem_mib": 150}
2024-02-12T12:13:42.732Z        info    memorylimiter/memorylimiter.go:215      Memory usage back within limits. Resuming normal operation.     {"kind": "processor", "name": "memory_limiter", "pipeline": "metrics", "cur_mem_mib": 132}
2024-02-12T12:13:42.742Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:42.750Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:43.734Z        warn    memorylimiter/memorylimiter.go:229      Memory usage is above soft limit. Refusing data.        {"kind": "processor", "name": "memory_limiter", "pipeline": "metrics", "cur_mem_mib": 228}
2024-02-12T12:13:43.908Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:52.290Z        info    [email protected]/collector.go:258        Received signal from OS {"signal": "interrupt"}
2024-02-12T12:14:52.290Z        info    [email protected]/service.go:179  Starting shutdown...
2024-02-12T12:14:52.305Z        info    extensions/extensions.go:59     Stopping extensions...
2024-02-12T12:14:52.305Z        info    [email protected]/service.go:193  Shutdown complete.

Results

Test Scenario

  1. First Collector Started
  2. Second Collector Started
  3. Move the Errorlog file (~1 million logs) into the monitored folder
  4. Turn off Second Collector
  5. Turn off First Collector

Initially, the first OpenTelemetry Collector was unable to send metrics to the second collector and encountered a retryable error:

2024-02-12T12:13:13.758Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "3.935162309s"}

After the second collector became available, metrics were transmitted successfully. When the log file was moved and processed by the first collector, both collectors reached their soft memory limits (the same happens when they reach the hard limit). At that point, the second collector started refusing data and the error propagated back to the first collector, which then saw export failures for both logs and metrics:

2024-02-12T12:14:33.600Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
        go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
        go.opentelemetry.io/collector/[email protected]/exporterhelper/internal/consumers.go:43

I expected the first collector to receive a recoverable error, such as HTTP 429 Too Many Requests, so that pending requests could be resent once the second collector resumed normal operation, instead of being dropped.
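
As a partial mitigation on the first collector (this does not change how permanent errors are classified), the otlphttp exporter's retry and queue settings can be enlarged so that transient failures are retried for longer before data is dropped. A minimal sketch with illustrative values; retry_on_failure and sending_queue are the standard exporterhelper options:

exporters:
  otlphttp:
    endpoint: http://127.0.0.1:4320
    tls:
      insecure: true
    retry_on_failure:
      enabled: true
      initial_interval: 5s
      max_interval: 30s
      max_elapsed_time: 300s   # keep retrying transient failures for up to 5 minutes
    sending_queue:
      enabled: true
      num_consumers: 10
      queue_size: 5000         # batches buffered while the backend is unreachable

This only helps with errors the exporter treats as retryable (for example, the "connection refused" failures above); responses classified as permanent, such as the HTTP 500 returned when the second collector refuses data, are still dropped immediately.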

Finally, after the second collector was shut down, the first collector reported the following error:

2024-02-12T12:15:09.773Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "6.588990315s"}

@djaglowski
Copy link
Member

I'd be curious to hear thoughts on how we address this. Do we need to remove the batching behavior in the adapter package? It has caused plenty of problems before, but emitting every log in a separate plog.Logs is problematic as well.

@marcinsiennicki95
Copy link
Author

I found the OpenTelemetry specification describing how HTTP error codes should be handled. It looks like the current behavior does not follow it, and the error codes are not handled correctly.
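
For context, my reading of the OTLP/HTTP specification is that a server applying back-pressure (for example because of memory pressure) should respond with HTTP 429 Too Many Requests or 503 Service Unavailable, optionally with a Retry-After header, and clients should treat those as retryable; other 4xx/5xx responses are treated as permanent and are dropped. The HTTP 500 returned in the scenario above therefore falls into the non-retryable branch.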

@djaglowski
Copy link
Member

Why would we need to implement HTTP error codes in a file-reading receiver?

@marcinsiennicki95
Copy link
Author

marcinsiennicki95 commented Feb 12, 2024

Sorry, I did not make it clear: this is not related to the filelog receiver, but to the OTLP receiver.

If I understand correctly, when setting up a chain like this: Collector 1 (client) with the OTLP exporter (or Fluent Bit with OTLP output), and Collector 2 (server) with the OTLP receiver, the receiver on Collector 2 should handle HTTP status codes as defined in the specification. Am I right?

@djaglowski
Copy link
Member

Apologies @marcinsiennicki95, I got wires crossed with #31074.

I'm not sure how I can help here. I'm going to close this issue but recommend the following:

  1. If there are any specific problems with the filelog receiver, please open a new issue for each problem you see and describe it in the narrowest terms possible.
  2. Likewise, consider opening issue(s) on the core collector repo to discuss any incorrect behaviors of the otlp components.
