The input beats plugin crashes if the document field has a special character #14291

Open
b2ronn opened this issue Jun 21, 2022 · 3 comments

b2ronn commented Jun 21, 2022

One of our applications sends labels to APM in a non-standard format, containing

labels.[ project: "somename"

I can't remove this label on the Logstash side, because the event crashes in the beats input plugin before any filters run, and it brings Logstash down with this error:

[2022-06-21T22:06:27,319][INFO ][org.logstash.beats.BeatsHandler][main][apm] [local: 172.21.65.17:5044, remote: 172.23.56.22:43272] Handling exception: org.logstash.FieldReference$IllegalSyntaxException: Invalid FieldReference: `[ project` (caused by: org.logstash.FieldReference$IllegalSyntaxException: Invalid FieldReference: `[ project`)
[2022-06-21T22:06:27,328][INFO ][org.logstash.beats.BeatsHandler][main][apm] [local: 172.21.65.17:5044, remote: 172.23.56.22:43272] Handling exception: org.logstash.FieldReference$IllegalSyntaxException: Invalid FieldReference: `[ project` (caused by: org.logstash.FieldReference$IllegalSyntaxException: Invalid FieldReference: `[ project`)
[2022-06-21T22:06:27,329][INFO ][org.logstash.beats.BeatsHandler][main][apm] [local: 172.21.65.17:5044, remote: 172.23.56.22:43272] Handling exception: org.logstash.FieldReference$IllegalSyntaxException: Invalid FieldReference: `[ project` (caused by: org.logstash.FieldReference$IllegalSyntaxException: Invalid FieldReference: `[ project`)
[2022-06-21T22:06:27,320][WARN ][io.netty.channel.DefaultChannelPipeline][main][apm] An exceptionCaught() event was fired, and it reached at the tail of the pipeline. It usually means the last handler in the pipeline did not handle the exception.
org.logstash.FieldReference$IllegalSyntaxException: Invalid FieldReference: `[ project`
	at org.logstash.FieldReference$StrictTokenizer.tokenize(FieldReference.java:321) ~[logstash-core.jar:?]
	at org.logstash.FieldReference.parse(FieldReference.java:222) ~[logstash-core.jar:?]
	at org.logstash.FieldReference.parseToCache(FieldReference.java:213) ~[logstash-core.jar:?]
	at org.logstash.FieldReference.from(FieldReference.java:136) ~[logstash-core.jar:?]
	at org.logstash.ConvertedMap.put(ConvertedMap.java:94) ~[logstash-core.jar:?]
	at org.logstash.ConvertedMap.newFromMap(ConvertedMap.java:74) ~[logstash-core.jar:?]
	at org.logstash.Valuefier.lambda$initConverters$13(Valuefier.java:172) ~[logstash-core.jar:?]
	at org.logstash.Valuefier.convert(Valuefier.java:94) ~[logstash-core.jar:?]
	at org.logstash.ConvertedMap.newFromMap(ConvertedMap.java:74) ~[logstash-core.jar:?]
	at org.logstash.ext.JrubyEventExtLibrary$RubyEvent.initializeFallback(JrubyEventExtLibrary.java:333) ~[logstash-core.jar:?]
	at org.logstash.ext.JrubyEventExtLibrary$RubyEvent.ruby_initialize(JrubyEventExtLibrary.java:102) ~[logstash-core.jar:?]
	at org.logstash.ext.JrubyEventExtLibrary$RubyEvent$INVOKER$i$0$1$ruby_initialize.call(JrubyEventExtLibrary$RubyEvent$INVOKER$i$0$1$ruby_initialize.gen) ~[jruby.jar:?]
	at org.jruby.internal.runtime.methods.JavaMethod$JavaMethodN.call(JavaMethod.java:815) ~[jruby.jar:?]
	at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:200) ~[jruby.jar:?]
	at org.jruby.internal.runtime.methods.JavaMethod$JavaMethodN.call(JavaMethod.java:833) ~[jruby.jar:?]
	at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:85) ~[jruby.jar:?]
	at org.jruby.RubyClass.newInstance(RubyClass.java:939) ~[jruby.jar:?]
	at org.jruby.RubyClass$INVOKER$i$newInstance.call(RubyClass$INVOKER$i$newInstance.gen) ~[jruby.jar:?]
	at org.jruby.ir.targets.InvokeSite.invoke(InvokeSite.java:207) ~[jruby.jar:?]
	at usr.share.logstash.logstash_minus_core.lib.logstash.plugins.event_factory_support.RUBY$method$new_event$0(/usr/share/logstash/logstash-core/lib/logstash/plugins/event_factory_support.rb:40) ~[?:?]
	at org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:80) ~[jruby.jar:?]
	at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:70) ~[jruby.jar:?]
	at org.jruby.ir.targets.InvokeSite.invoke(InvokeSite.java:207) ~[jruby.jar:?]
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_beats_minus_6_dot_3_dot_1_minus_java.lib.logstash.inputs.beats.message_listener.RUBY$method$onNewMessage$0(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-beats-6.3.1-java/lib/logstash/inputs/beats/message_listener.rb:45) ~[?:?]
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_beats_minus_6_dot_3_dot_1_minus_java.lib.logstash.inputs.beats.message_listener.RUBY$method$onNewMessage$0$__VARARGS__(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-beats-6.3.1-java/lib/logstash/inputs/beats/message_listener.rb:33) ~[?:?]
	at org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:80) ~[jruby.jar:?]
	at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:70) ~[jruby.jar:?]
	at org.jruby.gen.LogStash$$Inputs$$Beats$$MessageListener_1681600145.onNewMessage(org/jruby/gen/LogStash$$Inputs$$Beats$$MessageListener_1681600145.gen:13) ~[?:?]
	at org.logstash.beats.BeatsHandler.channelRead0(BeatsHandler.java:52) ~[logstash-input-beats-6.3.1.jar:?]
	at org.logstash.beats.BeatsHandler.channelRead0(BeatsHandler.java:12) ~[logstash-input-beats-6.3.1.jar:?]
	at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:99) ~[netty-all-4.1.65.Final.jar:4.1.65.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) ~[netty-all-4.1.65.Final.jar:4.1.65.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) ~[netty-all-4.1.65.Final.jar:4.1.65.Final]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) ~[netty-all-4.1.65.Final.jar:4.1.65.Final]
	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:324) ~[netty-all-4.1.65.Final.jar:4.1.65.Final]
	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:311) ~[netty-all-4.1.65.Final.jar:4.1.65.Final]
	at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:432) ~[netty-all-4.1.65.Final.jar:4.1.65.Final]
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:276) ~[netty-all-4.1.65.Final.jar:4.1.65.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) ~[netty-all-4.1.65.Final.jar:4.1.65.Final]
	at io.netty.channel.AbstractChannelHandlerContext.access$600(AbstractChannelHandlerContext.java:61) ~[netty-all-4.1.65.Final.jar:4.1.65.Final]
	at io.netty.channel.AbstractChannelHandlerContext$7.run(AbstractChannelHandlerContext.java:370) ~[netty-all-4.1.65.Final.jar:4.1.65.Final]
	at io.netty.util.concurrent.DefaultEventExecutor.run(DefaultEventExecutor.java:66) ~[netty-all-4.1.65.Final.jar:4.1.65.Final]
	at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989) [netty-all-4.1.65.Final.jar:4.1.65.Final]
	at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) [netty-all-4.1.65.Final.jar:4.1.65.Final]
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) [netty-all-4.1.65.Final.jar:4.1.65.Final]
	at java.lang.Thread.run(Thread.java:829) [?:?]

Logstash information:

Please include the following information:

  1. Logstash version (e.g. bin/logstash --version)

     $ ./bin/logstash --version
     Using bundled JDK: /usr/share/logstash/jdk
     logstash 8.2.2

  2. Logstash installation source (e.g. built from source, with a package manager: DEB/RPM, expanded from tar or zip archive, docker)

     image: 'docker.elastic.co/logstash/logstash:8.2.2'

  3. How is Logstash being run (e.g. as a service/service manager: systemd, upstart, etc. Via command line, docker/kubernetes)

     Run in OpenShift 4.10.14

Plugins installed: (bin/logstash-plugin list --verbose)

$ ./bin/logstash-plugin list --verbose
Using bundled JDK: /usr/share/logstash/jdk
OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
logstash-codec-avro (3.3.1)
logstash-codec-cef (6.2.5)
logstash-codec-collectd (3.1.0)
logstash-codec-dots (3.0.6)
logstash-codec-edn (3.1.0)
logstash-codec-edn_lines (3.1.0)
logstash-codec-es_bulk (3.1.0)
logstash-codec-fluent (3.4.1)
logstash-codec-graphite (3.0.6)
logstash-codec-json (3.1.0)
logstash-codec-json_lines (3.1.0)
logstash-codec-line (3.1.1)
logstash-codec-msgpack (3.1.0)
logstash-codec-multiline (3.1.1)
logstash-codec-netflow (4.2.2)
logstash-codec-plain (3.1.0)
logstash-codec-rubydebug (3.1.0)
logstash-filter-aggregate (2.10.0)
logstash-filter-anonymize (3.0.6)
logstash-filter-cidr (3.1.3)
logstash-filter-clone (4.2.0)
logstash-filter-csv (3.1.1)
logstash-filter-date (3.1.14)
logstash-filter-de_dot (1.0.4)
logstash-filter-dissect (1.2.5)
logstash-filter-dns (3.1.5)
logstash-filter-drop (3.0.5)
logstash-filter-elasticsearch (3.11.1)
logstash-filter-fingerprint (3.3.2)
logstash-filter-geoip (7.2.12)
logstash-filter-grok (4.4.2)
logstash-filter-http (1.4.1)
logstash-filter-json (3.2.0)
logstash-filter-kv (4.7.0)
logstash-filter-memcached (1.1.0)
logstash-filter-metrics (4.0.7)
logstash-filter-mutate (3.5.6)
logstash-filter-prune (3.0.4)
logstash-filter-ruby (3.1.8)
logstash-filter-sleep (3.0.7)
logstash-filter-split (3.1.8)
logstash-filter-syslog_pri (3.1.1)
logstash-filter-throttle (4.0.4)
logstash-filter-translate (3.3.0)
logstash-filter-truncate (1.0.5)
logstash-filter-urldecode (3.0.6)
logstash-filter-useragent (3.3.3)
logstash-filter-uuid (3.0.5)
logstash-filter-xml (4.1.3)
logstash-input-azure_event_hubs (1.4.3)
logstash-input-beats (6.3.1)
└── logstash-input-elastic_agent (alias)
logstash-input-couchdb_changes (3.1.6)
logstash-input-dead_letter_queue (1.1.11)
logstash-input-elasticsearch (4.12.3)
logstash-input-exec (3.4.0)
logstash-input-file (4.4.2)
logstash-input-ganglia (3.1.4)
logstash-input-gelf (3.3.1)
logstash-input-generator (3.1.0)
logstash-input-graphite (3.0.6)
logstash-input-heartbeat (3.1.1)
logstash-input-http (3.5.1)
logstash-input-http_poller (5.3.0)
logstash-input-imap (3.2.0)
logstash-input-jms (3.2.1)
logstash-input-pipe (3.1.0)
logstash-input-redis (3.7.0)
logstash-input-s3 (3.8.3)
logstash-input-snmp (1.3.1)
logstash-input-snmptrap (3.1.0)
logstash-input-sqs (3.3.0)
logstash-input-stdin (3.4.0)
logstash-input-syslog (3.6.0)
logstash-input-tcp (6.2.7)
logstash-input-twitter (4.1.0)
logstash-input-udp (3.5.0)
logstash-input-unix (3.1.1)
logstash-integration-elastic_enterprise_search (2.2.1)
 ├── logstash-output-elastic_app_search
 └── logstash-output-elastic_workplace_search
logstash-integration-jdbc (5.2.5)
 ├── logstash-input-jdbc
 ├── logstash-filter-jdbc_streaming
 └── logstash-filter-jdbc_static
logstash-integration-kafka (10.10.0)
 ├── logstash-input-kafka
 └── logstash-output-kafka
logstash-integration-rabbitmq (7.3.0)
 ├── logstash-input-rabbitmq
 └── logstash-output-rabbitmq
logstash-output-cloudwatch (3.0.10)
logstash-output-csv (3.0.8)
logstash-output-elasticsearch (11.4.1)
logstash-output-email (4.1.1)
logstash-output-file (4.3.0)
logstash-output-graphite (3.1.6)
logstash-output-http (5.5.0)
logstash-output-lumberjack (3.1.9)
logstash-output-nagios (3.0.6)
logstash-output-null (3.0.5)
logstash-output-pipe (3.0.6)
logstash-output-redis (5.0.0)
logstash-output-s3 (4.3.5)
logstash-output-sns (4.0.8)
logstash-output-sqs (6.0.0)
logstash-output-stdout (3.1.4)
logstash-output-tcp (6.0.2)
logstash-output-udp (3.2.0)
logstash-output-webhdfs (3.0.6)
logstash-patterns-core (4.3.3)

JVM (e.g. java -version):

$ /usr/share/logstash/jdk/bin/java -version
openjdk version "11.0.14.1" 2022-02-08
OpenJDK Runtime Environment Temurin-11.0.14.1+1 (build 11.0.14.1+1)
OpenJDK 64-Bit Server VM Temurin-11.0.14.1+1 (build 11.0.14.1+1, mixed mode)

The APM server was installed via the Elastic-provided Elasticsearch (ECK) Operator 2.1.0.

Steps to reproduce:
Elasticsearch

apiVersion: elasticsearch.k8s.elastic.co/v1
kind: Elasticsearch
metadata:
  name: elasticsearch
spec:
  auth: {}
  http:
    service:
      metadata: {}
      spec: {}
    tls:
      certificate: {}
  monitoring:
    logs: {}
    metrics: {}
  nodeSets:
    - config:
        node.roles:
          - master
          - data
          - remote_cluster_client
      count: 3
      name: master-data
      podTemplate:
        metadata:
          creationTimestamp: null
        spec:
          containers:
            - name: elasticsearch
              resources:
                limits:
                  cpu: '2'
                  memory: 8Gi
                requests:
                  cpu: '2'
                  memory: 8Gi
      volumeClaimTemplates:
        - metadata:
            name: elasticsearch-data
          spec:
            accessModes:
              - ReadWriteOnce
            resources:
              requests:
                storage: 800Gi
          status: {}
    - config:
        node.roles:
          - remote_cluster_client
          - ingest
          - transform
      count: 2
      name: coordinator
      podTemplate:
        metadata:
          creationTimestamp: null
        spec:
          containers:
            - name: elasticsearch
              resources:
                limits:
                  cpu: '2'
                  memory: 8Gi
                requests:
                  cpu: '2'
                  memory: 8Gi
      volumeClaimTemplates:
        - metadata:
            name: elasticsearch-data
          spec:
            accessModes:
              - ReadWriteOnce
            resources:
              requests:
                storage: 50Gi
          status: {}
  transport:
    service:
      metadata: {}
      spec: {}
    tls:
      certificate: {}
  updateStrategy:
    changeBudget: {}
  version: 8.2.0

APM-server

apiVersion: apm.k8s.elastic.co/v1
kind: ApmServer
metadata:
  name: apmserver-logstash
spec:
  config:
    apm-server.aggregation.transactions.max_groups: 10000
    queue.mem.flush.timeout: 1s
    apm-server.rum.enabled: true
    max_procs: 1
    apm-server.agent.config.cache.expiration: 30s
    apm-server.aggregation.transactions.hdrhistogram_significant_figures: 2
    output.elasticsearch.enabled: false
    apm-server.auth.api_key.enabled: false
    apm-server.auth.anonymous.allow_agent:
      - rum-js
      - js-base
      - java
      - dotnet
      - php
      - opentelemetry/cpp
      - python
    apm-server.write_timeout: 100s
    queue.mem.flush.min_events: 0
    apm-server.auth.anonymous.rate_limit.ip_limit: 10000
    secret_token: '${SECRET_TOKEN}'
    apm-server.ssl.supported_protocols:
      - TLSv1.3
      - TLSv1.2
      - TLSv1.1
      - TLSv1.0
    queue.mem.events: 20000
    apm-server.auth.anonymous.rate_limit.event_limit: 20000
    apm-server.aggregation.transactions.enabled: true
    apm-server.idle_timeout: 90s
    apm-server.shutdown_timeout: 5s
    apm-server.capture_personal_data: true
    apm-server.read_timeout: 100s
    logging.level: warning
    apm-server.kibana.ssl.certificate_authorities:
      - /usr/share/apm-server/config/kibana-certs/tls.crt
    apm-server.aggregation.transactions.interval: 1m
    apm-server.auth.anonymous.enabled: true
    apm-server.rum.allow_origins:
      - '*'
    apm-server.kibana.username: elastic
    apm-server.max_header_size: 485760
    apm-server.kibana.password: '${ELASTIC_PASS}'
    apm-server.auth.api_key.limit: 100
    apm-server.max_event_size: 1457600
    output.logstash:
      hosts:
        - 'logstash-headless:5044'
    apm-server.max_connections: 0
  count: 1
  elasticsearchRef:
    name: elasticsearch
  http:
    service:
      metadata:
        annotations:
          service.beta.openshift.io/serving-cert-secret-name: apmserver-logstash-openshift-tls
      spec: {}
    tls:
      certificate:
        secretName: apmserver-openshift-tls
  kibanaRef:
    name: kibana
  podTemplate:
    metadata:
      creationTimestamp: null
    spec:
      containers:
        - env:
            - name: ELASTIC_PASS
              valueFrom:
                secretKeyRef:
                  key: elastic
                  name: elasticsearch-es-elastic-user
          name: apm-server
          resources:
            limits:
              cpu: '1'
              memory: 2Gi
            requests:
              cpu: 100m
              memory: 100Mi
  version: 8.2.0

Logstash

kind: StatefulSet
apiVersion: apps/v1
metadata:
  name: logstash
  labels:
    app.kubernetes.io/component: logstash
    app.kubernetes.io/name: logstash
    app.kubernetes.io/part-of: elastic-stack
spec:
  replicas: 1
  selector:
    matchLabels:
      app.kubernetes.io/name: logstash
  template:
    metadata:
      name: logstash
      creationTimestamp: null
      labels:
        app.kubernetes.io/component: logstash
        app.kubernetes.io/name: logstash
        app.kubernetes.io/part-of: elastic-stack
    spec:
      restartPolicy: Always
      serviceAccountName: logstash
      serviceAccount: logstash
      schedulerName: default-scheduler
      enableServiceLinks: true
      terminationGracePeriodSeconds: 30
      securityContext:
        runAsUser: 1000
        runAsGroup: 1000
        fsGroup: 1000
      containers:
        - resources:
            limits:
              cpu: '2'
              memory: 2Gi
            requests:
              cpu: '2'
              memory: 2Gi
          readinessProbe:
            httpGet:
              path: /
              port: http
              scheme: HTTP
            initialDelaySeconds: 60
            timeoutSeconds: 5
            periodSeconds: 10
            successThreshold: 3
            failureThreshold: 3
          terminationMessagePath: /dev/termination-log
          name: logstash
          livenessProbe:
            httpGet:
              path: /
              port: http
              scheme: HTTP
            initialDelaySeconds: 300
            timeoutSeconds: 5
            periodSeconds: 10
            successThreshold: 1
            failureThreshold: 3
          env:
            - name: MONITORING_ENABLED
              value: 'False'
            - name: LS_JAVA_OPTS
              value: '-Xmx1g -Xms1g'
            - name: CHECKSUM
              value: '16'
          securityContext:
            capabilities:
              drop:
                - ALL
          ports:
            - containerPort: 5044
              protocol: TCP
            - name: http
              containerPort: 9600
              protocol: TCP
          imagePullPolicy: IfNotPresent
          volumeMounts:
            - name: logstash-data
              mountPath: /usr/share/logstash/data
            - name: logstash-configs
              mountPath: /usr/share/logstash/config/logstash.yml
              subPath: logstash.yml
            - name: logstash-pipeline
              mountPath: /usr/share/logstash/pipeline
          terminationMessagePolicy: File
          image: 'docker.elastic.co/logstash/logstash:8.2.2'
      volumes:
        - name: logstash-configs
          configMap:
            name: logstash-configmap
            items:
              - key: logstash.yml
                path: logstash.yml
        - name: logstash-pipeline
          configMap:
            name: logstash-configmap-pipeline
      dnsPolicy: ClusterFirst
      tolerations:
        - key: node-role.kubernetes.io/logging
          operator: Exists
          effect: NoSchedule
  volumeClaimTemplates:
    - kind: PersistentVolumeClaim
      apiVersion: v1
      metadata:
        name: logstash-data
        creationTimestamp: null
      spec:
        accessModes:
          - ReadWriteOnce
        resources:
          requests:
            storage: 50Gi
        volumeMode: Filesystem
      status:
        phase: Pending
  serviceName: logstash-headless
  podManagementPolicy: Parallel
  updateStrategy:
    type: RollingUpdate
  revisionHistoryLimit: 10
---
kind: ConfigMap
apiVersion: v1
metadata:
  name: logstash-configmap
data:
  logstash.yml: |-
    http.host: 0.0.0.0
    path.config: /usr/share/logstash/pipeline
    xpack.monitoring.enabled: true
    xpack.monitoring.elasticsearch.hosts: ['https://elasticsearch-es-http:9200']
    xpack.monitoring.elasticsearch.username: "elastic"
    xpack.monitoring.elasticsearch.password: "****************"
    xpack.monitoring.elasticsearch.ssl.verification_mode: none
    pipeline.id: main
    pipeline.ecs_compatibility: v8
---
kind: ConfigMap
apiVersion: v1
metadata:
  name: logstash-configmap-pipeline
data:
  logstash.conf: |-
    input {
      beats {
        id => "apm"
        ecs_compatibility => "v8"
        host => "0.0.0.0"
        include_codec_tag => false
        port => 5044
        ssl => false
      }
    }

    output {
        elasticsearch {
          ssl_certificate_verification => "false"
          data_stream => "true"
          user => "elastic"
          password => "******"
          hosts => "https://elasticsearch-es-http:9200"
        }
    }

Test Petclinic App

kind: Deployment
apiVersion: apps/v1
metadata:
  name: petclinic
  labels:
    app.kubernetes.io/component: test-application
    app.kubernetes.io/name: petclinic
    app.kubernetes.io/part-of: elastic-stack
spec:
  replicas: 1
  selector:
    matchLabels:
      app.kubernetes.io/component: test-application
      app.kubernetes.io/name: petclinic
      app.kubernetes.io/part-of: elastic-stack
  template:
    metadata:
      creationTimestamp: null
      labels:
        app.kubernetes.io/component: test-application
        app.kubernetes.io/name: petclinic
        app.kubernetes.io/part-of: elastic-stack
    spec:
      volumes:
        - name: elastic-apm-agent
          emptyDir: {}
      initContainers:
        - name: elastic-java-agent
          image: 'docker.elastic.co/observability/apm-agent-java:1.31.0'
          command:
            - cp
            - '-v'
            - /usr/agent/elastic-apm-agent.jar
            - /elastic/apm/agent
          resources: {}
          volumeMounts:
            - name: elastic-apm-agent
              mountPath: /elastic/apm/agent
          terminationMessagePath: /dev/termination-log
          terminationMessagePolicy: File
          imagePullPolicy: IfNotPresent
      containers:
        - resources: {}
          terminationMessagePath: /dev/termination-log
          name: petclinic
          env:
            - name: ELASTIC_APM_SERVER_URL
              value: 'https://apmserver-logstash-apm-http:8200'
            - name: ELASTIC_APM_SERVICE_NAME
              value: logging-test-app
            - name: ELASTIC_APM_APPLICATION_PACKAGES
              value: org.springframework.samples.petclinic
            - name: ELASTIC_APM_ENVIRONMENT
              value: dev
            - name: JAVA_TOOL_OPTIONS
              value: '-javaagent:/elastic/apm/agent/elastic-apm-agent.jar'
            - name: KUBERNETES_NAMESPACE
              valueFrom:
                fieldRef:
                  apiVersion: v1
                  fieldPath: metadata.namespace
            - name: ELASTIC_APM_VERIFY_SERVER_CERT
              value: 'false'
            - name: ELASTIC_APM_GLOBAL_LABELS
              value: '[ project=testtesttest'
          ports:
            - name: petclinic-http
              containerPort: 8080
              protocol: TCP
          imagePullPolicy: IfNotPresent
          volumeMounts:
            - name: elastic-apm-agent
              mountPath: /elastic/apm/agent
          terminationMessagePolicy: File
          image: 'arey/springboot-petclinic:latest'
      restartPolicy: Always
      terminationGracePeriodSeconds: 30
      dnsPolicy: ClusterFirst
      securityContext: {}
      schedulerName: default-scheduler
@b2ronn b2ronn changed the title The input beats plugin crashes if the document key has a special character The input beats plugin crashes if the document field has a special character Jun 21, 2022
yaauie (Member) commented Jun 22, 2022

A fix for this is included in the in-flight release of 8.3.0 -> #14044

yaauie (Member) commented Jun 22, 2022

The source of your problem appears to be in the Test Petclinic App definition, which configures APM to set a field with key [ project. Prior to Logstash 8.3.0, this field cannot be created and results in the crash you experienced. After Logstash 8.3.0, the field is created but cannot be referenced by the pipeline unless you opt into one of the field reference escape styles provided by #14044.
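For illustration, the percent escape style replaces the two characters that are structural in a field reference, `[` and `]`, with their percent-encoded forms, which makes a literal key such as `[ project` addressable. A minimal sketch of that substitution (my own helper for illustration, not Logstash internals):

```ruby
# Percent-escape '[' and ']', the two characters that are structural in a
# Logstash field reference. Illustrative helper only, not Logstash code.
def escape_field_key(key)
  key.gsub('[', '%5B').gsub(']', '%5D')
end

puts escape_field_key('[ project')  # => %5B project
```

With that escape style enabled, the field should then be referenceable as `[labels][%5B project]` in pipeline configuration (my reading of #14044; verify against its documentation).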

              name: ELASTIC_APM_GLOBAL_LABELS
              value: '[ project=testtesttest'
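If the leading `[ ` in that label is unintentional, dropping it from the env entry avoids the invalid field key entirely; a sketch of the corrected value (assuming `testtesttest` was the intended label):

```yaml
- name: ELASTIC_APM_GLOBAL_LABELS
  value: 'project=testtesttest'  # no leading "[ ", so the key parses as a valid field name
```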

bczifra (Member) commented Jul 6, 2023

Note that the source IP of the host submitting the faulty data to Logstash can be seen in the INFO message (172.23.56.22 in this case) that precedes the WARN message:

[2022-06-21T22:06:27,329][INFO ][org.logstash.beats.BeatsHandler][main][apm] [local: 172.21.65.17:5044, remote: 172.23.56.22:43272] Handling exception: org.logstash.FieldReference$IllegalSyntaxException: Invalid FieldReference: `[ project` (caused by: org.logstash.FieldReference$IllegalSyntaxException: Invalid FieldReference: `[ project`)

If the log level is set to warn or higher, such as by setting log.level: "warn" in logstash.yml, this message won't be logged.
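For example, the remote peer can be pulled out of such a line with a small script (the log line below is copied verbatim from this issue):

```ruby
# Extract the remote peer IP from a BeatsHandler "Handling exception" log line.
line = '[2022-06-21T22:06:27,329][INFO ][org.logstash.beats.BeatsHandler][main][apm] ' \
       '[local: 172.21.65.17:5044, remote: 172.23.56.22:43272] Handling exception: ...'

ip = line[/remote: (\d+(?:\.\d+){3})/, 1]
puts ip  # => 172.23.56.22
```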
