
[Bug]: Inconsistent logging format #2076

Open
havard024 opened this issue Sep 24, 2022 · 1 comment
Labels
bug Something isn't working

Comments

@havard024
What happened?

The jaeger operator and jaeger logs are interpreted as errors by our centralized logging system.

Below are four different log lines from the jaeger operator; I've also attached a screenshot from our centralized logging system showing the same.

I0924 08:19:39.985099       1 leaderelection.go:248] attempting to acquire leader lease observability/31e04290.jaegertracing.io...
2022/09/24 08:19:39 <nil>
1.664007579984811e+09	INFO	Starting server	{"kind": "health probe", "addr": "[::]:8081"}
1.6640075835232968e+09	INFO	Stopping and waiting for non leader election runnables

Below are three different log lines from the jaeger deployment; I've also attached a screenshot from our centralized logging system showing the same.

badger 2022/09/24 08:11:41 INFO: Set nextTxnTs to 0
2022/09/24 08:11:41 maxprocs: Leaving GOMAXPROCS=2: CPU quota undefined
{"level":"warn","ts":1664009833.0241213,"caller":"channelz/funcs.go:342","msg":"[core][Channel #10 SubChannel #11] grpc: addrConn.createTransport failed to connect to {\n  \"Addr\": \":16685\",\n  \"ServerName\": \"localhost:16685\",\n  \"Attributes\": null,\n  \"BalancerAttributes\": null,\n  \"Type\": 0,\n  \"Metadata\": null\n}. Err: connection error: desc = \"transport: Error while dialing dial tcp :16685: connect: connection refused\"","system":"grpc","grpc_log":true}

I also see these lines from the all-in-one image, though I was unable to figure out which container they come from:

Level 6 [ ]: NumTables: 83. Size: 467 MiB of 467 MiB. Score: 0.00->0.00 StaleData: 0 B Target FileSize: 8.0 MiB
{"msg":"Shutdown complete","caller":"flags/service.go:164","ts":1664007095.592402,"level":"info"}

Our setup:

  • Our cluster is running on EKS with kubectl version 1.23
  • We are sending our logs to Datadog

Here's another issue that looks pretty similar: #1817

Steps to reproduce

Install jaeger operator 1.37 with the supplied Jaeger manifest
Check the jaeger operator logs and the jaeger deployment logs

Expected behavior

  • Seeing the same format of logs on every line
  • Being able to change the log level globally
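For the operator side, controller-runtime managers normally register a standard set of zap flags on the binary; assuming the 1.37 operator exposes them (worth verifying with `--help` on the operator container), the timestamp encoding and log level could in principle be normalized via the Deployment args:

```yaml
# Hypothetical snippet for the jaeger-operator Deployment. The flag names are
# controller-runtime's standard zap flags; whether this operator build exposes
# them is an assumption that should be checked first.
containers:
  - name: jaeger-operator
    args:
      - --zap-encoder=json          # one consistent structured format
      - --zap-time-encoding=iso8601 # instead of float epoch seconds
      - --zap-log-level=warn        # global level for the zap-backed logger
```

Even if this works, it would only affect the zap-backed lines; the klog (`I0924 ...`) and standard-library (`2022/09/24 ... <nil>`) lines come from other loggers.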

Relevant log output

jaeger operator logs
I0924 08:19:39.271244       1 request.go:601] Waited for 1.021218758s due to client-side throttling, not priority and fairness, request: GET:https://172.20.0.1:443/apis/snapshot.storage.k8s.io/v1?timeout=32s
1.6640075799272454e+09  INFO    controller-runtime.metrics      Metrics server is starting to listen    {"addr": "0.0.0.0:8383"}
1.6640075799818969e+09  INFO    controller-runtime.builder      Registering a mutating webhook  {"GVK": "jaegertracing.io/v1, Kind=Jaeger", "path": "/mutate-jaegertracing-io-v1-jaeger"}
1.6640075799822905e+09  INFO    controller-runtime.webhook      Registering webhook     {"path": "/mutate-jaegertracing-io-v1-jaeger"}
1.6640075799825728e+09  INFO    controller-runtime.builder      Registering a validating webhook        {"GVK": "jaegertracing.io/v1, Kind=Jaeger", "path": "/validate-jaegertracing-io-v1-jaeger"}
1.6640075799827461e+09  INFO    controller-runtime.webhook      Registering webhook     {"path": "/validate-jaegertracing-io-v1-jaeger"}
1.6640075799829955e+09  INFO    controller-runtime.webhook      Registering webhook     {"path": "/mutate-v1-deployment"}
2022/09/24 08:19:39 <nil>
1.664007579983556e+09   INFO    controller-runtime.webhook.webhooks     Starting webhook server
1.6640075799839525e+09  INFO    controller-runtime.certwatcher  Updated current TLS certificate
1.6640075799842384e+09  INFO    controller-runtime.webhook      Serving webhook server  {"host": "", "port": 9443}
1.6640075799844604e+09  INFO    controller-runtime.certwatcher  Starting certificate watcher
1.664007579984675e+09   INFO    Starting server {"path": "/metrics", "kind": "metrics", "addr": "[::]:8383"}
1.664007579984811e+09   INFO    Starting server {"kind": "health probe", "addr": "[::]:8081"}
I0924 08:19:39.985099       1 leaderelection.go:248] attempting to acquire leader lease observability/31e04290.jaegertracing.io...
I0924 08:22:13.015968       1 leaderelection.go:258] successfully acquired lease observability/31e04290.jaegertracing.io
1.6640077330196195e+09  DEBUG   events  Normal  {"object": {"kind":"Lease","namespace":"observability","name":"31e04290.jaegertracing.io","uid":"f6b72918-d5c6-4e3d-ad58-2700d2c80250","apiVersion":"coordination.k8s.io/v1","resourceVersion":"16741521"}, "reason": "LeaderElection", "message": "jaeger-operator-6b6587468-jxzkp_3d3de6ed-fa93-4718-8878-d53e9a72f1b4 became leader"}
1.6640077330233665e+09  INFO    Starting EventSource    {"controller": "jaeger", "controllerGroup": "jaegertracing.io", "controllerKind": "Jaeger", "source": "kind source: *v1.Jaeger"}
1.6640077330234199e+09  INFO    Starting EventSource    {"controller": "jaeger", "controllerGroup": "jaegertracing.io", "controllerKind": "Jaeger", "source": "kind source: *v1.Jaeger"}
1.6640077330234263e+09  INFO    Starting Controller     {"controller": "jaeger", "controllerGroup": "jaegertracing.io", "controllerKind": "Jaeger"}
1.664007733023863e+09   INFO    Starting EventSource    {"controller": "namespace", "controllerGroup": "", "controllerKind": "Namespace", "source": "kind source: *v1.Namespace"}
1.664007733023894e+09   INFO    Starting Controller     {"controller": "namespace", "controllerGroup": "", "controllerKind": "Namespace"}
1.6640077331251454e+09  INFO    Starting workers        {"controller": "namespace", "controllerGroup": "", "controllerKind": "Namespace", "worker count": 1}
1.6640077331252797e+09  INFO    Starting workers        {"controller": "jaeger", "controllerGroup": "jaegertracing.io", "controllerKind": "Jaeger", "worker count": 1}
1.6640077333197255e+09  DEBUG   controller-runtime.webhook.webhooks     received request        {"webhook": "/mutate-v1-deployment", "UID": "9a2c2417-61f0-4af4-a997-a58dcdb78719", "kind": "apps/v1, Kind=Deployment", "resource": {"group":"apps","version":"v1","resource":"deployments"}}
1.6640077333219082e+09  DEBUG   controller-runtime.webhook.webhooks     wrote response  {"webhook": "/mutate-v1-deployment", "code": 200, "reason": "is jaeger deployment, we do not touch it", "UID": "9a2c2417-61f0-4af4-a997-a58dcdb78719", "allowed": true}
1.6640077334424644e+09  INFO    KubeAPIWarningLogger    autoscaling/v2beta2 HorizontalPodAutoscaler is deprecated in v1.23+, unavailable in v1.26+; use autoscaling/v2 HorizontalPodAutoscaler
1.6640078141597042e+09  DEBUG   controller-runtime.webhook.webhooks     received request        {"webhook": "/mutate-v1-deployment", "UID": "a5521e8c-8cef-49a5-978d-c567bae10bf9", "kind": "apps/v1, Kind=Deployment", "resource": {"group":"apps","version":"v1","resource":"deployments"}}
1.6640078141605399e+09  DEBUG   controller-runtime.webhook.webhooks     wrote response  {"webhook": "/mutate-v1-deployment", "code": 200, "reason": "no action needed", "UID": "a5521e8c-8cef-49a5-978d-c567bae10bf9", "allowed": true}
1.6640078141816647e+09  DEBUG   controller-runtime.webhook.webhooks     received request        {"webhook": "/mutate-v1-deployment", "UID": "5ef1b29d-6b23-4f21-8ef8-ed829d06f4ed", "kind": "apps/v1, Kind=Deployment", "resource": {"group":"apps","version":"v1","resource":"deployments"}}
1.6640078141825016e+09  DEBUG   controller-runtime.webhook.webhooks     wrote response  {"webhook": "/mutate-v1-deployment", "code": 200, "reason": "no action needed", "UID": "5ef1b29d-6b23-4f21-8ef8-ed829d06f4ed", "allowed": true}
1.664007814204585e+09   DEBUG   controller-runtime.webhook.webhooks     received request        {"webhook": "/mutate-v1-deployment", "UID": "ea56d1c5-da93-428b-823e-e5fbe736752d", "kind": "apps/v1, Kind=Deployment", "resource": {"group":"apps","version":"v1","resource":"deployments"}}

Screenshot

jaeger
[screenshot: jaeger deployment logs shown as errors in the centralized logging UI, 2022-09-24 10:29:00]

jaeger operator
[screenshot: jaeger operator logs shown as errors in the centralized logging UI, 2022-09-24 10:29:26]

Additional context

No response

Jaeger backend version

v1.37.0

SDK

No response

Pipeline

No response

Storage backend

No response

Operating system

linux

Deployment model

kubernetes

Deployment configs

We have deployed the following manifest: https://github.com/jaegertracing/jaeger-operator/releases/download/v1.37.0/jaeger-operator.yaml
Below is our Jaeger manifest:

apiVersion: jaegertracing.io/v1
kind: Jaeger
metadata:
  name: simplest
  namespace: istio-system
spec:
  allInOne:
    options:
      log-level: warn
      query:
        base-path: /jaeger
      collector:
        zipkin:
          host-port: 9411
  ingress:
    enabled: false
  storage:
    type: badger
    options:
      memory:
        max-traces: "50000"
      badger:
        ephemeral: false
        directory-key: "/badger/key"
        directory-value: "/badger/data"
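For reference, the `allInOne.options` above are rendered into CLI flags on the all-in-one container, so the manifest is roughly equivalent to starting the binary with (flag mapping assumed from the option keys; the binary path is illustrative):

```shell
# Rough equivalent of the allInOne options above; assumption: option keys map
# 1:1 to dotted CLI flags, as with --query.base-path and --log-level.
all-in-one \
  --log-level=warn \
  --query.base-path=/jaeger \
  --collector.zipkin.host-port=9411
```

Note that `log-level: warn` here only governs jaeger's own zap logger; the `badger ... INFO:` lines in the output above are emitted by badger's internal logger, which may not honor it.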
@havard024 havard024 added the bug Something isn't working label Sep 24, 2022
@iblancasa
Collaborator

I checked, and these are log messages from the controller-runtime package. We're probably not setting up one of the loggers properly. I'll dig into it more deeply :)

@iblancasa iblancasa removed their assignment Sep 16, 2024