Frontend protobuf encoding breaks X-Loki-Response-Encoding-Flags: categorize-labels header #14053

Closed
trevorwhitney opened this issue Sep 4, 2024 · 5 comments · Fixed by #14065

Comments

@trevorwhitney
Collaborator

trevorwhitney commented Sep 4, 2024

Describe the bug
A request to /loki/api/v1/query_range with the header X-Loki-Response-Encoding-Flags: categorize-labels returns corrupted results when frontend.encoding is set to protobuf.

There are also reports of this being broken when using the default json encoding, though I was not able to reproduce that locally using 4f962ef7af250fc347dbed15583787d0238f6e9f.
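For reference, based on the frontend.encoding = protobuf setting mentioned above, the relevant excerpt of ./cmd/loki/loki-local-config.yaml presumably looks like the sketch below (assumed layout; the actual file contents are not included in this report):

    # Assumed excerpt of ./cmd/loki/loki-local-config.yaml (not taken from the report itself).
    # Switches the query-frontend response encoding from the default json to protobuf.
    frontend:
      encoding: protobuf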

To Reproduce
Steps to reproduce the behavior:

  1. Start Loki (main @ 4f962ef7af250fc347dbed15583787d0238f6e9f) using the config in ./cmd/loki/loki-local-config.yaml
  2. Send logs to Loki using the generate logs task in explore-logs, i.e. yarn generate-logs
  3. Query for {cluster="eu-west-1", namespace="tempo-dev", service_name="tempo-distributor", level="warn"}; results are normal.
  4. Query for {cluster="eu-west-1", namespace="tempo-dev", service_name="tempo-distributor", level="warn"} with X-Loki-Response-Encoding-Flags: categorize-labels (a sketch of this request is shown after these steps); garbled binary strings appear in the structuredMetadata of the categorized response, for example
          [
            "1725480391800168000",
            "level=debug ts=2024-09-04T14:06:31.800168-06:00 caller=broadcast.go:48 msg=\"Invalidating forwarded broadcast\" key=collectors/compactor version=65 oldVersion=38 content=[compactor-8d1vr] oldContent=[compactor-v3n2w]",
            {
              "structuredMetadata": {
                "\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000": "\u0000\u0000\u0000\u0000\u0000"
              }
            }
          ]
  5. Change ./cmd/loki/loki-local-config.yaml to remove the config frontend.encoding = protobuf and restart Loki. This will cause the default JSON encoding to be used.
  6. Query for {cluster="eu-west-1", namespace="tempo-dev", service_name="tempo-distributor", level="warn"}; results are normal.
  7. Query for {cluster="eu-west-1", namespace="tempo-dev", service_name="tempo-distributor", level="warn"} with X-Loki-Response-Encoding-Flags: categorize-labels; results are normal, with labels correctly categorized.
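For anyone trying to reproduce this, a sketch of the request from steps 4 and 7, assuming Loki is listening on the default localhost:3100 and relying on the API's default time range (the host and any extra parameters are assumptions, not from the report):

    # Hypothetical reproduction of the query described in the steps above.
    curl -G 'http://localhost:3100/loki/api/v1/query_range' \
      -H 'X-Loki-Response-Encoding-Flags: categorize-labels' \
      --data-urlencode 'query={cluster="eu-west-1", namespace="tempo-dev", service_name="tempo-distributor", level="warn"}'

With frontend.encoding = protobuf, the structuredMetadata keys and values in the response come back as the null-byte strings shown above; with the default JSON encoding they are categorized correctly.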

Expected behavior
I would expect the results to be the same when providing the X-Loki-Response-Encoding-Flags header, regardless of whether protobuf or json encoding is used.

Environment:
Running locally, though it has been reported on multiple cloud deployments as well.


@trevorwhitney
Collaborator Author

@salvacorts I know you worked on the header, and @jeschkies and @cstyan I think you two added the encoding flag. Any ideas on this one?

@trevorwhitney
Collaborator Author

From my investigation, the response looked fine in the querier and was then corrupted in the frontend, which is what led me to suspect the frontend encoding and try toggling it.

@gtk-grafana
Contributor

Looks like this issue is a regression from #13998, and not related to the frontend encoding? I'm able to reproduce the issue on various Loki versions after #13998, independent of json or protobuf encoding.

@na--
Member

na-- commented Sep 10, 2024

I am reopening this because we haven't actually fixed the problem. We still don't know why the upgrade to grpc-go v1.66.0 causes this issue, and we can't block upgrading that dependency indefinitely...

@na--
Member

na-- commented Oct 22, 2024

This was resolved by #14216
