
BatchSpanProcessor Cause a memory leak #1904

Closed
tensorchen opened this issue May 12, 2021 · 1 comment
Labels
bug Something isn't working
Comments

@tensorchen
Member

tensorchen commented May 12, 2021

Description

BatchSpanProcessor causes a memory leak.

Environment

  • OS: macOS
  • Go version: 1.16
  • opentelemetry-go version: v0.20.0
  • opentelemetry-go-contrib otelgrpc version: v0.20.0

Steps To Reproduce

  1. Change config.Init() in the opentelemetry-go-contrib otelgrpc example to the following (an OTLP/gRPC exporter, a custom resource, and a batch span processor):

	// Export over OTLP/gRPC to a local collector endpoint.
	driver := otlpgrpc.NewDriver(otlpgrpc.WithEndpoint("127.0.0.1:12520"))
	exporter, err := otlp.NewExporter(context.Background(), driver)
	if err != nil {
		log.Fatal(err)
	}
	kvs := []attribute.KeyValue{
		semconv.TelemetrySDKLanguageGo,
		semconv.TelemetrySDKNameKey.String("xxx"),
	}
	res := resource.NewWithAttributes(kvs...)
	// WithBatcher wraps the exporter in a BatchSpanProcessor.
	tp := sdktrace.NewTracerProvider(
		sdktrace.WithSampler(sdktrace.AlwaysSample()),
		sdktrace.WithBatcher(exporter),
		sdktrace.WithResource(res),
	)
	otel.SetTracerProvider(tp)
	otel.SetTextMapPropagator(propagation.NewCompositeTextMapPropagator(propagation.TraceContext{}, propagation.Baggage{}))
  2. Start the gRPC server (a minimal server sketch, wired with the otelgrpc interceptor, follows these steps).

  3. Start the gRPC client:

ghz --insecure --proto hello-service.proto --call api.HelloService.SayHello -d '{"greeting":"tensorchen"}' -z 20m 127.0.0.1:7777

  4. Result

(Two screenshots of the result were attached to the original issue.)
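
For steps 2 and 3, a minimal server wired with the otelgrpc interceptor is enough to generate the span volume. This is only a sketch, not code from the report: it assumes hello-service.proto generates a Go package api with a HelloService whose request has a greeting field and whose response has a reply field, and the example.com/hello/api import path is made up.

package main

import (
	"context"
	"log"
	"net"

	"go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc"
	"google.golang.org/grpc"

	"example.com/hello/api" // hypothetical import path for the generated stubs
)

// helloServer implements the HelloService assumed to be defined in hello-service.proto.
type helloServer struct {
	api.UnimplementedHelloServiceServer
}

func (s *helloServer) SayHello(ctx context.Context, req *api.HelloRequest) (*api.HelloResponse, error) {
	// Field names are guesses based on the ghz payload {"greeting": ...}.
	return &api.HelloResponse{Reply: "hello " + req.GetGreeting()}, nil
}

func main() {
	// Call the config.Init() from step 1 here so the global tracer provider
	// (and its BatchSpanProcessor) is in place before serving.
	lis, err := net.Listen("tcp", ":7777")
	if err != nil {
		log.Fatal(err)
	}
	srv := grpc.NewServer(
		// Each RPC gets a span, and every finished span goes to the batch processor.
		grpc.UnaryInterceptor(otelgrpc.UnaryServerInterceptor()),
	)
	api.RegisterHelloServiceServer(srv, &helloServer{})
	log.Fatal(srv.Serve(lis))
}

The ghz run in step 3 then drives constant load for 20 minutes, which is when the memory growth becomes visible.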

tensorchen added the bug label on May 12, 2021
tensorchen changed the title from "BatchSpanProcessor + Resource Cause a memory leak" to "BatchSpanProcessor Cause a memory leak" on May 12, 2021
@tianyaqu
Contributor

tianyaqu commented May 14, 2021

Duplicate of #1833 and resolved by #1860. The cause of the leak is that the batch span processor never drops the batched spans when an export fails, so the batch grows larger than the gRPC message size allows. Those spans then never get a chance to be exported successfully, and the batch grows without bound.
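
For anyone comparing against their own setup, here is a self-contained sketch of that failure mode; the type and field names are illustrative, not the SDK's real ones. The batch is only cleared when export succeeds, so once it exceeds what the receiver will accept (grpc-go's default max receive message size is 4 MiB), every later attempt fails and the buffer only grows:

package main

import (
	"context"
	"errors"
	"fmt"
)

type span struct{} // stand-in for a finished span

// spanExporter is a stand-in for the SDK's span exporter interface.
type spanExporter interface {
	ExportSpans(ctx context.Context, spans []span) error
}

// batchSpanProcessor is a stripped-down stand-in; only the batch buffer matters here.
type batchSpanProcessor struct {
	batch    []span
	exporter spanExporter
}

// exportSpans mimics the pre-fix behavior: the batch is dropped only on success.
func (bsp *batchSpanProcessor) exportSpans(ctx context.Context) error {
	if len(bsp.batch) == 0 {
		return nil
	}
	err := bsp.exporter.ExportSpans(ctx, bsp.batch)
	if err == nil {
		bsp.batch = bsp.batch[:0]
	}
	return err
}

// failingExporter simulates a batch that is already too large to send.
type failingExporter struct{}

func (failingExporter) ExportSpans(ctx context.Context, spans []span) error {
	return errors.New("rpc error: received message larger than max")
}

func main() {
	bsp := &batchSpanProcessor{exporter: failingExporter{}}
	for i := 0; i < 5; i++ {
		bsp.batch = append(bsp.batch, span{}) // new spans keep arriving
		_ = bsp.exportSpans(context.Background())
		// Prints 1, 2, 3, ... — the batch never shrinks, mirroring the
		// memory growth in the reporter's screenshots.
		fmt.Println("batch size after export attempt:", len(bsp.batch))
	}
}

Dropping the batch whether or not the export succeeded, which is the direction the referenced fix takes, caps the buffer at the configured batch size.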
