I am pretty sure this is still an issue, either here or somewhere else in the golang otel ecosystem. I will get a pprof profile set up, possibly tomorrow, but here is some anecdotal evidence in the meantime:
It is pretty easy to see from that graph when tracing was implemented. And yes, I have removed our tracing implementation and memory usage is back to normal.
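For the pprof setup mentioned above, the plan is just the stock net/http/pprof endpoint on a side port; a minimal sketch (the port and helper name are placeholders, not our real code):

```go
package main

import (
	"log"
	"net/http"
	_ "net/http/pprof" // registers /debug/pprof/* handlers on the default mux
)

func startPprof() {
	go func() {
		// Heap profiles can then be pulled with:
		//   go tool pprof http://localhost:6060/debug/pprof/heap
		log.Println(http.ListenAndServe("localhost:6060", nil))
	}()
}
```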
Here is a rough draft of our setup. Please let me know if I am doing anything egregiously dumb, but for the most part it's all pretty standard stuff taken from various docs:
We are wrapping otelhttp.NewHandler around the top-level muxer, so everything is traced (rough sketch below). Yes, I know this is expensive, but it shouldn't leak memory. Eventually we will change this to include/exclude/drop routes so we aren't taking in so much volume (ping routes, health checks, etc.) and to do more aggressive downsampling.
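Roughly what that wrapping looks like; the routes and address here are placeholders, not our real ones:

```go
package main

import (
	"net/http"

	"go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp"
)

func newServer() *http.Server {
	mux := http.NewServeMux()
	mux.HandleFunc("/healthz", func(w http.ResponseWriter, r *http.Request) {
		w.WriteHeader(http.StatusOK)
	})
	// ... the rest of our routes ...

	// Wrap the whole muxer so every route goes through otelhttp.
	return &http.Server{
		Addr:    ":8080",
		Handler: otelhttp.NewHandler(mux, "server"),
	}
}
```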
Here is how we are initializing our trace and metrics providers once on boot:
```go
// TracerProvider creates an OTLP exporter and configures the corresponding trace provider.
func TracerProvider(ctx context.Context, res *resource.Resource) (func(context.Context) error, error) {
	// Set up a trace exporter.
	traceExporter, err := otlptrace.New(ctx, otlptracegrpc.NewClient())
	if err != nil {
		return nil, errors.Wrap(err, "failed to create trace exporter")
	}

	// Register the trace exporter with a TracerProvider, using a batch
	// span processor to aggregate spans before export.
	tracerProvider := sdktrace.NewTracerProvider(
		sdktrace.WithSampler(sdktrace.AlwaysSample()),
		sdktrace.WithResource(res),
		sdktrace.WithBatcher(traceExporter),
	)
	otel.SetTracerProvider(tracerProvider)

	otel.SetTextMapPropagator(
		propagation.NewCompositeTextMapPropagator(
			propagation.TraceContext{},
			propagation.Baggage{},
		))

	// Shutdown will flush any remaining spans and shut down the exporter.
	return tracerProvider.Shutdown, nil
}
```
```go
// MeterProvider creates an OTLP exporter and configures the corresponding meter provider.
func MeterProvider(ctx context.Context, res *resource.Resource) (func(context.Context) error, error) {
	metricExporter, err := otlpmetricgrpc.New(ctx)
	if err != nil {
		return nil, errors.Wrap(err, "failed to create metric exporter")
	}

	meterProvider := sdkmetric.NewMeterProvider(
		sdkmetric.WithReader(sdkmetric.NewPeriodicReader(metricExporter)),
		sdkmetric.WithResource(res),
	)
	otel.SetMeterProvider(meterProvider)

	return meterProvider.Shutdown, nil
}
```
Description
Memory Leak in otel library code.
Environment
- go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp v0.56.0
Steps To Reproduce
See Comment here: #5190 (comment)
Expected behavior
Memory does not continuously increase over time.