LogEmitter System.Diagnostics.Tracing extension project #3305

Closed

wants to merge 53 commits

Changes from 23 commits

Commits (53)
6d2f153
LogEmitter API.
CodeBlanch May 24, 2022
101ce19
Code review.
CodeBlanch May 25, 2022
7eb0006
Added LogRecordPool.
CodeBlanch May 26, 2022
20479f5
Nullable annotation updates.
CodeBlanch May 26, 2022
c64ca11
Merge branch 'main' into log-emitter
CodeBlanch May 26, 2022
53a6854
Cleanup.
CodeBlanch May 26, 2022
fd5a87e
Cleanup.
CodeBlanch May 26, 2022
035823b
Added reference counting into the log record pool.
CodeBlanch May 26, 2022
80cbdbb
Tweaks.
CodeBlanch May 28, 2022
744c399
Tweak.
CodeBlanch May 29, 2022
df93cf5
Test fix.
CodeBlanch May 29, 2022
1ba8afe
Merge branch 'main' into log-emitter
CodeBlanch May 29, 2022
9a0ee32
Test fix.
CodeBlanch May 29, 2022
888eb54
Rename.
CodeBlanch May 30, 2022
558e8d2
Trigger state buffering by processor inspection.
CodeBlanch May 31, 2022
4d76b87
Implement copy for in-memory log exporter.
CodeBlanch May 31, 2022
617d88d
Added GetDataRef.
CodeBlanch May 31, 2022
ca7e319
Tweaks.
CodeBlanch May 31, 2022
6fae76d
Merge branch 'main' into log-emitter
CodeBlanch May 31, 2022
a92e78e
Revert CompositeProcessor change.
CodeBlanch May 31, 2022
5e33474
Add log stress tests to solution.
CodeBlanch May 31, 2022
6b15b3b
Tweaks.
CodeBlanch May 31, 2022
2114dca
Code review.
CodeBlanch Jun 1, 2022
0d34533
Code review, example app, serilog + eventsource extensions.
CodeBlanch Jun 2, 2022
bb4293c
Rename.
CodeBlanch Jun 2, 2022
380f139
Typo.
CodeBlanch Jun 2, 2022
87597f8
New pool design + tests.
CodeBlanch Jun 6, 2022
345e1a2
Pool selection based on processor.
CodeBlanch Jun 6, 2022
ea63af4
Merge from main.
CodeBlanch Jun 7, 2022
a12d432
Update public api files.
CodeBlanch Jun 7, 2022
3b35268
Public api fix.
CodeBlanch Jun 7, 2022
57775a5
Lint and race comment.
CodeBlanch Jun 7, 2022
9ee18df
Comments in log emitter example app.
CodeBlanch Jun 7, 2022
f347e51
Switch to Volatile.Read.
CodeBlanch Jun 7, 2022
5537c73
Bump Microsoft.DotNet.ApiCompat.
CodeBlanch Jun 7, 2022
e0c2347
Typo fix.
CodeBlanch Jun 7, 2022
dd80cc9
Bump Microsoft.DotNet.ApiCompat.
CodeBlanch Jun 7, 2022
4f6b9e0
Attempting to fix ApiCompat failure.
CodeBlanch Jun 7, 2022
1dcd193
Tweak ApiCompatExcludeAttributeList path.
CodeBlanch Jun 7, 2022
226facf
Exclude NullableContextAttribute from ApiCompat.
CodeBlanch Jun 7, 2022
d3c5443
Merge from main.
CodeBlanch Jun 15, 2022
31f5a97
Merge from main.
CodeBlanch Jun 16, 2022
cb69e49
Fix merge.
CodeBlanch Jun 16, 2022
8693d93
Merge from main.
CodeBlanch Jun 17, 2022
b71006f
Updates.
CodeBlanch Jun 17, 2022
6269015
Merge from main.
CodeBlanch Jun 18, 2022
7647145
Revert OtlpLogExporter use of logRecord.GetDataRef because it is now …
CodeBlanch Jun 18, 2022
4f498af
Merge branch 'main' into log-emitter
cijothomas Jun 27, 2022
55537b5
Merge from main.
CodeBlanch Jun 30, 2022
d1a4b3b
Merge fix.
CodeBlanch Jun 30, 2022
ffda1ce
Merge from main.
CodeBlanch Jul 11, 2022
12c92ad
Merge from main.
CodeBlanch Jul 18, 2022
cf69b8c
Updates.
CodeBlanch Jul 18, 2022
8 changes: 7 additions & 1 deletion OpenTelemetry.sln
@@ -233,7 +233,9 @@ Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "OpenTelemetry.Extensions.Pr
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "OpenTelemetry.Extensions.Propagators.Tests", "test\OpenTelemetry.Extensions.Propagators.Tests\OpenTelemetry.Extensions.Propagators.Tests.csproj", "{476D804B-BFEC-4D34-814C-DFFD97109989}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "correlation", "docs\logs\correlation\correlation.csproj", "{9A07D215-90AC-4BAF-BCDB-73D74FD3A5C5}"
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "correlation", "docs\logs\correlation\correlation.csproj", "{9A07D215-90AC-4BAF-BCDB-73D74FD3A5C5}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "OpenTelemetry.Tests.Stress.Logs", "test\OpenTelemetry.Tests.Stress.Logs\OpenTelemetry.Tests.Stress.Logs.csproj", "{5FC0660F-3757-4594-806B-4375E06177A3}"
EndProject
Global
GlobalSection(SolutionConfigurationPlatforms) = preSolution
@@ -493,6 +495,10 @@ Global
{9A07D215-90AC-4BAF-BCDB-73D74FD3A5C5}.Debug|Any CPU.Build.0 = Debug|Any CPU
{9A07D215-90AC-4BAF-BCDB-73D74FD3A5C5}.Release|Any CPU.ActiveCfg = Release|Any CPU
{9A07D215-90AC-4BAF-BCDB-73D74FD3A5C5}.Release|Any CPU.Build.0 = Release|Any CPU
{5FC0660F-3757-4594-806B-4375E06177A3}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{5FC0660F-3757-4594-806B-4375E06177A3}.Debug|Any CPU.Build.0 = Debug|Any CPU
{5FC0660F-3757-4594-806B-4375E06177A3}.Release|Any CPU.ActiveCfg = Release|Any CPU
{5FC0660F-3757-4594-806B-4375E06177A3}.Release|Any CPU.Build.0 = Release|Any CPU
EndGlobalSection
GlobalSection(SolutionProperties) = preSolution
HideSolutionNode = FALSE
24 changes: 22 additions & 2 deletions examples/AspNetCore/Controllers/WeatherForecastController.cs
@@ -16,24 +16,30 @@

namespace Examples.AspNetCore.Controllers;

using System.Diagnostics;
using Microsoft.AspNetCore.Mvc;
using OpenTelemetry.Logs;

[ApiController]
[Route("[controller]")]
public class WeatherForecastController : ControllerBase
{
private static readonly string[] Summaries = new[]
{
"Freezing", "Bracing", "Chilly", "Cool", "Mild", "Warm", "Balmy", "Hot", "Sweltering", "Scorching",
"Freezing", "Bracing", "Chilly", "Cool", "Mild", "Warm", "Balmy", "Hot", "Sweltering", "Scorching",
};

private static readonly HttpClient HttpClient = new();

private readonly ILogger<WeatherForecastController> logger;
private readonly LogEmitter logEmitter;

public WeatherForecastController(ILogger<WeatherForecastController> logger)
public WeatherForecastController(
ILogger<WeatherForecastController> logger,
LogEmitter logEmitter)
{
this.logger = logger ?? throw new ArgumentNullException(nameof(logger));
this.logEmitter = logEmitter ?? throw new ArgumentNullException(nameof(logEmitter));
}

[HttpGet]
@@ -54,11 +60,25 @@ public IEnumerable<WeatherForecast> Get()
})
.ToArray();

// Log using ILogger API.
this.logger.LogInformation(
"WeatherForecasts generated {count}: {forecasts}",
forecast.Length,
forecast);

// Log using LogEmitter API.
this.logEmitter.Log(
new LogRecordData(Activity.Current)
{
CategoryName = "WeatherForecasts",
LogLevel = LogLevel.Information,
Message = "WeatherForecasts generated.",
},
new LogRecordAttributeList()
{
["count"] = forecast.Length,
});

return forecast;
}
}
Original file line number Diff line number Diff line change
@@ -42,7 +42,11 @@ public override ExportResult Export(in Batch<LogRecord> batch)
this.WriteLine($"{"LogRecord.TraceFlags:",-RightPaddingLength}{logRecord.TraceFlags}");
}

this.WriteLine($"{"LogRecord.CategoryName:",-RightPaddingLength}{logRecord.CategoryName}");
if (logRecord.CategoryName != null)
{
this.WriteLine($"{"LogRecord.CategoryName:",-RightPaddingLength}{logRecord.CategoryName}");
}

this.WriteLine($"{"LogRecord.LogLevel:",-RightPaddingLength}{logRecord.LogLevel}");

if (logRecord.FormattedMessage != null)
9 changes: 5 additions & 4 deletions src/OpenTelemetry.Exporter.InMemory/InMemoryExporter.cs
@@ -14,7 +14,6 @@
// limitations under the License.
// </copyright>

using System;
using System.Collections.Generic;

namespace OpenTelemetry.Exporter
@@ -23,19 +22,21 @@ public class InMemoryExporter<T> : BaseExporter<T>
where T : class
{
private readonly ICollection<T> exportedItems;
private readonly Func<Batch<T>, ExportResult> onExport;
private readonly ExportFunc onExport;

public InMemoryExporter(ICollection<T> exportedItems)
{
this.exportedItems = exportedItems;
this.onExport = (Batch<T> batch) => this.DefaultExport(batch);
this.onExport = (in Batch<T> batch) => this.DefaultExport(in batch);
}

internal InMemoryExporter(Func<Batch<T>, ExportResult> exportFunc)
internal InMemoryExporter(ExportFunc exportFunc)
{
this.onExport = exportFunc;
}

internal delegate ExportResult ExportFunc(in Batch<T> batch);

public override ExportResult Export(in Batch<T> batch) => this.onExport(batch);

private ExportResult DefaultExport(in Batch<T> batch)
Original file line number Diff line number Diff line change
@@ -27,7 +27,35 @@ public static OpenTelemetryLoggerOptions AddInMemoryExporter(this OpenTelemetryL
Guard.ThrowIfNull(loggerOptions);
Guard.ThrowIfNull(exportedItems);

return loggerOptions.AddProcessor(new SimpleLogRecordExportProcessor(new InMemoryExporter<LogRecord>(exportedItems)));
var logExporter = new InMemoryExporter<LogRecord>(
exportFunc: (in Batch<LogRecord> batch) => ExportLogRecord(in batch, exportedItems));

return loggerOptions.AddProcessor(new SimpleLogRecordExportProcessor(logExporter));
}

private static ExportResult ExportLogRecord(in Batch<LogRecord> batch, ICollection<LogRecord> exportedItems)
{
if (exportedItems == null)
{
return ExportResult.Failure;
}

foreach (var log in batch)
{
log.Buffer();

LogRecord copy = new()
{
Data = log.Data,
State = log.State,
StateValues = log.StateValues == null ? null : new List<KeyValuePair<string, object>>(log.StateValues),
BufferedScopes = log.BufferedScopes == null ? null : new List<object>(log.BufferedScopes),
};

exportedItems.Add(copy);
}

return ExportResult.Success;
}
}
}
Original file line number Diff line number Diff line change
@@ -145,7 +145,7 @@ private static MeterProviderBuilder AddInMemoryExporter(
configureMetricReader?.Invoke(metricReaderOptions);

var metricExporter = new InMemoryExporter<Metric>(
exportFunc: metricBatch => ExportMetricSnapshot(metricBatch, exportedItems));
exportFunc: (in Batch<Metric> metricBatch) => ExportMetricSnapshot(in metricBatch, exportedItems));

var metricReader = PeriodicExportingMetricReaderHelper.CreatePeriodicExportingMetricReader(
metricExporter,
Original file line number Diff line number Diff line change
@@ -65,30 +65,31 @@ internal static void AddBatch(
internal static OtlpLogs.LogRecord ToOtlpLog(this LogRecord logRecord)
{
OtlpLogs.LogRecord otlpLogRecord = null;
ref LogRecordData data = ref logRecord.GetDataRef();
Member commented:

Just confirming my own understanding of the improvements this PR brings... so this is more performant because accessing properties of the struct results in a call vs. callvirt, right?

Member (Author) replied:

This particular change is more to prevent a perf regression! This method used to work something like...

ldarg.1 -> call LogRecord.CategoryName.Get() -> ldfld CategoryName_Backing -> ret

(Because LogRecord is sealed it should already be a call instead of callvirt.)

Now due to the pooling changes it has to do...

ldarg.1 -> call LogRecord.CategoryName.Get() -> ldflda this.logRecordData -> call LogRecordData.CategoryName.Get() -> ldfld CategoryName_Backing -> ret

Extra call!

Using GetDataRef() it goes back to a single call like...

ldflda [address of ref] -> call LogRecordData.CategoryName.Get()-> ldfld CategoryName_Backing -> ret

Just an optimization to cut out the extra hops.

True perf really comes down to how the JIT decides to inline things. Here's a sharplab if you want to mess with it.
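
To make the property-path vs. ref-path difference concrete, here is a minimal sketch (not the PR's actual code; only the names LogRecordData, GetDataRef, and CategoryName mirror this PR, everything else is illustrative and simplified):

```csharp
#nullable enable

// Sketch of the two access paths discussed above.
public struct LogRecordData
{
    public string? CategoryName { get; set; } // auto-property with a backing field
}

public sealed class LogRecord
{
    internal LogRecordData Data; // pooled record stores its data in a struct field

    // Property path: LogRecord.CategoryName getter -> LogRecordData.CategoryName getter -> backing field.
    public string? CategoryName => this.Data.CategoryName;

    // Ref path: hand out a ref to the struct once; later reads skip the LogRecord property hop.
    public ref LogRecordData GetDataRef() => ref this.Data;
}

internal static class ExporterSketch
{
    public static void Export(LogRecord logRecord)
    {
        // One call to obtain the ref...
        ref LogRecordData data = ref logRecord.GetDataRef();

        // ...then each access is a single property-getter call on the struct.
        string? category = data.CategoryName;
        _ = category;
    }
}
```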

Member (Author) added:

So what are the real perf gains on this PR?

I will attempt to answer this 😄

Our log logic was like this:

                var record = new LogRecord(
                    provider.IncludeScopes ? this.ScopeProvider : null,
                    DateTime.UtcNow,
                    this.categoryName,
                    logLevel,
                    eventId,
                    provider.IncludeFormattedMessage ? formatter?.Invoke(state, exception) : null,
                    provider.ParseStateValues ? null : state,
                    exception,
                    provider.ParseStateValues ? this.ParseState(state) : null);

                processor.OnEnd(record);

                record.ScopeProvider = null;

And it is now like this:

                var record = LogRecordPool.Rent();

                record.ScopeProvider = provider.IncludeScopes ? this.ScopeProvider : null;
                record.State = provider.ParseStateValues ? null : state;
                record.StateValues = provider.ParseStateValues ? this.ParseState(record, state) : null;

                ref LogRecordData data = ref record.Data;

                data.TimestampBacking = DateTime.UtcNow;
                data.CategoryName = this.categoryName;
                data.LogLevel = logLevel;
                data.EventId = eventId;
                data.Message = provider.IncludeFormattedMessage ? formatter?.Invoke(state, exception) : null;
                data.Exception = exception;

                LogRecordData.SetActivityContext(ref data, Activity.Current);

                processor.OnEnd(record);

                record.ScopeProvider = null;

                // Attempt to return the LogRecord to the pool. This will no-op
                // if a batch exporter has added a reference.
                LogRecordPool.Return(record);

Should be slower, right?

Turns out, it is only slightly slower (when hitting the [ThreadStatic]):

| Method       | Mean     | Error    | StdDev   | Gen 0  | Allocated |
|------------- |---------:|---------:|---------:|-------:|----------:|
| LogUsingCtor | 38.89 ns | 0.154 ns | 0.137 ns | 0.0102 |     128 B |
| LogUsingPool | 40.37 ns | 0.054 ns | 0.048 ns |      - |         - |

But we eliminated all the memory pressure. I think this is why the stress test performs better with the pool: it only costs a few more CPU cycles to use the pool, but the GC stays asleep, freeing up the CPU to process more logs. This is largely a guess but kind of makes sense?

Now that benchmark is interesting. How is it that introducing the pool logic made it only slightly slower than just calling the ctor?

Check out this benchmark:

    public class CtorBenchmarks
    {
        [Benchmark]
        public TestSmallClass CtorSmallClass()
        {
            return new();
        }

        [Benchmark]
        public TestLargeClass CtorLargeClass()
        {
            return new();
        }

        public class TestSmallClass
        {
            public string? Prop1;
        }

        public class TestLargeClass
        {
            public string? Prop1;
            public string? Prop2;
            public string? Prop3;
            public int Prop4;
            public int Prop5;
            public int Prop6;
            public DateTime Prop7;
            public object? Prop8;
            public object? Prop9;
            public object? Prop10;
            public string? Prop11;
            public string? Prop12;
            public string? Prop13;
        }
    }
| Method         | Mean     | Error     | StdDev    | Gen 0  | Allocated |
|--------------- |---------:|----------:|----------:|-------:|----------:|
| CtorSmallClass | 1.884 ns | 0.0236 ns | 0.0209 ns | 0.0019 |      24 B |
| CtorLargeClass | 3.982 ns | 0.0298 ns | 0.0279 ns | 0.0089 |     112 B |

The number of fields on the class impacts the ctor time. LogRecord has a lot of fields, so that gives us some wiggle room to execute our pool logic before simply calling the ctor would be the faster option. Interesting, eh?
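
For reference, a minimal sketch of the rent/return flow described above, assuming just a [ThreadStatic] cache plus a reference count (the PR's actual LogRecordPool is more involved, with a shared pool and processor-based pool selection):

```csharp
#nullable enable

using System;
using System.Threading;

// Sketch only: illustrates why Return() can no-op when a batch exporter
// has added a reference; not the pool implementation from this PR.
internal sealed class PooledLogRecord
{
    [ThreadStatic]
    private static PooledLogRecord? threadCache;

    private int refCount;

    public static PooledLogRecord Rent()
    {
        var record = threadCache;
        if (record != null)
        {
            threadCache = null;
            record.refCount = 1;
            return record;
        }

        return new PooledLogRecord { refCount = 1 };
    }

    // Called by a batching processor that needs to keep the record alive past OnEnd.
    public void AddReference() => Interlocked.Increment(ref this.refCount);

    public static void Return(PooledLogRecord record)
    {
        // Only the holder of the last reference actually recycles the record.
        if (Interlocked.Decrement(ref record.refCount) == 0)
        {
            // A real pool would clear the record's fields before caching it.
            threadCache ??= record;
        }
    }
}
```

With this shape, the SDK's rent → OnEnd → return sequence no-ops the return when a batch processor has called AddReference, and the exporter's later return is what puts the record back into the cache.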


try
{
otlpLogRecord = new OtlpLogs.LogRecord
{
TimeUnixNano = (ulong)logRecord.Timestamp.ToUnixTimeNanoseconds(),
SeverityNumber = GetSeverityNumber(logRecord.LogLevel),
SeverityText = LogLevels[(int)logRecord.LogLevel],
TimeUnixNano = (ulong)data.Timestamp.ToUnixTimeNanoseconds(),
SeverityNumber = GetSeverityNumber(data.LogLevel),
SeverityText = LogLevels[(int)data.LogLevel],
};

if (!string.IsNullOrEmpty(logRecord.CategoryName))
if (!string.IsNullOrEmpty(data.CategoryName))
{
// TODO:
// 1. Track the following issue, and map CategoryName to Name
// if it makes it to log data model.
// https://github.com/open-telemetry/opentelemetry-specification/issues/2398
// 2. Confirm if this name for attribute is good.
otlpLogRecord.Attributes.AddStringAttribute("dotnet.ilogger.category", logRecord.CategoryName);
otlpLogRecord.Attributes.AddStringAttribute("dotnet.ilogger.category", data.CategoryName);
}

bool bodyPopulatedFromFormattedMessage = false;
if (logRecord.FormattedMessage != null)
if (data.Message != null)
{
otlpLogRecord.Body = new OtlpCommon.AnyValue { StringValue = logRecord.FormattedMessage };
otlpLogRecord.Body = new OtlpCommon.AnyValue { StringValue = data.Message };
bodyPopulatedFromFormattedMessage = true;
}

@@ -110,34 +111,34 @@ internal static OtlpLogs.LogRecord ToOtlpLog(this LogRecord logRecord)
}
}

if (logRecord.EventId.Id != default)
if (data.EventId.Id != default)
{
otlpLogRecord.Attributes.AddIntAttribute(nameof(logRecord.EventId.Id), logRecord.EventId.Id);
otlpLogRecord.Attributes.AddIntAttribute(nameof(data.EventId.Id), data.EventId.Id);
}

if (!string.IsNullOrEmpty(logRecord.EventId.Name))
if (!string.IsNullOrEmpty(data.EventId.Name))
{
otlpLogRecord.Attributes.AddStringAttribute(nameof(logRecord.EventId.Name), logRecord.EventId.Name);
otlpLogRecord.Attributes.AddStringAttribute(nameof(data.EventId.Name), data.EventId.Name);
}

if (logRecord.Exception != null)
if (data.Exception != null)
{
otlpLogRecord.Attributes.AddStringAttribute(SemanticConventions.AttributeExceptionType, logRecord.Exception.GetType().Name);
otlpLogRecord.Attributes.AddStringAttribute(SemanticConventions.AttributeExceptionMessage, logRecord.Exception.Message);
otlpLogRecord.Attributes.AddStringAttribute(SemanticConventions.AttributeExceptionStacktrace, logRecord.Exception.ToInvariantString());
otlpLogRecord.Attributes.AddStringAttribute(SemanticConventions.AttributeExceptionType, data.Exception.GetType().Name);
otlpLogRecord.Attributes.AddStringAttribute(SemanticConventions.AttributeExceptionMessage, data.Exception.Message);
otlpLogRecord.Attributes.AddStringAttribute(SemanticConventions.AttributeExceptionStacktrace, data.Exception.ToInvariantString());
}

if (logRecord.TraceId != default && logRecord.SpanId != default)
if (data.TraceId != default && data.SpanId != default)
{
byte[] traceIdBytes = new byte[16];
byte[] spanIdBytes = new byte[8];

logRecord.TraceId.CopyTo(traceIdBytes);
logRecord.SpanId.CopyTo(spanIdBytes);
data.TraceId.CopyTo(traceIdBytes);
data.SpanId.CopyTo(spanIdBytes);

otlpLogRecord.TraceId = UnsafeByteOperations.UnsafeWrap(traceIdBytes);
otlpLogRecord.SpanId = UnsafeByteOperations.UnsafeWrap(spanIdBytes);
otlpLogRecord.Flags = (uint)logRecord.TraceFlags;
otlpLogRecord.Flags = (uint)data.TraceFlags;
}

int scopeDepth = -1;
24 changes: 12 additions & 12 deletions src/OpenTelemetry/.publicApi/net462/PublicAPI.Shipped.txt
@@ -17,22 +17,22 @@ OpenTelemetry.BaseProcessor<T>.Dispose() -> void
OpenTelemetry.BaseProcessor<T>.ForceFlush(int timeoutMilliseconds = -1) -> bool
~OpenTelemetry.BaseProcessor<T>.ParentProvider.get -> OpenTelemetry.BaseProvider
OpenTelemetry.BaseProcessor<T>.Shutdown(int timeoutMilliseconds = -1) -> bool
~OpenTelemetry.Batch<T>
OpenTelemetry.Batch<T>
OpenTelemetry.Batch<T>.Batch() -> void
~OpenTelemetry.Batch<T>.Batch(T[] items, int count) -> void
OpenTelemetry.Batch<T>.Batch(T![]! items, int count) -> void
OpenTelemetry.Batch<T>.Count.get -> long
OpenTelemetry.Batch<T>.Dispose() -> void
OpenTelemetry.Batch<T>.Enumerator
~OpenTelemetry.Batch<T>.Enumerator.Current.get -> T
OpenTelemetry.Batch<T>.Enumerator.Current.get -> T!
OpenTelemetry.Batch<T>.Enumerator.Dispose() -> void
OpenTelemetry.Batch<T>.Enumerator.Enumerator() -> void
OpenTelemetry.Batch<T>.Enumerator.MoveNext() -> bool
OpenTelemetry.Batch<T>.Enumerator.Reset() -> void
~OpenTelemetry.Batch<T>.GetEnumerator() -> OpenTelemetry.Batch<T>.Enumerator
OpenTelemetry.BatchActivityExportProcessor
~OpenTelemetry.BatchActivityExportProcessor.BatchActivityExportProcessor(OpenTelemetry.BaseExporter<System.Diagnostics.Activity> exporter, int maxQueueSize = 2048, int scheduledDelayMilliseconds = 5000, int exporterTimeoutMilliseconds = 30000, int maxExportBatchSize = 512) -> void
~OpenTelemetry.BatchExportProcessor<T>
~OpenTelemetry.BatchExportProcessor<T>.BatchExportProcessor(OpenTelemetry.BaseExporter<T> exporter, int maxQueueSize = 2048, int scheduledDelayMilliseconds = 5000, int exporterTimeoutMilliseconds = 30000, int maxExportBatchSize = 512) -> void
OpenTelemetry.BatchExportProcessor<T>
OpenTelemetry.BatchExportProcessor<T>.BatchExportProcessor(OpenTelemetry.BaseExporter<T!>! exporter, int maxQueueSize = 2048, int scheduledDelayMilliseconds = 5000, int exporterTimeoutMilliseconds = 30000, int maxExportBatchSize = 512) -> void
~OpenTelemetry.BatchExportProcessorOptions<T>
OpenTelemetry.BatchExportProcessorOptions<T>.BatchExportProcessorOptions() -> void
OpenTelemetry.BatchExportProcessorOptions<T>.ExporterTimeoutMilliseconds.get -> int
@@ -46,16 +46,16 @@ OpenTelemetry.BatchExportProcessorOptions<T>.ScheduledDelayMilliseconds.set -> v
OpenTelemetry.BatchLogRecordExportProcessor
OpenTelemetry.BatchLogRecordExportProcessor.BatchLogRecordExportProcessor(OpenTelemetry.BaseExporter<OpenTelemetry.Logs.LogRecord!>! exporter, int maxQueueSize = 2048, int scheduledDelayMilliseconds = 5000, int exporterTimeoutMilliseconds = 30000, int maxExportBatchSize = 512) -> void
OpenTelemetry.CompositeProcessor<T>
~OpenTelemetry.CompositeProcessor<T>.AddProcessor(OpenTelemetry.BaseProcessor<T> processor) -> OpenTelemetry.CompositeProcessor<T>
~OpenTelemetry.CompositeProcessor<T>.CompositeProcessor(System.Collections.Generic.IEnumerable<OpenTelemetry.BaseProcessor<T>> processors) -> void
OpenTelemetry.CompositeProcessor<T>.AddProcessor(OpenTelemetry.BaseProcessor<T>! processor) -> OpenTelemetry.CompositeProcessor<T>!
OpenTelemetry.CompositeProcessor<T>.CompositeProcessor(System.Collections.Generic.IEnumerable<OpenTelemetry.BaseProcessor<T>!>! processors) -> void
OpenTelemetry.ExportProcessorType
OpenTelemetry.ExportProcessorType.Batch = 1 -> OpenTelemetry.ExportProcessorType
OpenTelemetry.ExportProcessorType.Simple = 0 -> OpenTelemetry.ExportProcessorType
OpenTelemetry.ExportResult
OpenTelemetry.ExportResult.Failure = 1 -> OpenTelemetry.ExportResult
OpenTelemetry.ExportResult.Success = 0 -> OpenTelemetry.ExportResult
OpenTelemetry.Logs.LogRecord
OpenTelemetry.Logs.LogRecord.CategoryName.get -> string!
OpenTelemetry.Logs.LogRecord.CategoryName.get -> string?
OpenTelemetry.Logs.LogRecord.EventId.get -> Microsoft.Extensions.Logging.EventId
OpenTelemetry.Logs.LogRecord.Exception.get -> System.Exception?
OpenTelemetry.Logs.LogRecord.ForEachScope<TState>(System.Action<OpenTelemetry.Logs.LogRecordScope, TState>! callback, TState state) -> void
@@ -216,8 +216,8 @@ OpenTelemetry.Resources.ResourceBuilderExtensions
OpenTelemetry.Sdk
OpenTelemetry.SimpleActivityExportProcessor
~OpenTelemetry.SimpleActivityExportProcessor.SimpleActivityExportProcessor(OpenTelemetry.BaseExporter<System.Diagnostics.Activity> exporter) -> void
~OpenTelemetry.SimpleExportProcessor<T>
~OpenTelemetry.SimpleExportProcessor<T>.SimpleExportProcessor(OpenTelemetry.BaseExporter<T> exporter) -> void
OpenTelemetry.SimpleExportProcessor<T>
OpenTelemetry.SimpleExportProcessor<T>.SimpleExportProcessor(OpenTelemetry.BaseExporter<T!>! exporter) -> void
OpenTelemetry.SimpleLogRecordExportProcessor
OpenTelemetry.SimpleLogRecordExportProcessor.SimpleLogRecordExportProcessor(OpenTelemetry.BaseExporter<OpenTelemetry.Logs.LogRecord!>! exporter) -> void
OpenTelemetry.SuppressInstrumentationScope
@@ -270,7 +270,7 @@ override OpenTelemetry.BaseExportProcessor<T>.OnForceFlush(int timeoutMillisecon
override OpenTelemetry.BaseExportProcessor<T>.OnShutdown(int timeoutMilliseconds) -> bool
~override OpenTelemetry.BatchActivityExportProcessor.OnEnd(System.Diagnostics.Activity data) -> void
override OpenTelemetry.BatchExportProcessor<T>.Dispose(bool disposing) -> void
~override OpenTelemetry.BatchExportProcessor<T>.OnExport(T data) -> void
override OpenTelemetry.BatchExportProcessor<T>.OnExport(T! data) -> void
override OpenTelemetry.BatchExportProcessor<T>.OnForceFlush(int timeoutMilliseconds) -> bool
override OpenTelemetry.BatchExportProcessor<T>.OnShutdown(int timeoutMilliseconds) -> bool
override OpenTelemetry.BatchLogRecordExportProcessor.OnEnd(OpenTelemetry.Logs.LogRecord! data) -> void
@@ -288,7 +288,7 @@ override OpenTelemetry.Metrics.BaseExportingMetricReader.OnShutdown(int timeoutM
override OpenTelemetry.Metrics.PeriodicExportingMetricReader.Dispose(bool disposing) -> void
override OpenTelemetry.Metrics.PeriodicExportingMetricReader.OnShutdown(int timeoutMilliseconds) -> bool
~override OpenTelemetry.SimpleActivityExportProcessor.OnEnd(System.Diagnostics.Activity data) -> void
~override OpenTelemetry.SimpleExportProcessor<T>.OnExport(T data) -> void
override OpenTelemetry.SimpleExportProcessor<T>.OnExport(T! data) -> void
override OpenTelemetry.Trace.AlwaysOffSampler.ShouldSample(in OpenTelemetry.Trace.SamplingParameters samplingParameters) -> OpenTelemetry.Trace.SamplingResult
override OpenTelemetry.Trace.AlwaysOnSampler.ShouldSample(in OpenTelemetry.Trace.SamplingParameters samplingParameters) -> OpenTelemetry.Trace.SamplingResult
override OpenTelemetry.Trace.ParentBasedSampler.ShouldSample(in OpenTelemetry.Trace.SamplingParameters samplingParameters) -> OpenTelemetry.Trace.SamplingResult