
Do not report ServiceBus receive and AcceptMessageSession calls that result in timeout #1348

Open
lmolkova opened this issue Oct 2, 2019 · 17 comments


@lmolkova
Member

lmolkova commented Oct 2, 2019

Azure/azure-sdk-for-net#6813
microsoft/ApplicationInsights-dotnet-server#1259

There is not much value in reporting certain calls from Service Bus, such as empty receive calls that end in a timeout because there were no messages on the bus.

We need to review the calls we collect and decide which of them are useful.

@lmolkova lmolkova self-assigned this Oct 3, 2019
@TimothyMothra TimothyMothra transferred this issue from microsoft/ApplicationInsights-dotnet-server Dec 4, 2019
@lukasvosyka

Any news here? When is a fix to be expected? It bloats Application Insights with dependency call failures over and over.

@lukasvosyka

Hi,

we are trying to get rid of the "AcceptMessageSession" errors that appear when using the Service Bus client and a ServiceBusTimeoutException occurs. As I understand it, this is "not an issue" because the client recovers internally, but how do I get access to the underlying exception from within an ITelemetryProcessor?

In Visual Studio's debugger I can see what appears to be the underlying exception listed as $exception in the Locals window, but I don't know how to access it properly through the pipeline.

This is my current approach. Any hint is highly appreciated :) @lmolkova

using Microsoft.ApplicationInsights.Channel;
using Microsoft.ApplicationInsights.DataContracts;
using Microsoft.ApplicationInsights.Extensibility;

public class SkipAcceptMessageSessionDependencyFailuresTelemetryProcessor : ITelemetryProcessor
{
    private ITelemetryProcessor Next { get; set; }

    // next will point to the next TelemetryProcessor in the chain.
    public SkipAcceptMessageSessionDependencyFailuresTelemetryProcessor(ITelemetryProcessor next)
    {
        this.Next = next;
    }

    public void Process(ITelemetry item)
    {
        // To filter out an item, return without calling the next processor.
        if (FilterTelemetry(item)) { return; }

        this.Next.Process(item);
    }

    private bool FilterTelemetry(ITelemetry item)
    {
        // Drop failed "AcceptMessageSession" dependency calls reported by the Service Bus client.
        var dependency = item as DependencyTelemetry;
        if (dependency == null) return false;

        return dependency.Type == "Azure Service Bus" && dependency.Name == "AcceptMessageSession" && dependency.Success != true;
    }
}
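
For completeness, this is roughly how I plug it into the pipeline (a minimal sketch assuming the classic TelemetryConfiguration-based setup):

var configuration = TelemetryConfiguration.CreateDefault();
var builder = configuration.TelemetryProcessorChainBuilder;

// Insert the filtering processor into the chain and rebuild it.
builder.Use(next => new SkipAcceptMessageSessionDependencyFailuresTelemetryProcessor(next));
builder.Build();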

@kevine323

Any update here?

@cijothomas
Contributor

Sorry, don't have any updates on this.

@IT-CASADO

Today we also noticed thousands of failed 'AcceptMessageSession' entries under the 'Dependencies' tab.
This is really annoying...

@kiseln

kiseln commented Apr 21, 2021

I think we're having the same issue. We are using ServiceBusProcessor to receive messages and a default Application Insights configuration with AddApplicationInsightsTelemetryWorkerService(). The result is shown in the screenshot below. It would be great to get rid of these log entries without any additional filtering code.

[screenshot: repeated failed Service Bus receive entries in Application Insights]
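
Roughly, the setup that produces this looks like the sketch below (the connection string, queue name, and handlers are placeholders, not our real code):

// Default worker-service telemetry (Microsoft.ApplicationInsights.WorkerService package).
services.AddApplicationInsightsTelemetryWorkerService();

// Receiving messages with Azure.Messaging.ServiceBus.
var client = new ServiceBusClient("<service-bus-connection-string>");
var processor = client.CreateProcessor("<queue-name>");
processor.ProcessMessageAsync += args => Task.CompletedTask; // real message handler omitted
processor.ProcessErrorAsync += args => Task.CompletedTask;   // real error handler omitted
await processor.StartProcessingAsync();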

@kevine323

This has been ongoing for quite some time. Any update on a fix?

@s-krawczyk

I also experience this error. Is there any plan to fix it?

@Marusyk

Marusyk commented Jul 7, 2021

I have the same issue

@ben-burton

(Quoting @kiseln's earlier comment and screenshot above.)

I was going to raise an issue relating to what I can see in this screenshot. Shouldn't this telemetry at least be sent as a dependency rather than a request? As a workaround, I'm currently filtering out ServiceBusReceiver.Receive using a telemetry processor, as it's bloating the logs.

@esbenbach

@ben-burton any chance you could share that telemetry processor? It's annoying as hell and we also want to filter it out.

@ben-burton

@esbenbach Sure, here is my ITelemetryProcessor:

namespace ExampleNamespace
{
    using System.Collections.Generic;
    using Microsoft.ApplicationInsights.Channel;
    using Microsoft.ApplicationInsights.Extensibility;
    using Microsoft.ApplicationInsights.Extensibility.Implementation;

    public class TelemetryFilterProcessor : ITelemetryProcessor
    {
        private readonly ITelemetryProcessor _next;
        private readonly HashSet<string> _excludedTelemetryNames;

        public TelemetryFilterProcessor(ITelemetryProcessor next, HashSet<string> excludedTelemetryNames = null)
        {
            _next = next;
            _excludedTelemetryNames = excludedTelemetryNames ?? new HashSet<string>();
        }

        public void Process(ITelemetry item)
        {
            // Forward the item only if it is not in the excluded list.
            if (OkToSend(item))
            {
                _next.Process(item);
            }
        }

        private bool OkToSend(ITelemetry item)
        {
            // Non-operation telemetry (traces, metrics, exceptions, etc.) is always sent.
            if (!(item is OperationTelemetry operationTelemetry))
            {
                return true;
            }

            // Drop requests/dependencies whose operation name is in the excluded set.
            return !_excludedTelemetryNames.Contains(operationTelemetry.Name);
        }
    }
}

and here is how it can be used:

var builder = TelemetryConfiguration.CreateDefault().TelemetryProcessorChainBuilder;

builder.Use(next => new TelemetryFilterProcessor(next, 
    new HashSet<string>
    {
        "ServiceBusReceiver.Receive"
    }));
builder.Build();

If you want to use the services.AddApplicationInsightsTelemetryProcessor<> way of configuring App Insights, see this link for ideas on how to use DI to pass in the excludedTelemetryNames: #1563 (comment)
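
For example, a rough sketch of that DI-based approach (assuming the SDK resolves the extra HashSet<string> constructor parameter from the container; the registration below is illustrative):

// Make the excluded-names set available for injection into TelemetryFilterProcessor.
services.AddSingleton(new HashSet<string> { "ServiceBusReceiver.Receive" });

// The SDK supplies the "next" processor; other constructor arguments come from DI.
services.AddApplicationInsightsTelemetryProcessor<TelemetryFilterProcessor>();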

@github-actions

This issue is stale because it has been open 300 days with no activity. Remove stale label or this will be closed in 7 days. Commenting will instruct the bot to automatically remove the label.

@github-actions github-actions bot added the stale label Jul 13, 2022
@esbenbach

esbenbach commented Jul 13, 2022 via email

@github-actions github-actions bot removed the stale label Jul 14, 2022
@vikneshrajspui

vikneshrajspui commented Dec 22, 2022

I am also facing the same issue; the TelemetryProcessor is not getting added in the worker service (#2726).
Any suggestions are appreciated.

@remcoros

remcoros commented Mar 1, 2023

Any update?

@Gustavo-Marques19

Any update?
