7000 Failed AcceptMessageSession dependencies in AppInsight per hour #6813

Closed
Mortana89 opened this issue Jul 3, 2019 · 17 comments
Labels:
  • bug: This issue requires a change to an existing behavior in the product in order to be resolved.
  • Client: This issue points to a problem in the data-plane of the library.
  • customer-reported: Issues that are reported by GitHub users external to the Azure organization.
  • Service Attention: Workflow: This issue is the responsibility of the Azure service team.
  • Service Bus

Comments

@Mortana89

Mortana89 commented Jul 3, 2019

Crossposting original bug from https://github.com/Azure/azure-service-bus-dotnet/issues/588

Actual Behavior
Ensure Application Insights is configured and dependency tracking is enabled.
Construct a SubscriptionClient for a topic with sessions enabled.
Call RegisterSessionHandler on the client (a minimal setup sketch is shown below).
(When nothing is published on the topic): every hour, 7000 "AcceptMessageSession" dependencies with "Dependency call status" set to False are logged in Application Insights.
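
A minimal sketch of the setup these steps describe (not part of the original report; the connection string, topic, and subscription names are placeholders, and Microsoft.Azure.ServiceBus 3.x/4.x is assumed):

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Azure.ServiceBus;

class Program
{
    static void Main()
    {
        // Placeholder connection details for a session-enabled subscription.
        var client = new SubscriptionClient(
            "<service-bus-connection-string>",
            "<topic-name>",
            "<session-enabled-subscription-name>");

        var options = new SessionHandlerOptions(args =>
        {
            Console.WriteLine($"Exception received: {args.Exception}");
            return Task.CompletedTask;
        })
        {
            MaxConcurrentSessions = 8,
            AutoComplete = true
        };

        // With nothing published on the topic, the client still polls for sessions
        // in the background (AcceptMessageSession), and each timed-out attempt was
        // being recorded as a failed dependency by Application Insights.
        client.RegisterSessionHandler(HandleSessionMessageAsync, options);

        Console.ReadLine();
    }

    static Task HandleSessionMessageAsync(IMessageSession session, Message message, CancellationToken ct)
    {
        Console.WriteLine($"Session {session.SessionId}: received {message.MessageId}");
        return Task.CompletedTask;
    }
}
```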
Expected Behavior
When nothing is published, nothing should be logged in Application Insights, or at least a failed dependency should not be logged.
Versions
OS platform and version: Windows 10 Pro, 1803
.NET Version: .NET 4.6
NuGet package version or commit ID:
Comments:
Using RegisterMessageHandler instead of RegisterSessionHandler also causes a "Receive" dependency to be logged every minute, but with a status of True, which is much less noisy in Application Insights.

  • Credits to @oletolshave, changed a bit to reflect our situation

We've been facing the same issue for a while now, and it's generating roughly 7k (!) false exceptions in Application Insights PER hour. This not only makes it difficult to find relevant exceptions in AI, it also consumes a lot of storage space compared to our other logs.

@triage-new-issues bot added the needs-triage label Jul 3, 2019
@maggiepint added the Client, customer-reported, Service Attention, and Service Bus labels Jul 3, 2019
@triage-new-issues bot removed the needs-triage label Jul 3, 2019
@maggiepint
Contributor

Thanks for bringing this issue to our attention. I have routed it to the appropriate team for follow-up.

@jfggdl assigned binzywu, sjkwak, and nemakam and unassigned sjkwak Jul 8, 2019
@jfggdl

jfggdl commented Jul 8, 2019

@Mortana89, thank you for bringing this issue to our attention...again. An engineer from our team should be looking at this issue soon.

@Mortana89
Author

Thanks for fixing this! When will this be available in a new package, @nemakam?

@nemakam
Contributor

nemakam commented Jul 23, 2019

I don't have an ETA right now, but I will try to get it out in a week or two. There are a couple more things that need to get into the next release.

@Mortana89
Author

Hi @nemakam, do you have an update on when this will be available?

@nemakam
Contributor

nemakam commented Aug 8, 2019

Just published the new version. https://www.nuget.org/packages/Microsoft.Azure.ServiceBus/4.0.0

@vigneshmsft

vigneshmsft commented Aug 29, 2019

@nemakam I have updated to the latest version, but I still get these failures in Application Insights.
This is from an ASP.NET Core 2.2 app running in Azure App Service.

@Mortana89
Author

Same here...

@nemakam
Contributor

nemakam commented Aug 29, 2019

Interesting. What's the exception that you see? A timeout exception?

@nemakam reopened this Aug 29, 2019
@vigneshmsft

vigneshmsft commented Aug 29, 2019

I don't see it as an application exception, but it shows up as a dependency failure on the Service Bus topic with an "undefined" response code.

[screenshots: Application Insights dependency failures against the Service Bus topic, result code "undefined"]

@jfggdl modified the milestones: Sprint 157, Sprint 158 Sep 4, 2019
@jfggdl modified the milestones: Sprint 158, Sprint 159 Sep 16, 2019
@jfggdl

jfggdl commented Oct 2, 2019

@lmolkova, would you please help us fix this issue? Thanks.

@lmolkova
Member

lmolkova commented Oct 2, 2019

This looks like a call to Service Bus attempting to accept a message session; it ends with a timeout (60 s) and does not accomplish much.

The only way to mitigate it, for now, is to implement and configure a custom telemetry processor that filters such calls out (a sketch is shown below).

I created an issue in the AppInsights repo to track the fix: https://github.com/microsoft/ApplicationInsights-dotnet-server/issues/1281
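
A minimal sketch of such a filter (not part of the original comment), assuming the Microsoft.ApplicationInsights SDK; the "AcceptMessageSession" name check is based on the dependency telemetry reported in this issue, so verify the exact Name/Type values against your own data:

```csharp
using Microsoft.ApplicationInsights.Channel;
using Microsoft.ApplicationInsights.DataContracts;
using Microsoft.ApplicationInsights.Extensibility;

// Drops failed AcceptMessageSession dependency telemetry before it is sent,
// while passing everything else along the processor chain.
public class AcceptMessageSessionFilter : ITelemetryProcessor
{
    private readonly ITelemetryProcessor _next;

    public AcceptMessageSessionFilter(ITelemetryProcessor next) => _next = next;

    public void Process(ITelemetry item)
    {
        if (item is DependencyTelemetry dependency &&
            dependency.Success == false &&
            dependency.Name != null &&
            dependency.Name.Contains("AcceptMessageSession"))
        {
            return; // swallow the noisy failed dependency
        }

        _next.Process(item);
    }
}
```

In ASP.NET Core this can be registered with services.AddApplicationInsightsTelemetryProcessor<AcceptMessageSessionFilter>(); with the classic SDK, add it through the TelemetryProcessorChainBuilder or ApplicationInsights.config.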

@lmolkova removed their assignment Oct 2, 2019
@Mortana89
Author

This is costing us a lot of Application Insights data. When can we expect a fix for this?

@lmolkova
Member

lmolkova commented Oct 2, 2019

This needs to go through the triage and planning process before we can commit to a certain date. For now, please try the workaround with a telemetry processor.

@jfggdl

jfggdl commented Oct 2, 2019

@lmolkova, in the interest of setting the right expectations, would you please share an approximate month (even beyond your current planning period, if that is the right answer) when a permanent solution for this issue will be provided?

@lmolkova
Member

lmolkova commented Oct 2, 2019

I can do this after we triage and plan it, but the next stable version will ship in ~3 months, which is the earliest a fix could be delivered. Beta versions might get the fix earlier. Please follow the AppInsights issue microsoft/ApplicationInsights-dotnet-server#1281; we'll set a milestone there when we do the planning.

@jfggdl modified the milestones: Sprint 159, Sprint 165 Dec 2, 2019
@nemakam added the bug label Dec 11, 2019
@nemakam
Contributor

nemakam commented Jan 31, 2020

Closing this issue here. Kindly track it through microsoft/ApplicationInsights-dotnet#1348.
