
[BUG] ServiceBusTrigger - lock renewals stop suddenly/randomly #37414

Closed · pzaj2 opened this issue Jul 5, 2023 · 20 comments

Labels: Client · customer-reported · needs-team-attention · question · Service Bus

Comments

pzaj2 commented Jul 5, 2023

Library name and version

Microsoft.Azure.WebJobs.Extensions.ServiceBus 5.11.0

Describe the bug

I have a WebJobs project with continuous jobs and a Service Bus (topic) trigger. The triggered function runs for quite some time (up to 40 minutes), and at some point the calls to renew the message lock (PeekLock mode) stop being made. This then leads to failures when the function attempts to auto-complete the message. MaxAutoLockRenewalDuration is not exceeded, as per the project attached.

This is especially hard to deal with because all the work carried out by my function completes, but the message goes back to the queue and is re-processed in a retry loop. You can see the output from AppInsights (in Rider) in the screenshot below.

[screenshot: AppInsights output in Rider]

Expected behavior

The message lock should be renewed. Auto-complete should successfully complete the message.

Actual behavior

The message lock is not renewed. Auto-complete fails to complete the message.

Reproduction Steps

You'll need to set up Service Bus in appsettings.json and create the required messaging entities (check TestTrigger.cs). It takes a while - I've seen this 5-6 times in the last 2 days when running this project locally.
ServiceBusLockLostRepro.zip

Environment

Hosting: Azure App Service (but the same can be reproduced locally, running against the environment below)

.NET SDK:
 Version:   7.0.203
 Commit:    5b005c19f5

Runtime Environment:
 OS Name:     Windows
 OS Version:  10.0.19044
 OS Platform: Windows
 RID:         win10-x64
 Base Path:   C:\Program Files\dotnet\sdk\7.0.203\

Host:
  Version:      7.0.5
  Architecture: x64
  Commit:       8042d61b17

The issue is IDE-agnostic. I ran it from within Rider as well as using CLI.

@github-actions github-actions bot added Client This issue points to a problem in the data-plane of the library. customer-reported Issues that are reported by GitHub users external to the Azure organization. needs-team-triage Workflow: This issue needs the team to triage. question The issue doesn't require a change to the product in order to be resolved. Most issues start as that Service Bus labels Jul 5, 2023
@jsquire jsquire self-assigned this Jul 6, 2023
jsquire (Member) commented Jul 6, 2023

Hi @pzaj2. Thank you for reaching out, and we regret that you're experiencing difficulties. Beyond the exception, is there any reason you suspect that lock renewal has stopped taking place? I suspect that you're seeing this error intermittently because of a transient network failure that requires the AMQP link to be recreated. Messages must be settled on the link from which they were received, so having a link recreated would manifest as a lock-lost error, despite the lock being renewed and in a valid state.

To confirm, we'll need to take a look at some additional SDK logs in a +/- 5-minute window around the time you're seeing the exception. Specifically, we're looking for:

  • Any Error logs
  • Any Warning logs
  • Link State Lost (Id: 37)
  • Receive Link Closed (Id: 38)
  • Receive Link Create Start (Id: 83)
  • Receive Link Create End (Id: 84)
  • Receive Link Create Error (Id: 85)

I'd suggest capturing at the Verbose level and filtering down to that set. Information about capturing SDK logs can be found here.
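
For reference, one way to capture these events is with the SDK's AzureEventSourceListener; a minimal sketch, assuming a console sink is acceptable, filtered to the warnings, errors, and link-lifecycle event IDs listed above:

    using System;
    using System.Diagnostics.Tracing;
    using Azure.Core.Diagnostics;

    // Listen at Verbose, but only print warnings/errors and the
    // link-lifecycle events (IDs 37, 38, 83, 84, 85) requested above.
    using var listener = new AzureEventSourceListener(
        (args, message) =>
        {
            if (args.Level <= EventLevel.Warning ||
                args.EventId is 37 or 38 or 83 or 84 or 85)
            {
                Console.WriteLine($"{DateTimeOffset.UtcNow:O}\t{args.Level}\t{message}");
            }
        },
        EventLevel.Verbose);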

@jsquire jsquire added needs-author-feedback Workflow: More information is needed from author to address the issue. and removed needs-team-triage Workflow: This issue needs the team to triage. labels Jul 6, 2023

pzaj2 (Author) commented Jul 6, 2023

Hi @jsquire, I'll try my best to provide these additional logs as soon as possible. The reason I think it stops renewing the lock is that it stops producing log entries associated with lock renewal.

If it is indeed an intermittent issue, how would I go about mitigating it? Given the consequences of this failure, I'd think it's still worth fixing. The workaround I've implemented for the moment looks like this (the first lines of the invoked method):

    var cts = CancellationTokenSource.CreateLinkedTokenSource(cancellationToken);
    var renewLockTask = Task.Run(async () =>
    {
        while (!cts.Token.IsCancellationRequested)
        {
            await messageActions.RenewMessageLockAsync(message, cts.Token);
            await Task.Delay(RenewLockFrequencyInMilliseconds, cts.Token);
        }
    }, cts.Token);

where cancellationToken, messageActions (ServiceBusMessageActions), and message (ServiceBusReceivedMessage) are all provided to the invoked method by the binding provider. That said, it's an ugly workaround, and while it seems to work at the moment, I'm concerned about its robustness and reliability long-term.
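
A hypothetical teardown for that loop (not shown in the snippet above) would cancel the linked source once the handler body finishes, so the renewal task stops and any failure it hit becomes observable:

    try
    {
        // ... the actual message processing goes here ...
    }
    finally
    {
        cts.Cancel();
        try { await renewLockTask; }           // surface any renewal failure
        catch (OperationCanceledException) { } // expected once cancelled
    }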

@github-actions github-actions bot added needs-team-attention Workflow: This issue needs attention from Azure service team or SDK team and removed needs-author-feedback Workflow: More information is needed from author to address the issue. labels Jul 6, 2023
jsquire (Member) commented Jul 6, 2023

Hi @pzaj2. To be clear - if this does root-cause to a failure to renew locks, then it is absolutely a client bug that we should fix.

However, if it is the result of intermittent network issues causing the connection or link to drop, there's unfortunately nothing that your application or the client can do to directly prevent it. It is something that would need to be mitigated by ensuring that the application's processing is idempotent and can ignore duplicate data.
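
As an illustration of that advice, a minimal idempotency sketch (not SDK functionality; the in-memory store and DoWorkAsync are hypothetical - a real application would use durable storage keyed on something like MessageId):

    using System.Collections.Concurrent;
    using System.Threading.Tasks;
    using Azure.Messaging.ServiceBus;

    public class IdempotentHandler
    {
        // In-memory dedup store; a production version would persist this.
        private static readonly ConcurrentDictionary<string, bool> _processed = new();

        public async Task HandleAsync(ServiceBusReceivedMessage message)
        {
            if (!_processed.TryAdd(message.MessageId, true))
            {
                return; // duplicate redelivery: skip the side effects
            }

            await DoWorkAsync(message); // hypothetical processing method
        }

        private Task DoWorkAsync(ServiceBusReceivedMessage message) => Task.CompletedTask;
    }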

Thus far, we haven't been able to repro and are not seeing stress test failures for this scenario. Logs are going to be our best bet, assuming that you're able to repro.

The long-term solution is for Service Bus to support AMQP's durable terminus, which allows link state to persist across connections. Once the service has support, we'll add it to the client, which would mitigate the "I lost my connection and now my lock is invalid" scenario. I don't have insight into the timing of the service feature, however - only that it is on the roadmap.

@jsquire jsquire added the needs-author-feedback Workflow: More information is needed from author to address the issue. label Jul 6, 2023
@github-actions github-actions bot removed the needs-team-attention Workflow: This issue needs attention from Azure service team or SDK team label Jul 6, 2023

pzaj2 (Author) commented Jul 7, 2023

Hi @jsquire, please see the logs below - I hope that's enough, since I followed the guidelines to the letter!


2023-07-07T12:59:38.1977198Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync start. MessageCount = 1, LockToken = 7722d244-7960-4b30-8be4-7795146b1eab
2023-07-07T12:59:38.2024792Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync start. MessageCount = 1, LockToken = 7722d244-7960-4b30-8be4-7795146b1eab
2023-07-07T12:59:38.2418252Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync done. LockToken = 7722d244-7960-4b30-8be4-7795146b1eab
2023-07-07T12:59:38.2467021Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync done. LockToken = 7722d244-7960-4b30-8be4-7795146b1eab
2023-07-07T12:59:38.2516894Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock complete. LockToken = 7722d244-7960-4b30-8be4-7795146b1eab
2023-07-07T12:59:38.2571921Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock complete. LockToken = 7722d244-7960-4b30-8be4-7795146b1eab
2023-07-07T12:59:38.2621284Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock start. MessageCount = 1, LockToken = 7722d244-7960-4b30-8be4-7795146b1eab
2023-07-07T12:59:38.2671503Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock start. MessageCount = 1, LockToken = 7722d244-7960-4b30-8be4-7795146b1eab
2023-07-07T12:59:45.5698460Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync start. MessageCount = 1, LockToken = 0a0f5db8-102f-4541-b31d-fb0d64c71d6b
2023-07-07T12:59:45.5759694Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync start. MessageCount = 1, LockToken = 0a0f5db8-102f-4541-b31d-fb0d64c71d6b
2023-07-07T12:59:45.6143798Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync done. LockToken = 0a0f5db8-102f-4541-b31d-fb0d64c71d6b
2023-07-07T12:59:45.6207454Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync done. LockToken = 0a0f5db8-102f-4541-b31d-fb0d64c71d6b
2023-07-07T12:59:45.6261048Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock complete. LockToken = 0a0f5db8-102f-4541-b31d-fb0d64c71d6b
2023-07-07T12:59:45.6309382Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock complete. LockToken = 0a0f5db8-102f-4541-b31d-fb0d64c71d6b
2023-07-07T12:59:45.6356715Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock start. MessageCount = 1, LockToken = 0a0f5db8-102f-4541-b31d-fb0d64c71d6b
2023-07-07T12:59:45.6405083Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock start. MessageCount = 1, LockToken = 0a0f5db8-102f-4541-b31d-fb0d64c71d6b
2023-07-07T13:02:06.7262028Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync start. MessageCount = 1, LockToken = d56c5fe9-ad44-4932-9bc4-f56ddd343029
2023-07-07T13:02:06.7361970Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync start. MessageCount = 1, LockToken = d56c5fe9-ad44-4932-9bc4-f56ddd343029
2023-07-07T13:02:06.7884644Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync done. LockToken = d56c5fe9-ad44-4932-9bc4-f56ddd343029
2023-07-07T13:02:06.7976640Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync done. LockToken = d56c5fe9-ad44-4932-9bc4-f56ddd343029
2023-07-07T13:02:06.8077662Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock complete. LockToken = d56c5fe9-ad44-4932-9bc4-f56ddd343029
2023-07-07T13:02:06.8170884Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock complete. LockToken = d56c5fe9-ad44-4932-9bc4-f56ddd343029
2023-07-07T13:02:06.8258517Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock start. MessageCount = 1, LockToken = d56c5fe9-ad44-4932-9bc4-f56ddd343029
2023-07-07T13:02:06.8324029Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock start. MessageCount = 1, LockToken = d56c5fe9-ad44-4932-9bc4-f56ddd343029
2023-07-07T13:02:14.1417441Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync start. MessageCount = 1, LockToken = 596c87e9-9ac0-4926-9efe-21b0cccc43c8
2023-07-07T13:02:14.1532334Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync start. MessageCount = 1, LockToken = 596c87e9-9ac0-4926-9efe-21b0cccc43c8
2023-07-07T13:02:14.1959182Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync done. LockToken = 596c87e9-9ac0-4926-9efe-21b0cccc43c8
2023-07-07T13:02:14.2053262Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync done. LockToken = 596c87e9-9ac0-4926-9efe-21b0cccc43c8
2023-07-07T13:02:14.2169851Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock complete. LockToken = 596c87e9-9ac0-4926-9efe-21b0cccc43c8
2023-07-07T13:02:14.2258527Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock complete. LockToken = 596c87e9-9ac0-4926-9efe-21b0cccc43c8
2023-07-07T13:02:14.2356772Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock start. MessageCount = 1, LockToken = 596c87e9-9ac0-4926-9efe-21b0cccc43c8
2023-07-07T13:02:14.2444676Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock start. MessageCount = 1, LockToken = 596c87e9-9ac0-4926-9efe-21b0cccc43c8
2023-07-07T13:04:27.5732571Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync start. MessageCount = 1, LockToken = 7722d244-7960-4b30-8be4-7795146b1eab
2023-07-07T13:04:27.5786058Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync start. MessageCount = 1, LockToken = 7722d244-7960-4b30-8be4-7795146b1eab
2023-07-07T13:04:27.6195346Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync done. LockToken = 7722d244-7960-4b30-8be4-7795146b1eab
2023-07-07T13:04:27.6254957Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync done. LockToken = 7722d244-7960-4b30-8be4-7795146b1eab
2023-07-07T13:04:27.6315238Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock complete. LockToken = 7722d244-7960-4b30-8be4-7795146b1eab
2023-07-07T13:04:27.6368882Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock complete. LockToken = 7722d244-7960-4b30-8be4-7795146b1eab
2023-07-07T13:04:27.6421956Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock start. MessageCount = 1, LockToken = 7722d244-7960-4b30-8be4-7795146b1eab
2023-07-07T13:04:27.6479589Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock start. MessageCount = 1, LockToken = 7722d244-7960-4b30-8be4-7795146b1eab
2023-07-07T13:04:34.9526046Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync start. MessageCount = 1, LockToken = 0a0f5db8-102f-4541-b31d-fb0d64c71d6b
2023-07-07T13:04:34.9603623Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync start. MessageCount = 1, LockToken = 0a0f5db8-102f-4541-b31d-fb0d64c71d6b
2023-07-07T13:04:35.0002375Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync done. LockToken = 0a0f5db8-102f-4541-b31d-fb0d64c71d6b
2023-07-07T13:04:35.0072380Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync done. LockToken = 0a0f5db8-102f-4541-b31d-fb0d64c71d6b
2023-07-07T13:04:35.0158660Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock complete. LockToken = 0a0f5db8-102f-4541-b31d-fb0d64c71d6b
2023-07-07T13:04:35.0235132Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock complete. LockToken = 0a0f5db8-102f-4541-b31d-fb0d64c71d6b
2023-07-07T13:04:35.0302910Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock start. MessageCount = 1, LockToken = 0a0f5db8-102f-4541-b31d-fb0d64c71d6b
2023-07-07T13:04:35.0364808Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock start. MessageCount = 1, LockToken = 0a0f5db8-102f-4541-b31d-fb0d64c71d6b
2023-07-07T13:06:48.6978389Z	INFO	[Function.TestLockLost]	[FunctionCompleted]	Executed 'TestLockLost' (Succeeded, Id=a1005235-a933-4171-bbf4-405af7f0dc2d, Duration=6257133ms)
2023-07-07T13:06:48.7274496Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorMessageHandlerCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: User message handler complete: Message: SequenceNumber: 241, LockToken: 11571d5a-9b41-458e-a87d-7cd3026e049a
2023-07-07T13:06:48.7547564Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorMessageHandlerCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: User message handler complete: Message: SequenceNumber: 241, LockToken: 11571d5a-9b41-458e-a87d-7cd3026e049a
2023-07-07T13:06:48.7754853Z	INFO	[Azure.Messaging.ServiceBus]	[CompleteMessageStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: CompleteAsync start. MessageCount = 1, LockToken = 11571d5a-9b41-458e-a87d-7cd3026e049a
2023-07-07T13:06:48.7883165Z	INFO	[Azure.Messaging.ServiceBus]	[CompleteMessageStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: CompleteAsync start. MessageCount = 1, LockToken = 11571d5a-9b41-458e-a87d-7cd3026e049a
2023-07-07T13:06:49.6438384Z	FAIL	[Azure.Messaging.ServiceBus]	[CompleteMessageExceptionCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: CompleteAsync Exception: Azure.Messaging.ServiceBus.ServiceBusException: The lock supplied is invalid. Either the lock expired, or the message has already been removed from the queue. For more information please see https://aka.ms/ServiceBusExceptions . Reference:278d2a8b-8a23-4c93-82c7-f5733ebb3a12, TrackingId:7f0e2f9700027d4e018b0e4b64a7f487_G27_B48, SystemTracker:G27:231631968:amqps://mi-admin-local-svcbus.servicebus.windows.net/-78a83eb1;0:5:6:source(address:/test-topic/Subscriptions/test-sub,filter:[]), Timestamp:2023-07-07T13:06:48 (MessageLockLost). For troubleshooting information, see https://aka.ms/azsdk/net/servicebus/exceptions/troubleshoot.
   at Azure.Messaging.ServiceBus.Amqp.AmqpReceiver.DisposeMessageAsync(Guid lockToken, Outcome outcome, TimeSpan timeout)
   at Azure.Messaging.ServiceBus.Amqp.AmqpReceiver.CompleteInternalAsync(Guid lockToken, TimeSpan timeout)
   at Azure.Messaging.ServiceBus.Amqp.AmqpReceiver.<>c.<<CompleteAsync>b__47_0>d.MoveNext()
--- End of stack trace from previous location ---
   at Azure.Messaging.ServiceBus.ServiceBusRetryPolicy.<>c__22`1.<<RunOperation>b__22_0>d.MoveNext()
--- End of stack trace from previous location ---
   at Azure.Messaging.ServiceBus.ServiceBusRetryPolicy.RunOperation[T1,TResult](Func`4 operation, T1 t1, TransportConnectionScope scope, CancellationToken cancellationToken, Boolean logTimeoutRetriesAsVerbose)
   at Azure.Messaging.ServiceBus.ServiceBusRetryPolicy.RunOperation[T1,TResult](Func`4 operation, T1 t1, TransportConnectionScope scope, CancellationToken cancellationToken, Boolean logTimeoutRetriesAsVerbose)
   at Azure.Messaging.ServiceBus.ServiceBusRetryPolicy.RunOperation[T1](Func`4 operation, T1 t1, TransportConnectionScope scope, CancellationToken cancellationToken)
   at Azure.Messaging.ServiceBus.Amqp.AmqpReceiver.CompleteAsync(Guid lockToken, CancellationToken cancellationToken)
   at Azure.Messaging.ServiceBus.ServiceBusReceiver.CompleteMessageAsync(ServiceBusReceivedMessage message, CancellationToken cancellationToken). LockToken = 11571d5a-9b41-458e-a87d-7cd3026e049a
2023-07-07T13:06:49.6529822Z	FAIL	[Azure.Messaging.ServiceBus]	[CompleteMessageExceptionCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: CompleteAsync Exception: Azure.Messaging.ServiceBus.ServiceBusException: The lock supplied is invalid. Either the lock expired, or the message has already been removed from the queue. For more information please see https://aka.ms/ServiceBusExceptions . Reference:278d2a8b-8a23-4c93-82c7-f5733ebb3a12, TrackingId:7f0e2f9700027d4e018b0e4b64a7f487_G27_B48, SystemTracker:G27:231631968:amqps://mi-admin-local-svcbus.servicebus.windows.net/-78a83eb1;0:5:6:source(address:/test-topic/Subscriptions/test-sub,filter:[]), Timestamp:2023-07-07T13:06:48 (MessageLockLost). For troubleshooting information, see https://aka.ms/azsdk/net/servicebus/exceptions/troubleshoot.
   at Azure.Messaging.ServiceBus.Amqp.AmqpReceiver.DisposeMessageAsync(Guid lockToken, Outcome outcome, TimeSpan timeout)
   at Azure.Messaging.ServiceBus.Amqp.AmqpReceiver.CompleteInternalAsync(Guid lockToken, TimeSpan timeout)
   at Azure.Messaging.ServiceBus.Amqp.AmqpReceiver.<>c.<<CompleteAsync>b__47_0>d.MoveNext()
--- End of stack trace from previous location ---
   at Azure.Messaging.ServiceBus.ServiceBusRetryPolicy.<>c__22`1.<<RunOperation>b__22_0>d.MoveNext()
--- End of stack trace from previous location ---
   at Azure.Messaging.ServiceBus.ServiceBusRetryPolicy.RunOperation[T1,TResult](Func`4 operation, T1 t1, TransportConnectionScope scope, CancellationToken cancellationToken, Boolean logTimeoutRetriesAsVerbose)
   at Azure.Messaging.ServiceBus.ServiceBusRetryPolicy.RunOperation[T1,TResult](Func`4 operation, T1 t1, TransportConnectionScope scope, CancellationToken cancellationToken, Boolean logTimeoutRetriesAsVerbose)
   at Azure.Messaging.ServiceBus.ServiceBusRetryPolicy.RunOperation[T1](Func`4 operation, T1 t1, TransportConnectionScope scope, CancellationToken cancellationToken)
   at Azure.Messaging.ServiceBus.Amqp.AmqpReceiver.CompleteAsync(Guid lockToken, CancellationToken cancellationToken)
   at Azure.Messaging.ServiceBus.ServiceBusReceiver.CompleteMessageAsync(ServiceBusReceivedMessage message, CancellationToken cancellationToken). LockToken = 11571d5a-9b41-458e-a87d-7cd3026e049a
2023-07-07T13:06:49.8481376Z	FAIL	[Microsoft.Azure.WebJobs.ServiceBus.Listeners.ServiceBusListener]	[0]	Message processing error (Action=Complete, EntityPath=test-topic/Subscriptions/test-sub, Endpoint=mi-admin-local-svcbus.servicebus.windows.net)Azure.Messaging.ServiceBus.ServiceBusException: The lock supplied is invalid. Either the lock expired, or the message has already been removed from the queue. For more information please see https://aka.ms/ServiceBusExceptions . Reference:278d2a8b-8a23-4c93-82c7-f5733ebb3a12, TrackingId:7f0e2f9700027d4e018b0e4b64a7f487_G27_B48, SystemTracker:G27:231631968:amqps://mi-admin-local-svcbus.servicebus.windows.net/-78a83eb1;0:5:6:source(address:/test-topic/Subscriptions/test-sub,filter:[]), Timestamp:2023-07-07T13:06:48 (MessageLockLost). For troubleshooting information, see https://aka.ms/azsdk/net/servicebus/exceptions/troubleshoot.
   at Azure.Messaging.ServiceBus.Amqp.AmqpReceiver.DisposeMessageAsync(Guid lockToken, Outcome outcome, TimeSpan timeout)
   at Azure.Messaging.ServiceBus.Amqp.AmqpReceiver.CompleteInternalAsync(Guid lockToken, TimeSpan timeout)
   at Azure.Messaging.ServiceBus.Amqp.AmqpReceiver.<>c.<<CompleteAsync>b__47_0>d.MoveNext()
--- End of stack trace from previous location ---
   at Azure.Messaging.ServiceBus.ServiceBusRetryPolicy.<>c__22`1.<<RunOperation>b__22_0>d.MoveNext()
--- End of stack trace from previous location ---
   at Azure.Messaging.ServiceBus.ServiceBusRetryPolicy.RunOperation[T1,TResult](Func`4 operation, T1 t1, TransportConnectionScope scope, CancellationToken cancellationToken, Boolean logTimeoutRetriesAsVerbose)
   at Azure.Messaging.ServiceBus.ServiceBusRetryPolicy.RunOperation[T1,TResult](Func`4 operation, T1 t1, TransportConnectionScope scope, CancellationToken cancellationToken, Boolean logTimeoutRetriesAsVerbose)
   at Azure.Messaging.ServiceBus.ServiceBusRetryPolicy.RunOperation[T1](Func`4 operation, T1 t1, TransportConnectionScope scope, CancellationToken cancellationToken)
   at Azure.Messaging.ServiceBus.Amqp.AmqpReceiver.CompleteAsync(Guid lockToken, CancellationToken cancellationToken)
   at Azure.Messaging.ServiceBus.ServiceBusReceiver.CompleteMessageAsync(ServiceBusReceivedMessage message, CancellationToken cancellationToken)
   at Azure.Messaging.ServiceBus.ReceiverManager.ProcessOneMessage(ServiceBusReceivedMessage triggerMessage, CancellationToken cancellationToken)
2023-07-07T13:06:49.8589428Z	INFO	[Azure.Messaging.ServiceBus]	[ReceiveMessageStart]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: ReceiveBatchAsync start. MessageCount = 1
2023-07-07T13:06:49.8659790Z	INFO	[Azure.Messaging.ServiceBus]	[ReceiveMessageStart]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: ReceiveBatchAsync start. MessageCount = 1
2023-07-07T13:06:49.9077127Z	INFO	[Azure.Messaging.ServiceBus]	[ReceiveMessageCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: ReceiveBatchAsync done. Received '1' messages. LockTokens = <LockToken>e318f9a9-5eb8-4189-84d2-8c6978910374</LockToken>
2023-07-07T13:06:49.9153285Z	INFO	[Azure.Messaging.ServiceBus]	[ReceiveMessageCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: ReceiveBatchAsync done. Received '1' messages. LockTokens = <LockToken>e318f9a9-5eb8-4189-84d2-8c6978910374</LockToken>
2023-07-07T13:06:49.9250035Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock start. MessageCount = 1, LockToken = e318f9a9-5eb8-4189-84d2-8c6978910374
2023-07-07T13:06:49.9324253Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock start. MessageCount = 1, LockToken = e318f9a9-5eb8-4189-84d2-8c6978910374
2023-07-07T13:06:49.9411466Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorMessageHandlerStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: User message handler start: Message: SequenceNumber: 256, LockToken: e318f9a9-5eb8-4189-84d2-8c6978910374
2023-07-07T13:06:49.9492846Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorMessageHandlerStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: User message handler start: Message: SequenceNumber: 256, LockToken: e318f9a9-5eb8-4189-84d2-8c6978910374
2023-07-07T13:06:49.9576375Z	INFO	[Function.TestLockLost]	[FunctionStarted]	Executing 'TestLockLost' (Reason='(null)', Id=b00d78c4-5791-4dbf-841f-22cf41ed9f3a)
2023-07-07T13:06:49.9649452Z	INFO	[Function.TestLockLost]	[0]	Trigger Details: MessageId: 4512c9abaacb45ab8c82dd594d52a2f4, SequenceNumber: 256, DeliveryCount: 2, EnqueuedTimeUtc: 2023-07-07T11:17:30.7000000+00:00, LockedUntilUtc: 2023-07-07T13:11:49.2340000+00:00, SessionId: (null)
2023-07-07T13:06:56.2234881Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync start. MessageCount = 1, LockToken = d56c5fe9-ad44-4932-9bc4-f56ddd343029
2023-07-07T13:06:56.2315799Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync start. MessageCount = 1, LockToken = d56c5fe9-ad44-4932-9bc4-f56ddd343029
2023-07-07T13:06:56.2729874Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync done. LockToken = d56c5fe9-ad44-4932-9bc4-f56ddd343029
2023-07-07T13:06:56.2809382Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync done. LockToken = d56c5fe9-ad44-4932-9bc4-f56ddd343029
2023-07-07T13:06:56.2892855Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock complete. LockToken = d56c5fe9-ad44-4932-9bc4-f56ddd343029
2023-07-07T13:06:56.2957093Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock complete. LockToken = d56c5fe9-ad44-4932-9bc4-f56ddd343029
2023-07-07T13:06:56.3043456Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock start. MessageCount = 1, LockToken = d56c5fe9-ad44-4932-9bc4-f56ddd343029
2023-07-07T13:06:56.3111042Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock start. MessageCount = 1, LockToken = d56c5fe9-ad44-4932-9bc4-f56ddd343029
2023-07-07T13:07:03.5356504Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync start. MessageCount = 1, LockToken = 596c87e9-9ac0-4926-9efe-21b0cccc43c8
2023-07-07T13:07:03.5463614Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync start. MessageCount = 1, LockToken = 596c87e9-9ac0-4926-9efe-21b0cccc43c8
2023-07-07T13:07:03.5878080Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync done. LockToken = 596c87e9-9ac0-4926-9efe-21b0cccc43c8
2023-07-07T13:07:03.5963510Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync done. LockToken = 596c87e9-9ac0-4926-9efe-21b0cccc43c8
2023-07-07T13:07:03.6564759Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock complete. LockToken = 596c87e9-9ac0-4926-9efe-21b0cccc43c8
2023-07-07T13:07:03.6669621Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock complete. LockToken = 596c87e9-9ac0-4926-9efe-21b0cccc43c8
2023-07-07T13:07:03.6766371Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock start. MessageCount = 1, LockToken = 596c87e9-9ac0-4926-9efe-21b0cccc43c8
2023-07-07T13:07:03.6829058Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock start. MessageCount = 1, LockToken = 596c87e9-9ac0-4926-9efe-21b0cccc43c8
2023-07-07T13:09:16.9898650Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync start. MessageCount = 1, LockToken = 7722d244-7960-4b30-8be4-7795146b1eab
2023-07-07T13:09:17.0190443Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync start. MessageCount = 1, LockToken = 7722d244-7960-4b30-8be4-7795146b1eab
2023-07-07T13:09:17.0670662Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync done. LockToken = 7722d244-7960-4b30-8be4-7795146b1eab
2023-07-07T13:09:17.0837866Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync done. LockToken = 7722d244-7960-4b30-8be4-7795146b1eab
2023-07-07T13:09:17.0971446Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock complete. LockToken = 7722d244-7960-4b30-8be4-7795146b1eab
2023-07-07T13:09:17.1033924Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock complete. LockToken = 7722d244-7960-4b30-8be4-7795146b1eab
2023-07-07T13:09:17.1095211Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock start. MessageCount = 1, LockToken = 7722d244-7960-4b30-8be4-7795146b1eab
2023-07-07T13:09:17.1162605Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock start. MessageCount = 1, LockToken = 7722d244-7960-4b30-8be4-7795146b1eab
2023-07-07T13:09:24.3397129Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync start. MessageCount = 1, LockToken = 0a0f5db8-102f-4541-b31d-fb0d64c71d6b
2023-07-07T13:09:24.3490555Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync start. MessageCount = 1, LockToken = 0a0f5db8-102f-4541-b31d-fb0d64c71d6b
2023-07-07T13:09:24.3907608Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync done. LockToken = 0a0f5db8-102f-4541-b31d-fb0d64c71d6b
2023-07-07T13:09:24.4002318Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync done. LockToken = 0a0f5db8-102f-4541-b31d-fb0d64c71d6b
2023-07-07T13:09:24.4116821Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock complete. LockToken = 0a0f5db8-102f-4541-b31d-fb0d64c71d6b
2023-07-07T13:09:24.4196329Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock complete. LockToken = 0a0f5db8-102f-4541-b31d-fb0d64c71d6b
2023-07-07T13:09:24.4278918Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock start. MessageCount = 1, LockToken = 0a0f5db8-102f-4541-b31d-fb0d64c71d6b
2023-07-07T13:09:24.4360834Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock start. MessageCount = 1, LockToken = 0a0f5db8-102f-4541-b31d-fb0d64c71d6b
2023-07-07T13:11:39.2540222Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync start. MessageCount = 1, LockToken = e318f9a9-5eb8-4189-84d2-8c6978910374
2023-07-07T13:11:39.2653557Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync start. MessageCount = 1, LockToken = e318f9a9-5eb8-4189-84d2-8c6978910374
2023-07-07T13:11:39.3144934Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync done. LockToken = e318f9a9-5eb8-4189-84d2-8c6978910374
2023-07-07T13:11:39.3284840Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync done. LockToken = e318f9a9-5eb8-4189-84d2-8c6978910374
2023-07-07T13:11:39.3452052Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock complete. LockToken = e318f9a9-5eb8-4189-84d2-8c6978910374
2023-07-07T13:11:39.3603552Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock complete. LockToken = e318f9a9-5eb8-4189-84d2-8c6978910374
2023-07-07T13:11:39.3754840Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock start. MessageCount = 1, LockToken = e318f9a9-5eb8-4189-84d2-8c6978910374
2023-07-07T13:11:39.3903464Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock start. MessageCount = 1, LockToken = e318f9a9-5eb8-4189-84d2-8c6978910374
2023-07-07T13:11:45.6250706Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync start. MessageCount = 1, LockToken = d56c5fe9-ad44-4932-9bc4-f56ddd343029
2023-07-07T13:11:45.6439902Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync start. MessageCount = 1, LockToken = d56c5fe9-ad44-4932-9bc4-f56ddd343029
2023-07-07T13:11:45.6964420Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync done. LockToken = d56c5fe9-ad44-4932-9bc4-f56ddd343029
2023-07-07T13:11:45.7071201Z	INFO	[Azure.Messaging.ServiceBus]	[RenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver: RenewLockAsync done. LockToken = d56c5fe9-ad44-4932-9bc4-f56ddd343029
2023-07-07T13:11:45.7192980Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock complete. LockToken = d56c5fe9-ad44-4932-9bc4-f56ddd343029
2023-07-07T13:11:45.7308851Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockCompleteCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock complete. LockToken = d56c5fe9-ad44-4932-9bc4-f56ddd343029
2023-07-07T13:11:45.7390228Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock start. MessageCount = 1, LockToken = d56c5fe9-ad44-4932-9bc4-f56ddd343029
2023-07-07T13:11:45.7507782Z	INFO	[Azure.Messaging.ServiceBus]	[ProcessorRenewMessageLockStartCore]	test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d: Processor RenewMessageLock start. MessageCount = 1, LockToken = d56c5fe9-ad44-4932-9bc4-f56ddd343029

I've only seen two instances of CreateReceiveLink (and the corresponding completion logs):

2023-07-07T11:18:11.1496322Z	INFO	[Azure.Messaging.ServiceBus]	[CreateReceiveLinkStart]	Creating receive link for Identifier: test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver.
2023-07-07T11:18:11.1559131Z	INFO	[Azure.Messaging.ServiceBus]	[CreateReceiveLinkStart]	Creating receive link for Identifier: test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver.
[...]
2023-07-07T11:18:32.4554430Z	INFO	[Azure.Messaging.ServiceBus]	[CreateReceiveLinkComplete]	Receive link created for Identifier: test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver. Session Id: 
2023-07-07T11:18:32.4612739Z	INFO	[Azure.Messaging.ServiceBus]	[CreateReceiveLinkComplete]	Receive link created for Identifier: test-topic/Subscriptions/test-sub-37276518-e13a-4362-8888-9bb7fb2d029d-Receiver. Session Id: 

You'll notice that they only happened once around the start-up time (almost 2 hrs before the exception) and never re-occurred.

Let me know if I can be of any more help!

@github-actions github-actions bot added needs-team-attention Workflow: This issue needs attention from Azure service team or SDK team and removed needs-author-feedback Workflow: More information is needed from author to address the issue. labels Jul 7, 2023
jsquire (Member) commented Jul 7, 2023

Thanks, @pzaj2. I don't see the pattern that I was looking for, so we'll need to dig deeper. I'm going to ask a colleague to step in and take the lead for the investigation.

@jsquire jsquire assigned JoshLove-msft and unassigned jsquire Jul 7, 2023
jsquire (Member) commented Jul 7, 2023

@JoshLove-msft : I wasn't able to repro locally or in Azure. Would you take point on digging into this, please?

pzaj2 (Author) commented Jul 10, 2023

@jsquire sure thing. @JoshLove-msft should you need any further help from me please let me know!

JoshLove-msft (Member) commented:

> @jsquire sure thing. @JoshLove-msft should you need any further help from me please let me know!

Sounds good - will take a look.

JoshLove-msft (Member) commented:

I haven't been able to reproduce this. The only thing I can think of that may be causing this is thread starvation in the application, with the renew-lock task not being scheduled in time. A few things to try (see the configuration sketch below):

  • Can you pass an explicit value of MaxConcurrentCalls of 1 and see if it repros?
  • Are you able to collect a process dump when the issue repros?
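
For the first item, a configuration sketch (assuming the standard WebJobs host builder; the attached repro's actual setup may differ). A process dump can be captured with dotnet-dump collect -p <pid> once the issue reproduces.

    using Microsoft.Extensions.Hosting;

    var host = Host.CreateDefaultBuilder()
        .ConfigureWebJobs(b =>
        {
            b.AddServiceBus(options =>
            {
                // Disable concurrent dispatch for the repro, as suggested above.
                options.MaxConcurrentCalls = 1;
            });
        })
        .Build();

    await host.RunAsync();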

@JoshLove-msft JoshLove-msft added the needs-author-feedback Workflow: More information is needed from author to address the issue. label Jul 11, 2023
@github-actions github-actions bot removed the needs-team-attention Workflow: This issue needs attention from Azure service team or SDK team label Jul 11, 2023

pzaj2 (Author) commented Jul 12, 2023

> I haven't been able to reproduce this. The only thing I can think of that may be causing this is thread starvation in the application, with the renew-lock task not being scheduled in time. A few things to try: Can you pass an explicit value of MaxConcurrentCalls of 1 and see if it repros? Are you able to collect a process dump when the issue repros?

Right, I'll try reproducing it with concurrency switched off. I'll also try to collect the process dump while I'm at it (should I manage to reproduce it).

A couple of questions in the meantime:

  1. My understanding is that, if it's thread starvation, it would not sit with the Azure SDK - just to confirm.
  2. Is there any way (without gutting the SDK, that is) to decrease the interval at which the locks are renewed? I couldn't find anything, but perhaps there's a hidden gem somewhere.

Anyhow, I'll let you know of any developments as I try to reproduce it and collect the process dump.

P.S. The workaround I mentioned earlier works fine - we haven't observed any retries or LockLostException occurrences. Would thread starvation not affect this workaround too? For context, the delay is configured to 4.5 minutes and the lock duration to 5 minutes, so perhaps that 30-second gap makes it immune to potential thread starvation.

@github-actions github-actions bot added needs-team-attention Workflow: This issue needs attention from Azure service team or SDK team and removed needs-author-feedback Workflow: More information is needed from author to address the issue. labels Jul 12, 2023
pzaj2 (Author) commented Jul 12, 2023

@JoshLove-msft following your request, I've had it running since early morning today. I only came across an exception around 1.45 pm (UK time), but I can confirm that it's still happening while MaxConcurrentCalls is set to 1. Could this be related to using AAD to authenticate with Service Bus? It's configured to use the default credentials, so it goes through quite a few "providers" before reaching the one that actually works (locally, at least).

I've attached unfiltered logs from the moment the repro app started until about 2 pm (1 pm UK time), when I realised there had been an exception.
logs.txt

I also have the process dump (or at least I hope that's what you wanted!) but it's too big to be uploaded here - how can I share it? I uploaded it to Google Drive and here's the link: https://drive.google.com/file/d/1KdDd5bbC1iP_3EfP8pIyalJtULyOMvW8/view?usp=drive_link. Unfortunately, I can't share it publicly due to my company's Google Drive limitations, but if you request access I'll grant it ASAP.

If for any reason you'd prefer me to share it in a different way, please let me know. Note that the dump is from 20 minutes after the initial exception, as I only noticed the exception then.

JoshLove-msft (Member) commented:

> A couple of questions in the meantime:
>
>   1. My understanding is that, if it's thread starvation, it would not sit with the Azure SDK - just to confirm.

It would depend on a lot of factors, like your machine, the .NET version being used, and your application code in addition to the SDK code.

>   2. Is there any way (without gutting the SDK, that is) to decrease the interval at which the locks are renewed? I couldn't find anything, but perhaps there's a hidden gem somewhere.

The lock renewal interval is based on the lock duration for the entity. You can make the lock duration longer via the Portal or management APIs so that the lock renewals will be less frequent.
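
For what it's worth, the lock duration can also be changed from code via the administration client; a sketch, assuming the entity names from the repro and AAD credentials (5 minutes is the service maximum):

    using System;
    using Azure.Identity;
    using Azure.Messaging.ServiceBus.Administration;

    var admin = new ServiceBusAdministrationClient(
        "<namespace>.servicebus.windows.net", new DefaultAzureCredential());

    // Fetch the subscription, lengthen its lock, and push the update back.
    SubscriptionProperties sub = await admin.GetSubscriptionAsync("test-topic", "test-sub");
    sub.LockDuration = TimeSpan.FromMinutes(5); // service maximum
    await admin.UpdateSubscriptionAsync(sub);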

JoshLove-msft (Member) commented:

Looking at these logs, there definitely seems to be something related to starvation. The log entries below should be emitted every 59 seconds:

2023-07-12T12:25:05.3883589Z	INFO	[TestLockLost.TestTrigger]	[0]	Current Time is 07/12/2023 12:25:05, iteration 1, running until 07/12/2023 12:45:06
2023-07-12T12:26:04.3981848Z	INFO	[TestLockLost.TestTrigger]	[0]	Current Time is 07/12/2023 12:26:04, iteration 2, running until 07/12/2023 12:45:06
2023-07-12T12:30:02.2429996Z	INFO	[TestLockLost.TestTrigger]	[0]	Current Time is 07/12/2023 12:30:02, iteration 3, running until 07/12/2023 12:45:06

Note that there is almost 4 minutes between the 2nd and 3rd.

Could you share your updated app?

pzaj2 (Author) commented Jul 14, 2023

> Looking at these logs, there definitely seems to be something related to starvation. The log entries below should be emitted every 59 seconds:
>
> 2023-07-12T12:25:05.3883589Z	INFO	[TestLockLost.TestTrigger]	[0]	Current Time is 07/12/2023 12:25:05, iteration 1, running until 07/12/2023 12:45:06
> 2023-07-12T12:26:04.3981848Z	INFO	[TestLockLost.TestTrigger]	[0]	Current Time is 07/12/2023 12:26:04, iteration 2, running until 07/12/2023 12:45:06
> 2023-07-12T12:30:02.2429996Z	INFO	[TestLockLost.TestTrigger]	[0]	Current Time is 07/12/2023 12:30:02, iteration 3, running until 07/12/2023 12:45:06
>
> Note that there is almost 4 minutes between the 2nd and 3rd.
>
> Could you share your updated app?

Oh, I'll be honest - I hadn't really paid much attention to those logs. Nice catch!

Apologies for the delays - I've had to prioritise other things over the last 2 days. Here's the repro app including the updates. Please bear in mind that the exceptions occur both with and without the file-logging provider - I only added it to collect the logs.
ServiceBusLockLostRepro.zip

pzaj2 (Author) commented Jul 17, 2023

> The lock renewal interval is based on the lock duration for the entity. You can make the lock duration longer via the Portal or management APIs so that the lock renewals will be less frequent.

And one more thing on that, since I just remembered. I should've been clearer: what I'm looking for is the ability to increase the gap between the renewal and the expiry, i.e. if the renewal currently starts 5 seconds before the lock would otherwise expire, I'd like to increase that to 30 seconds. While this isn't ideal, it would most likely resolve the issue in the short/mid-term for people who struggle with this, until the long-term solution (AMQP's durable terminus) is in place.

JoshLove-msft (Member) commented:

> And one more thing on that, since I just remembered. I should've been clearer: what I'm looking for is the ability to increase the gap between the renewal and the expiry, i.e. if the renewal currently starts 5 seconds before the lock would otherwise expire, I'd like to increase that to 30 seconds.

This isn't configurable. I've been unable to repro the issue, but we have recently made a fix to how settlement works so that you are less likely to hit these lock-lost errors - #37704. This will be released in August.

JoshLove-msft (Member) commented:

Closing this out, as I haven't been able to repro and we have made some improvements that should make lock-lost issues less likely to occur.

@github-actions github-actions bot locked and limited conversation to collaborators Nov 1, 2023