
[BUG] Microsoft.Azure.ServiceBus - System.InvalidOperationException: Can't create session when the connection is closing #13637

Closed
mr-davidc opened this issue Jul 22, 2020 · 18 comments
Labels: Client · customer-reported · needs-team-attention · question · Service Attention · Service Bus

Comments

mr-davidc commented Jul 22, 2020

This issue is related to #9416 however I was asked to open a fresh thread.

Describe the bug

For quite some time now, our Azure Functions instances running in AKS have been intermittently reporting the exceptions below to Sentry.

We have a pod running .NET Core 2.2.8 with Functions v2 in our Production Kubernetes cluster and, after a recent upgrade, a different pod running .NET Core 3.1.5 with Functions v3 in our Sandbox cluster; the exceptions are still being received from both pods intermittently. It seems to happen at random times, often days apart. I hoped upgrading to Functions v3 might resolve the issue, but alas it persists.

The production Functions pod references Microsoft.Azure.ServiceBus v3.4.0 and the sandbox Functions pod references Microsoft.Azure.ServiceBus v4.1.3.

The exception also seems to occur regardless of whether the function definition is for a Queue or Topic trigger.

Actual behavior (include Exception or Stack Trace)

Exception message:
Message processing error (Action=Receive, ClientId=MessageReceiver12account-events/Subscriptions/new-account-setup, EntityPath=account-events/Subscriptions/new-account-setup, Endpoint=sndbx-sb-project-au.servicebus.windows.net)

Note: This happens across many different Service Bus queues/topics; the exception message often refers to a different queue/topic each time.

Stack Trace:
System.InvalidOperationException: Can't create session when the connection is closing.
Module "Microsoft.Azure.ServiceBus.Core.MessageReceiver", in OnReceiveAsync
Module "System.Runtime.ExceptionServices.ExceptionDispatchInfo", in Throw
Module "System.Runtime.CompilerServices.TaskAwaiter", in ThrowForNonSuccess
Module "System.Runtime.CompilerServices.TaskAwaiter", in HandleNonSuccessAndDebuggerNotification
Module "Microsoft.Azure.ServiceBus.Core.MessageReceiver+<>c__DisplayClass64_0+<b__0>d", in MoveNext
Module "System.Runtime.ExceptionServices.ExceptionDispatchInfo", in Throw
Module "System.Runtime.CompilerServices.TaskAwaiter", in ThrowForNonSuccess
Module "Microsoft.Azure.ServiceBus.RetryPolicy", in RunOperation
Module "System.Runtime.ExceptionServices.ExceptionDispatchInfo", in Throw
Module "Microsoft.Azure.ServiceBus.RetryPolicy", in RunOperation
Module "System.Runtime.ExceptionServices.ExceptionDispatchInfo", in Throw
Module "System.Runtime.CompilerServices.TaskAwaiter", in ThrowForNonSuccess
Module "System.Runtime.CompilerServices.TaskAwaiter", in HandleNonSuccessAndDebuggerNotification
Module "Microsoft.Azure.ServiceBus.Core.MessageReceiver", in ReceiveAsync
Module "System.Runtime.ExceptionServices.ExceptionDispatchInfo", in Throw
Module "System.Runtime.CompilerServices.TaskAwaiter", in ThrowForNonSuccess
Module "System.Runtime.CompilerServices.TaskAwaiter", in HandleNonSuccessAndDebuggerNotification
Module "Microsoft.Azure.ServiceBus.Core.MessageReceiver", in ReceiveAsync
Module "System.Runtime.ExceptionServices.ExceptionDispatchInfo", in Throw
Module "System.Runtime.CompilerServices.TaskAwaiter", in ThrowForNonSuccess
Module "System.Runtime.CompilerServices.TaskAwaiter", in HandleNonSuccessAndDebuggerNotification
Module "Microsoft.Azure.ServiceBus.MessageReceivePump+<b__11_0>d", in MoveNext

Another interesting piece of information: I am also receiving the following exception at essentially the same time:

System.ObjectDisposedException: Cannot access a disposed object.
Object name: '$cbs'.
Module "Microsoft.Azure.ServiceBus.Core.MessageReceiver", in OnReceiveAsync
Module "System.Runtime.ExceptionServices.ExceptionDispatchInfo", in Throw
Module "System.Runtime.CompilerServices.TaskAwaiter", in ThrowForNonSuccess
Module "System.Runtime.CompilerServices.TaskAwaiter", in HandleNonSuccessAndDebuggerNotification
Module "Microsoft.Azure.ServiceBus.Core.MessageReceiver+<>c__DisplayClass64_0+<b__0>d", in MoveNext
Module "System.Runtime.ExceptionServices.ExceptionDispatchInfo", in Throw
Module "System.Runtime.CompilerServices.TaskAwaiter", in ThrowForNonSuccess
Module "Microsoft.Azure.ServiceBus.RetryPolicy", in RunOperation
Module "System.Runtime.ExceptionServices.ExceptionDispatchInfo", in Throw
Module "Microsoft.Azure.ServiceBus.RetryPolicy", in RunOperation
Module "System.Runtime.ExceptionServices.ExceptionDispatchInfo", in Throw
Module "System.Runtime.CompilerServices.TaskAwaiter", in ThrowForNonSuccess
Module "System.Runtime.CompilerServices.TaskAwaiter", in HandleNonSuccessAndDebuggerNotification
Module "Microsoft.Azure.ServiceBus.Core.MessageReceiver", in ReceiveAsync
Module "System.Runtime.ExceptionServices.ExceptionDispatchInfo", in Throw
Module "System.Runtime.CompilerServices.TaskAwaiter", in ThrowForNonSuccess
Module "System.Runtime.CompilerServices.TaskAwaiter", in HandleNonSuccessAndDebuggerNotification
Module "Microsoft.Azure.ServiceBus.Core.MessageReceiver", in ReceiveAsync
Module "System.Runtime.ExceptionServices.ExceptionDispatchInfo", in Throw
Module "System.Runtime.CompilerServices.TaskAwaiter", in ThrowForNonSuccess
Module "System.Runtime.CompilerServices.TaskAwaiter", in HandleNonSuccessAndDebuggerNotification
Module "Microsoft.Azure.ServiceBus.MessageReceivePump+<b__11_0>d", in MoveNext

To Reproduce
Not sure how to reproduce it; it happens intermittently once the Functions project is deployed. I have never encountered this exception when debugging locally.

An example of one of the Topic trigger function definitions is:

[FunctionName(nameof(AccountSetupTrigger))]
public async Task Run([ServiceBusTrigger("account-events", "new-account-setup", Connection = "ServiceBus")] AccountSetupMessage message, ILogger logger)

This is the csproj file (for the Sandbox Functions V3):

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netcoreapp3.1</TargetFramework>
    <AzureFunctionsVersion>v3</AzureFunctionsVersion>
  </PropertyGroup>
  
  <ItemGroup>
    <PackageReference Include="Microsoft.Azure.Functions.Extensions" Version="1.0.0" />
    <PackageReference Include="Microsoft.Azure.WebJobs.Extensions.ServiceBus" Version="4.1.2" />
    <PackageReference Include="Microsoft.NET.Sdk.Functions" Version="3.0.7" />
    <PackageReference Include="Sentry.AspNetCore" Version="2.1.4" />
    <PackageReference Include="Stripe.net" Version="34.16.0" />
    <PackageReference Include="Xero.Api.SDK.Core" Version="1.1.4" />
  </ItemGroup>
  
  <ItemGroup>
    <ProjectReference Include="..\Common\Common.csproj" />
    <ProjectReference Include="..\Data\Data.csproj" />
  </ItemGroup>
  
  <ItemGroup>
    <None Update="host.json">
      <CopyToOutputDirectory>Always</CopyToOutputDirectory>
    </None>
    <None Update="appsettings.json">
      <CopyToOutputDirectory>Always</CopyToOutputDirectory>
    </None>
    <None Update="local.settings.json">
      <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
      <CopyToPublishDirectory>Never</CopyToPublishDirectory>
    </None>
  </ItemGroup>
</Project>

And the host.json file:

{
  "version": "2.0",
  "logging": {
    "logLevel": {
      "default": "Information"
    }
  },
  "extensions": {
    "serviceBus": {
      "messageHandlerOptions": {
        "maxConcurrentCalls": 1
      }
    }
  },
  "functions": [
    "AccountSetupTrigger",
    // Lots of other triggers here too...
  ]
}

Environment:

  • Microsoft.Azure.ServiceBus 4.1.3 and Microsoft.Azure.ServiceBus 3.4.0
  • Azure Functions V2 and V3
  • .NET Core 2.2.8 and .NET Core 3.1.5
  • AKS v1.14.8

Let me know if you require any more information and thanks in advance for your assistance.

ghost added the needs-triage, customer-reported, and question labels on Jul 22, 2020
mr-davidc changed the title from "[BUG]" to "[BUG] Microsoft.Azure.ServiceBus - System.InvalidOperationException: Can't create session when the connection is closing" on Jul 22, 2020
jsquire added the Client, needs-team-attention, Service Attention, and Service Bus labels on Jul 22, 2020
ghost removed the needs-triage label on Jul 22, 2020
ghost commented Jul 22, 2020

Thanks for the feedback! We are routing this to the appropriate team for follow-up. cc @jfggdl.

DorothySun216 (Contributor)

@mr-davidc Thanks for reaching out. You mentioned you have two environments and they use Microsoft.Azure.ServiceBus v3.4.0 and Microsoft.Azure.ServiceBus v4.1.3 respectively. We had a fix in Microsoft.Azure.ServiceBus v4.1.0 that converts this System.InvalidOperationException into a ServiceBusCommunicationException, so Microsoft.Azure.ServiceBus v4.1.3 should not see this error, but Microsoft.Azure.ServiceBus v3.4.0 likely will, since it doesn't have this fix. Are you sure the stack trace you shared is from Microsoft.Azure.ServiceBus v4.1.3?

Is it convenient to use Microsoft.Azure.ServiceBus v4.1.3 in both environments and test it out?

mr-davidc (Author)

Hi @DorothySun216, I just double checked and yes I can confirm the exceptions I am currently seeing are being thrown from within v4.1.3 of Microsoft.Azure.ServiceBus in our Sandbox environment.

The most recent set of exceptions recorded by Sentry was 4 days ago on 30-07. I have attached a screenshot of the Sentry page showing the version of the ServiceBus package and a screenshot of the number of exceptions received on that particular day. I should note that each of the exceptions in the list of exceptions refers to a different Service Bus queue/topic.

In terms of using v4.1.3 in both environments, unfortunately that is not yet possible, as we are still waiting for some other development work to be completed before it can be rolled out to our Production environment.

I am happy to provide any other information which might help track the issue down.

Thanks

[Screenshots attached: the Sentry event showing the ServiceBus package version, and the count of exceptions received that day]

mr-davidc (Author)

@DorothySun216

Just 13 hours ago, the Functions project experienced more of these exceptions. Interestingly, another, completely different Functions project (running the same Functions and ServiceBus DLL versions), deployed in the same cluster but in a separate Kubernetes pod, ALSO experienced the same exceptions at that time...

Any ideas?

DorothySun216 (Contributor)

@mr-davidc Thanks for confirming the version. Can you share a snippet of the code that runs into this error, so we can see if we can reproduce it? Are you using ReceiveAsync or RegisterMessageHandler? If we can reproduce it, we could translate this exception into a communication exception so that the SDK's retry logic will retry it automatically.
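
For reference, outside of the Functions runtime the two receive patterns in question look roughly like this with Microsoft.Azure.ServiceBus. This is only a sketch (connection string and queue name are placeholders), not the internal pump code; the stack traces above point at MessageReceivePump, which backs RegisterMessageHandler.

using System;
using System.Threading.Tasks;
using Microsoft.Azure.ServiceBus;
using Microsoft.Azure.ServiceBus.Core;

class ReceivePatterns
{
    // Pattern 1: pull model via MessageReceiver.ReceiveAsync.
    static async Task ReceiveOnce(string connectionString)
    {
        var receiver = new MessageReceiver(connectionString, "update-account-balances");
        Message message = await receiver.ReceiveAsync(TimeSpan.FromSeconds(30));
        if (message != null)
        {
            await receiver.CompleteAsync(message.SystemProperties.LockToken);
        }
        await receiver.CloseAsync();
    }

    // Pattern 2: push model via RegisterMessageHandler.
    static void RegisterHandler(string connectionString)
    {
        var client = new QueueClient(connectionString, "update-account-balances");

        var options = new MessageHandlerOptions(args =>
        {
            // ServiceBusCommunicationException is retried by the pump; the
            // InvalidOperationException above currently surfaces here instead.
            Console.WriteLine($"Pump error: {args.Exception}");
            return Task.CompletedTask;
        })
        {
            MaxConcurrentCalls = 1,
            AutoComplete = true
        };

        client.RegisterMessageHandler(
            (message, cancellationToken) =>
            {
                Console.WriteLine($"Received {message.MessageId}");
                return Task.CompletedTask;
            },
            options);
    }
}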

mr-davidc (Author)

That's the thing, @DorothySun216: I'm not actually using ReceiveAsync or RegisterMessageHandler explicitly anywhere in my code. The only place I reference anything Service Bus related is in the function definition, via the ServiceBusTrigger attribute, like so:

Queue trigger example:
public async Task Run([ServiceBusTrigger("update-account-balances", Connection = "ServiceBus")] OrganisationMessage message, ILogger logger)

Topic trigger example:
public async Task Run([ServiceBusTrigger("account-events", "new-account-setup", Connection = "ServiceBus")] AccountSetupMessage message, ILogger logger)

The exceptions appear to be coming from the Functions runtime itself (the Host.Executor)?

DorothySun216 (Contributor)

@mr-davidc Thanks for the info. I will reach out to the Azure Functions team to see how they are calling our API internally since we don't have access to that code. Are you blocked on this issue?

mr-davidc (Author)

@DorothySun216 No, not blocked as such but ideally I would like to get to a point where these exceptions no longer occur as they are essentially false positives at the moment.

Looking forward to seeing what you come back with after discussing with the Azure Functions team.

Thanks for your help.

tharidu commented Aug 10, 2020

We are also experiencing the same exceptions (System.InvalidOperationException and System.ObjectDisposedException) in our Kubernetes cluster in AKS running .NET Core 3.1 (stack traces below).

It started occurring in May this year and happens intermittently. We are using Microsoft.Azure.ServiceBus 4.1.1.

{"assembly":"Microsoft.Azure.ServiceBus, Version=4.1.1.0, Culture=neutral, PublicKeyToken=7e34167dcc6d6d8c","method":"Microsoft.Azure.ServiceBus.Core.MessageReceiver+<OnReceiveAsync>d__86.MoveNext","level":0,"line":0},{"assembly":"System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e","method":"System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw","level":1,"line":0},{"assembly":"System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e","method":"System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification","level":2,"line":0},{"assembly":"Microsoft.Azure.ServiceBus, Version=4.1.1.0, Culture=neutral, PublicKeyToken=7e34167dcc6d6d8c","method":"Microsoft.Azure.ServiceBus.Core.MessageReceiver+<>c__DisplayClass64_0+<<ReceiveAsync>b__0>d.MoveNext","level":3,"line":0},{"assembly":"System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e","method":"System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw","level":4,"line":0},{"assembly":"Microsoft.Azure.ServiceBus, Version=4.1.1.0, Culture=neutral, PublicKeyToken=7e34167dcc6d6d8c","method":"Microsoft.Azure.ServiceBus.RetryPolicy+<RunOperation>d__19.MoveNext","level":5,"line":0},{"assembly":"System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e","method":"System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw","level":6,"line":0},{"assembly":"Microsoft.Azure.ServiceBus, Version=4.1.1.0, Culture=neutral, PublicKeyToken=7e34167dcc6d6d8c","method":"Microsoft.Azure.ServiceBus.RetryPolicy+<RunOperation>d__19.MoveNext","level":7,"line":0},{"assembly":"System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e","method":"System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw","level":8,"line":0},{"assembly":"System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e","method":"System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification","level":9,"line":0},{"assembly":"Microsoft.Azure.ServiceBus, Version=4.1.1.0, Culture=neutral, PublicKeyToken=7e34167dcc6d6d8c","method":"Microsoft.Azure.ServiceBus.Core.MessageReceiver+<ReceiveAsync>d__64.MoveNext","level":10,"line":0},{"assembly":"System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e","method":"System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw","level":11,"line":0},{"assembly":"System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e","method":"System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification","level":12,"line":0},{"assembly":"Microsoft.Azure.ServiceBus, Version=4.1.1.0, Culture=neutral, PublicKeyToken=7e34167dcc6d6d8c","method":"Microsoft.Azure.ServiceBus.Core.MessageReceiver+<ReceiveAsync>d__62.MoveNext","level":13,"line":0},{"assembly":"System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e","method":"System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw","level":14,"line":0},{"assembly":"System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e","method":"System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification","level":15,"line":0}

and
{"assembly":"Microsoft.Azure.ServiceBus, Version=4.1.1.0, Culture=neutral, PublicKeyToken=7e34167dcc6d6d8c","method":"Microsoft.Azure.ServiceBus.Core.MessageReceiver+<OnReceiveAsync>d__86.MoveNext","level":0,"line":0},{"assembly":"System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e","method":"System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw","level":1,"line":0},{"assembly":"System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e","method":"System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification","level":2,"line":0},{"assembly":"Microsoft.Azure.ServiceBus, Version=4.1.1.0, Culture=neutral, PublicKeyToken=7e34167dcc6d6d8c","method":"Microsoft.Azure.ServiceBus.Core.MessageReceiver+<>c__DisplayClass64_0+<<ReceiveAsync>b__0>d.MoveNext","level":3,"line":0},{"assembly":"System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e","method":"System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw","level":4,"line":0},{"assembly":"Microsoft.Azure.ServiceBus, Version=4.1.1.0, Culture=neutral, PublicKeyToken=7e34167dcc6d6d8c","method":"Microsoft.Azure.ServiceBus.RetryPolicy+<RunOperation>d__19.MoveNext","level":5,"line":0},{"assembly":"System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e","method":"System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw","level":6,"line":0},{"assembly":"Microsoft.Azure.ServiceBus, Version=4.1.1.0, Culture=neutral, PublicKeyToken=7e34167dcc6d6d8c","method":"Microsoft.Azure.ServiceBus.RetryPolicy+<RunOperation>d__19.MoveNext","level":7,"line":0},{"assembly":"System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e","method":"System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw","level":8,"line":0},{"assembly":"System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e","method":"System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification","level":9,"line":0},{"assembly":"Microsoft.Azure.ServiceBus, Version=4.1.1.0, Culture=neutral, PublicKeyToken=7e34167dcc6d6d8c","method":"Microsoft.Azure.ServiceBus.Core.MessageReceiver+<ReceiveAsync>d__64.MoveNext","level":10,"line":0},{"assembly":"System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e","method":"System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw","level":11,"line":0},{"assembly":"System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e","method":"System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification","level":12,"line":0},{"assembly":"Microsoft.Azure.ServiceBus, Version=4.1.1.0, Culture=neutral, PublicKeyToken=7e34167dcc6d6d8c","method":"Microsoft.Azure.ServiceBus.Core.MessageReceiver+<ReceiveAsync>d__62.MoveNext","level":13,"line":0},{"assembly":"System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e","method":"System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw","level":14,"line":0},{"assembly":"System.Private.CoreLib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e","method":"System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification","level":15,"line":0}

DorothySun216 (Contributor)

@tharidu Thanks so much for reporting this. We have already created a work item to track it. However, due to our bandwidth, and because this is not a blocking error, the investigation may take some time, since we need to focus on higher-priority issues for now. I would recommend treating this as a transient error for the time being and adding a retry mechanism to deal with it. I will post an update as soon as we figure out a fix. Thanks.
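
As a rough illustration of that interim approach, a sketch assuming direct use of a MessageReceiver (the retry count and delays here are arbitrary placeholders, not an official recommendation):

using System;
using System.Threading.Tasks;
using Microsoft.Azure.ServiceBus;
using Microsoft.Azure.ServiceBus.Core;

static class TransientReceive
{
    // Treats the "connection is closing" / disposed-'$cbs' errors as transient and retries,
    // similar to what the SDK already does for ServiceBusCommunicationException.
    public static async Task<Message> ReceiveWithRetryAsync(MessageReceiver receiver, int maxAttempts = 3)
    {
        for (var attempt = 1; ; attempt++)
        {
            try
            {
                return await receiver.ReceiveAsync(TimeSpan.FromSeconds(30));
            }
            catch (Exception ex) when (attempt < maxAttempts &&
                                       (ex is InvalidOperationException ||
                                        ex is ObjectDisposedException ||
                                        ex is ServiceBusCommunicationException))
            {
                // Hypothetical back-off; tune for your workload.
                await Task.Delay(TimeSpan.FromSeconds(attempt * 2));
            }
        }
    }
}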

mr-davidc (Author)

@tharidu Do you mind if I ask what version of Kubernetes your AKS cluster is running?

tharidu commented Aug 11, 2020

@DorothySun216 Thanks for the reply (y)
@mr-davidc We are using K8s version 1.15.10 at the moment in two clusters and experiencing these exceptions in both.

mr-davidc (Author)

Ok thanks @tharidu. I was wondering if upgrading our cluster version might help in resolving the issue but I guess not.

clundy-columbia

Hello, we have also experienced the same issue and opened an incident with Microsoft. While we are not blocked, it does cause rework due to retries and failures to dead-letter. This affects several of us across many integrations.

DorothySun216 (Contributor)

We have rolled out a fix (#17023) in the latest release, 5.1.0. Can you test whether you still see the same issue with this new NuGet package? https://www.nuget.org/packages/Microsoft.Azure.ServiceBus/5.1.0

DorothySun216 (Contributor)

One customer is still seeing this issue after upgrading to 5.1.0. There is a singleton concurrent open/close bug that was fixed in version 2.4.9 of the AMQP library we depend on, which might be related to the connection-close problem in this case. Can you upgrade to the 5.1.1 NuGet package, https://www.nuget.org/packages/Microsoft.Azure.ServiceBus/, to see if the tests are passing?
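
If a project references the SDK directly, the upgrade is just a version bump in the csproj. In a Functions project the SDK normally arrives transitively via Microsoft.Azure.WebJobs.Extensions.ServiceBus, so a direct reference like the sketch below would only be there to override the transitive version:

<ItemGroup>
  <!-- Overrides the transitive SDK version to pick up the AMQP 2.4.9 fix mentioned above. -->
  <PackageReference Include="Microsoft.Azure.ServiceBus" Version="5.1.1" />
</ItemGroup>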

keodime commented Mar 19, 2021

We have the same kind of problem and it's a huge blocker for us.
We have a ServiceBusQueueTrigger that raises messages on another queue for another function, and they run on a consumption plan.
That other queue is read by the same kind of function, but that one doesn't raise any messages; it just consumes them.
The second function doesn't seem to have any problem over a few million messages, but the first one crashes badly over just 200 messages. It's always the same thing:
Message processing error (Action=Receive, ...)
Locally I can even see a ServiceBus TimeoutException 00:59: ....
There doesn't seem to be anything to configure for that.
We set these settings in host.json (placement shown in the sketch below):
"maxAutoRenewDuration": "00:10:00"
"functionTimeout": "00:10:00"
But even then messages keep replaying... most of them pass, but we always end up with dead letters, and it raises hundreds of thousands of messages that slow down our process and cost far more than it should.

We are using:
Microsoft.Azure.ServiceBus 5.1.2
Microsoft.Azure.WebJobs.Extensions.ServiceBus 4.2.1
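
For completeness, this is roughly where those two settings sit in our host.json (a sketch; other entries omitted):

{
  "version": "2.0",
  "functionTimeout": "00:10:00",
  "extensions": {
    "serviceBus": {
      "messageHandlerOptions": {
        "maxAutoRenewDuration": "00:10:00"
      }
    }
  }
}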

ramya-rao-a (Contributor)

Thanks for the updates @DorothySun216!

This issue has gone stale over time.

Since we haven't heard back from the original set of issue reporters after @DorothySun216 released updates for the Microsoft.Azure.ServiceBus package, we are going to assume that the problem has been resolved. If not, please log a new issue.

@keodime We saw your comment after @DorothySun216 posted about the fixes. Since it has been 5 months since you reported the issue, can you confirm whether you are still having the same problem? If so, please log a new issue and we can assist as needed.

github-actions bot locked and limited conversation to collaborators on Mar 28, 2023