Getting Microsoft.Azure.ServiceBus.ServiceBusCommunicationException: The messaging entity 'evt-scu-prod-bus:Topic:publish_atlas|node1000000_4b9c90e9-c2ec-49f9-b295-d392f554faeb' could not be found. #7907

Closed
schwartzma1 opened this issue Oct 3, 2019 · 9 comments
Assignees
Labels
bug This issue requires a change to an existing behavior in the product in order to be resolved. customer-reported Issues that are reported by GitHub users external to the Azure organization. Service Attention Workflow: This issue is responsible by Azure service team. Service Bus

Comments

schwartzma1 commented Oct 3, 2019

Describe the bug
We are on Microsoft.Azure.ServiceBus 3.4.0. Our services were already running and receiving messages when, without any services starting or stopping and without any configuration changes we are aware of, we started repeatedly getting this exception saying the messaging entity could not be found.

Exception or Stack Trace
Microsoft.Azure.ServiceBus.ServiceBusCommunicationException: The messaging entity 'evt-scu-prod-bus:Topic:publish_atlas|node1000000_4b9c90e9-c2ec-49f9-b295-d392f554faeb' could not be found. TrackingId:a88a3a8c-2400-428a-b3a6-efc6e3498b2d_B11, SystemTracker:evt-scu-prod-bus:Topic:publish_atlas|node1000000_4b9c90e9-c2ec-49f9-b295-d392f554faeb, Timestamp:2019-10-03T16:19:29 TrackingId:da260de14ab94faa877f87954a2cd36f_G4, SystemTracker:gateway5, Timestamp:2019-10-03T16:19:29
at Microsoft.Azure.ServiceBus.Core.MessageReceiver.d__86.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Microsoft.Azure.ServiceBus.Core.MessageReceiver.<>c__DisplayClass64_0.<b__0>d.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Microsoft.Azure.ServiceBus.RetryPolicy.d__19.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at Microsoft.Azure.ServiceBus.RetryPolicy.d__19.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Microsoft.Azure.ServiceBus.Core.MessageReceiver.d__64.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Microsoft.Azure.ServiceBus.Core.MessageReceiver.d__62.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Microsoft.Azure.ServiceBus.MessageReceivePump.d__11.MoveNext()

AsyncTaskMethodBuilder.Start => d__21.MoveNext => AppConfigCache.ExceptionReceivedHandler

To Reproduce
Not sure of the exact reproduction steps, but a code sample and the error's stack trace are attached below. Additionally, creating subscriptions on this topic or deleting subscriptions from our Azure subscription gives errors such as the one in the following screenshot, with InternalServerError: An error occurred while sending the request.

[screenshot: Azure Portal error showing InternalServerError: An error occurred while sending the request]

Code Snippet
Our setup of the subscription looks like this:

TokenProvider credentials = TokenProvider.CreateSharedAccessSignatureTokenProvider(sbConfig.AccessKeyName, sbConfig.AccessKeyValue);
ServiceBusConnectionStringBuilder connectionBuilder = new ServiceBusConnectionStringBuilder(sbConfig.ConnectionString);
_managementClient = new ManagementClient(connectionBuilder, credentials);

bool isSubscriptionExists = _managementClient.SubscriptionExistsAsync(ServiceBusConstants.TopicPublishAtlas, _subscriptionName).Result;
if (!isSubscriptionExists)
{
    try
    {
        SubscriptionDescription sd = new SubscriptionDescription(ServiceBusConstants.TopicPublishAtlas, _subscriptionName)
        {
            DefaultMessageTimeToLive = ServiceBusConstants.DefaultTTL,
            LockDuration = ServiceBusConstants.LockDuration,
            MaxDeliveryCount = ServiceBusConstants.MaxDeliveryCount_NoRetry,
            AutoDeleteOnIdle = ServiceBusConstants.AutoDeleteOnIdle
        };

        CorrelationFilter correlationFilter = new CorrelationFilter();
        correlationFilter.Properties.Add(ServiceBusConstants.SBPropNameObjectType, typeof(SysConfig).FullName);

        Task task = Task.Run(async () =>
        {
            await _managementClient.CreateSubscriptionAsync(sd, new RuleDescription("default", correlationFilter));
        });
        task.Wait();
    }
    catch (Exception ex)
    {
        AppLogger.Write(null, TraceEventType.Error, _fullName, nameof(Initialize), ex.ToString());
    }
}

_subscriberClient = new SubscriptionClient(sbConfig.ConnectionString, ServiceBusConstants.TopicPublishAtlas, _subscriptionName, ReceiveMode.PeekLock);

var messageHandlerOptions = new MessageHandlerOptions(ExceptionReceivedHandler)
{
    MaxConcurrentCalls = ServiceBusConstants.MaxConcurrentCalls,
    AutoComplete = false
};
_subscriberClient.RegisterMessageHandler(ProcessMessages, messageHandlerOptions);
#endregion

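For comparison, here is a minimal sketch of the same bootstrap written end-to-end async, avoiding the .Result / Task.Run(...).Wait() blocking in the snippet above. The names (sbConfig, ServiceBusConstants, _subscriptionName, etc.) come from our snippet; the async entry point itself is an assumption about the surrounding code:

private async Task InitializeSubscriptionAsync()
{
    // Sketch only: same setup as above, but awaited directly instead of blocking
    // on .Result / .Wait(), which can deadlock when a synchronization context is captured.
    bool subscriptionExists = await _managementClient.SubscriptionExistsAsync(ServiceBusConstants.TopicPublishAtlas, _subscriptionName);
    if (!subscriptionExists)
    {
        SubscriptionDescription sd = new SubscriptionDescription(ServiceBusConstants.TopicPublishAtlas, _subscriptionName)
        {
            DefaultMessageTimeToLive = ServiceBusConstants.DefaultTTL,
            LockDuration = ServiceBusConstants.LockDuration,
            MaxDeliveryCount = ServiceBusConstants.MaxDeliveryCount_NoRetry,
            AutoDeleteOnIdle = ServiceBusConstants.AutoDeleteOnIdle
        };

        CorrelationFilter correlationFilter = new CorrelationFilter();
        correlationFilter.Properties.Add(ServiceBusConstants.SBPropNameObjectType, typeof(SysConfig).FullName);

        await _managementClient.CreateSubscriptionAsync(sd, new RuleDescription("default", correlationFilter));
    }

    _subscriberClient = new SubscriptionClient(sbConfig.ConnectionString, ServiceBusConstants.TopicPublishAtlas, _subscriptionName, ReceiveMode.PeekLock);
    _subscriberClient.RegisterMessageHandler(ProcessMessages, new MessageHandlerOptions(ExceptionReceivedHandler)
    {
        MaxConcurrentCalls = ServiceBusConstants.MaxConcurrentCalls,
        AutoComplete = false
    });
}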

Our message handlers look like:

#region ExceptionReceivedHandler
private Task ExceptionReceivedHandler(ExceptionReceivedEventArgs exceptionReceivedEventArgs)
{
    AppLogger.Write(null, TraceEventType.Error, _fullName, nameof(ExceptionReceivedHandler), exceptionReceivedEventArgs.Exception.ToString());

    _isConnected = false;
    return Task.CompletedTask;
}
#endregion

#region ProcessMessages
private async Task ProcessMessages(Message message, CancellationToken token)
{
    try
    {
        _isConnected = true;
        string objectType = (string)CommonDataFunctions.DictionaryAtOrDefault(message.UserProperties, ServiceBusConstants.SBPropNameObjectType, "");
        if (objectType != typeof(SysConfig).FullName)
        {
            await _subscriberClient.DeadLetterAsync(message.SystemProperties.LockToken);
            return;
        }

        string messageBody = Encoding.UTF8.GetString(message.Body);
        SysConfig changedObject = JsonConvert.DeserializeObject<SysConfig>(messageBody);
        _appConfigCacheLock.EnterWriteLock();
        try
        {
            // We do not know the type of configuration element to deserialize to from the key value,
            // but the next time the element is looked up it will be deserialized correctly by its accessors.
            // This also has the benefit of using the local config file for any overrides.
            _configurationElements.Remove(changedObject.Id);
        }
        finally
        {
            _appConfigCacheLock.ExitWriteLock();
        }

        await _subscriberClient.CompleteAsync(message.SystemProperties.LockToken);
    }
    catch (Exception)
    {
        await _subscriberClient.AbandonAsync(message.SystemProperties.LockToken);
    }
}
#endregion
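As an aside, since the subscription above is created with AutoDeleteOnIdle set, a subscription that stays idle past that interval would be deleted by the service, after which the receive pump would report the entity as not found. A hedged sketch of an exception handler that surfaces that case explicitly (same fields as our handler above, plus the context carried by the event args):

private Task ExceptionReceivedHandler(ExceptionReceivedEventArgs exceptionReceivedEventArgs)
{
    // Sketch only: log which operation and entity failed, and flag entity-not-found
    // separately since it suggests the subscription itself no longer exists
    // (for example, if the AutoDeleteOnIdle interval ever elapsed).
    var context = exceptionReceivedEventArgs.ExceptionReceivedContext;
    AppLogger.Write(null, TraceEventType.Error, _fullName, nameof(ExceptionReceivedHandler),
        $"Action={context.Action}, Entity={context.EntityPath}, Endpoint={context.Endpoint}: {exceptionReceivedEventArgs.Exception}");

    if (exceptionReceivedEventArgs.Exception is MessagingEntityNotFoundException)
    {
        // Recovery would mean re-running the subscription bootstrap, not just retrying receives.
        _isConnected = false;
    }

    return Task.CompletedTask;
}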

Expected behavior
We would not expect the messaging-entity-not-found error in this case, since the topic does exist and we did not change any configuration.

Setup (please complete the following information):

  • OS: Windows / Azure
  • IDE: Visual Studio 2019
  • Version of the Library used : Microsoft.Azure.ServiceBus 3.4.0

Additional context
This has been running mostly without issue in production for the past few months. We may have seen an occasional exception like this before, but now they are happening repeatedly.

Information Checklist
Kindly make sure that you have added all of the following information above and checked off the required fields; otherwise we will treat the issue as an incomplete report.

  • [x] Bug Description Added
  • [x] Repro Steps Added
  • [x] Setup information Added
@triage-new-issues triage-new-issues bot added the needs-triage Workflow: This is a new issue that needs to be triaged to the appropriate team. label Oct 3, 2019
schwartzma1 (Author) commented Oct 3, 2019

It seems like we are actually not able to create the subscription. ManagementClient.CreateSubscriptionAsync fails with "A task was canceled" and the following stack trace.

This also happens with the Microsoft.Azure.ServiceBus 4.0 NuGet package.

at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Net.Http.HttpClient.d__58.MoveNext()
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Microsoft.Azure.ServiceBus.Management.ManagementClient.d__50.MoveNext()
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Microsoft.Azure.ServiceBus.Management.ManagementClient.d__48.MoveNext()
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Microsoft.Azure.ServiceBus.Management.ManagementClient.d__31.MoveNext()
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult()
at Eventellect.BusinessObjects.AppConfig.AppConfigCache.<>c__DisplayClass11_0.<b__0>d.MoveNext() in

This is happening at:
await _managementClient.CreateSubscriptionAsync(sd, new RuleDescription("default", correlationFilter));
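In case it helps narrow this down: a TaskCanceledException bubbling up from HttpClient usually means the management request timed out rather than being cancelled by our code. A minimal retry sketch around that call (the attempt count and delays are arbitrary assumptions):

// Sketch only: treat the timeout as transient and retry the management call a few times.
SubscriptionDescription created = null;
for (int attempt = 1; created == null; attempt++)
{
    try
    {
        created = await _managementClient.CreateSubscriptionAsync(sd, new RuleDescription("default", correlationFilter));
    }
    catch (Exception ex) when (ex is TaskCanceledException || ex is ServiceBusCommunicationException)
    {
        if (attempt >= 3) throw;
        await Task.Delay(TimeSpan.FromSeconds(5 * attempt));
    }
}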

@loarabia loarabia added Client This issue points to a problem in the data-plane of the library. customer-reported Issues that are reported by GitHub users external to the Azure organization. Service Attention Workflow: This issue is responsible by Azure service team. Service Bus labels Oct 3, 2019
@triage-new-issues triage-new-issues bot removed the needs-triage Workflow: This is a new issue that needs to be triaged to the appropriate team. label Oct 3, 2019
ghost commented Oct 3, 2019

Thanks for the feedback! We are routing this to the appropriate team for follow-up. cc @jfggdl

@jfggdl jfggdl added bug This issue requires a change to an existing behavior in the product in order to be resolved. and removed Client This issue points to a problem in the data-plane of the library. labels Oct 4, 2019
jfggdl commented Oct 4, 2019

@schwartzma1, thank you for raising this issue. Please provide the latest correlation id related to your problem if it is not the one in the screenshot provided at the top. Another good channel for taking care of this kind of issue is creating a support request in the Azure Portal. Have you tried that? If so, please share the case id number. Thanks.

jfggdl commented Oct 4, 2019

@schwartzma1, would you please create a support request using the Azure Portal? It would be the quickest approach for resolving your issue. Please include the correlation id in the support request. Thanks.

schwartzma1 (Author) commented

@jfggdl Yes, we are creating a support request. Someone else on my team created the ticket, so I am not sure of the details, but I will post the case number if I can get it next week, along with any resolution related to this issue.

nemakam (Contributor) commented Oct 5, 2019

@schwartzma1
Just FYI: After PR #7942 is published, instead of ServiceBusCommunicationException, you'll start getting MessagingEntityNotFoundException
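For anyone handling this condition explicitly while moving across that change, a small sketch that covers both versions (receiver here stands in for any IMessageReceiver and is purely illustrative):

try
{
    var message = await receiver.ReceiveAsync();
}
catch (MessagingEntityNotFoundException)
{
    // Newer packages: the topic or subscription no longer exists.
}
catch (ServiceBusCommunicationException ex) when (ex.Message.Contains("could not be found"))
{
    // Older packages surface the same condition as a communication exception.
}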

jfggdl commented Oct 7, 2019

@schwartzma1, thanks. You are in good hands with Azure Support for this kind of issue. I will be closing this issue now, but please share the support request number and I will monitor its progress. We want you to have an answer to the core issue you reported.

@jfggdl jfggdl closed this as completed Oct 7, 2019

@github-actions github-actions bot locked and limited conversation to collaborators Mar 29, 2023