Tuesday, June 29, 2021

Writing a PowerShell cmdlet in C# for Azure Service Bus entities.

 

Problem statement: Azure Service Bus is a Microsoft public cloud resource comparable to message brokers such as RabbitMQ and ZeroMQ, and it supports the Advanced Message Queuing Protocol (AMQP). The entities of Azure Service Bus include queues and topics. A queue allows a producer and a consumer to send and receive messages in first-in-first-out order. A topic allows messages sent by one or more producers to be distributed to several subscriptions, where subscribers are registered and receive those messages. This article describes some of the issues encountered when writing a PowerShell cmdlet for migrating Service Bus entities and their messages to another Service Bus namespace.
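As a point of reference, the send and receive path for a queue with the current Azure.Messaging.ServiceBus SDK looks roughly like the sketch below. The connection string and the queue name "orders" are placeholders, not values from the cmdlet described here.

```csharp
using System;
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;

class QueueRoundTrip
{
    // Placeholder values; substitute the real namespace connection string and queue name.
    const string ConnectionString = "<service-bus-connection-string>";
    const string QueueName = "orders";

    static async Task Main()
    {
        await using var client = new ServiceBusClient(ConnectionString);

        // Producer side: enqueue a message.
        ServiceBusSender sender = client.CreateSender(QueueName);
        await sender.SendMessageAsync(new ServiceBusMessage("hello from the producer"));

        // Consumer side: for a queue, messages come back in first-in-first-out order.
        ServiceBusReceiver receiver = client.CreateReceiver(QueueName);
        ServiceBusReceivedMessage received = await receiver.ReceiveMessageAsync(TimeSpan.FromSeconds(5));
        if (received != null)
        {
            Console.WriteLine(received.Body.ToString());
            await receiver.CompleteMessageAsync(received);
        }
    }
}
```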

Description: PowerShell is the language of automation, especially for custom logic that is not covered by the features or built-ins available directly from a resource. Usually, custom logic is kept to a minimum, but resource providers tend to focus on improving their offerings rather than their integration. Automation to migrate one Service Bus to another could use a built-in feature of Service Bus named geo-replication. Geo-replication has two limitations. First, it replicates only the structure and not the content of the Service Bus entities. Second, it requires the source and target Service Bus to be hosted in different geographic regions, which is quite unlike some of the other Azure resources. The latter can be overcome by replicating to another region and then replicating back to the original region, but that still does not solve message replication, which is critical if the second instance is to be a faithful copy of the first.

This calls for a custom program that enumerates the Service Bus entities of one instance and recreates them on another, which is possible with the SDK that provides the NamespaceManager. The messages in these entities can be read with the help of the Software Development Kit (SDK) for Azure Service Bus. The trouble with these independent SDKs is that they require different versions of shared dependencies. One such dependency is the Windows PowerShell 5 reference assembly: the NamespaceManager has a specific requirement for assembly version 4.0.4.1, while the Azure Service Bus SDK requires much more recent versions. Changing the program to use the lowest common denominator in terms of versions is not a viable option because there is none.
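A minimal sketch of that mix, assuming the older WindowsAzure.ServiceBus package for the NamespaceManager and the newer Azure.Messaging.ServiceBus package for reading messages, is shown below; the connection strings and the queue name are placeholders. Combining the two SDKs in one process is what surfaces the version clash.

```csharp
using System;
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;          // newer data-plane SDK for message content
using Microsoft.ServiceBus;                // older SDK that carries NamespaceManager
using Microsoft.ServiceBus.Messaging;

class EntityAndMessageCopy
{
    static async Task Main()
    {
        // Placeholder connection strings for the source and target namespaces.
        string source = "<source-connection-string>";
        string target = "<target-connection-string>";

        // Structure: enumerate queues with the old NamespaceManager and recreate
        // them on the target namespace (passing the full QueueDescription would
        // also carry over settings such as duplicate detection).
        var sourceManager = NamespaceManager.CreateFromConnectionString(source);
        var targetManager = NamespaceManager.CreateFromConnectionString(target);

        foreach (QueueDescription queue in sourceManager.GetQueues())
        {
            if (!targetManager.QueueExists(queue.Path))
            {
                targetManager.CreateQueue(queue.Path);
            }
        }

        // Content: peek the messages of one queue with the newer SDK.
        await using var client = new ServiceBusClient(source);
        ServiceBusReceiver receiver = client.CreateReceiver("orders"); // hypothetical queue name
        foreach (ServiceBusReceivedMessage message in await receiver.PeekMessagesAsync(maxMessages: 32))
        {
            Console.WriteLine($"{message.MessageId}: {message.Body}");
        }
    }
}
```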

The error encountered appears something like this:

Azure.Messaging.ServiceBus.ServiceBusException: Could not load file or assembly 'System.Runtime.CompilerServices.Unsafe, Version=4.0.4.1, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a' or one of its dependencies. The system cannot find the file specified. (ServiceCommunicationProblem) ---> System.IO.FileNotFoundException: Could not load file or assembly 'System.Runtime.CompilerServices.Unsafe, Version=4.0.4.1, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a' or one of its dependencies. The system cannot find the file specified.

A binding redirect issued in the application configuration to force runtime dependencies to bind to the higher version also does not mitigate this error.
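For reference, the kind of redirect that was attempted looks roughly like the App.config fragment below. The publicKeyToken comes from the error above; the version numbers are placeholders to be matched to the Unsafe assembly actually deployed alongside the cmdlet.

```xml
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <assemblyIdentity name="System.Runtime.CompilerServices.Unsafe"
                          publicKeyToken="b03f5f7f11d50a3a"
                          culture="neutral" />
        <!-- newVersion is a placeholder; match it to the assembly version in the output folder. -->
        <bindingRedirect oldVersion="0.0.0.0-4.0.6.0" newVersion="4.0.6.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>
```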

In addition, the target framework was changed from netstandard2.1 to netcoreapp3.1 and finally to net48, which results in other kinds of assembly errors. These include:

FileNotFoundException: Could not load file or assembly 'Microsoft.Bcl.AsyncInterfaces, Version=1.0.0.0, Culture=neutral, PublicKeyToken=cc7b13ffcd2ddd51' or one of its dependencies. The system cannot find the file specified. 

and: System.AggregateException, HResult=0x80131500, Message=One or more errors occurred., Source=mscorlib

Each of those framework options results in a lot of trial and error. The cmdlet also has dependencies other than the two mentioned above, and any change to the versioning changes their versions as well.
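For completeness, the project file portion of that trial and error is small; a sketch of the relevant fragment is below. The two binding-redirect properties are standard MSBuild switches for class-library outputs such as a cmdlet assembly; whether they help depends on the host that loads the cmdlet, and in this case the workaround in the conclusion was still needed.

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <!-- netstandard2.1 and netcoreapp3.1 were tried before settling on net48. -->
    <TargetFramework>net48</TargetFramework>
    <!-- A cmdlet builds as a class library, so these ask MSBuild to emit the
         binding redirects into the output .dll.config rather than relying on
         a hand-written App.config. -->
    <AutoGenerateBindingRedirects>true</AutoGenerateBindingRedirects>
    <GenerateBindingRedirectsOutputType>true</GenerateBindingRedirectsOutputType>
  </PropertyGroup>
</Project>
```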

Conclusion: Registering the required assembly in the Global Assembly Cache via an offline mechanism gets us past the error, and although it is not a great experience, it enables the automation to run.
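A sketch of one such offline mechanism, assuming an elevated process and a hypothetical local path to the DLL, uses the Publish type from System.EnterpriseServices, which performs the same registration as gacutil.exe /i:

```csharp
using System.EnterpriseServices.Internal;

class GacRegistration
{
    static void Main()
    {
        // Hypothetical path to the exact assembly version the NamespaceManager path asks for.
        // Requires a reference to System.EnterpriseServices, administrative rights,
        // and a strong-named assembly.
        var publish = new Publish();
        publish.GacInstall(@"C:\temp\System.Runtime.CompilerServices.Unsafe.dll");
    }
}
```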

