Why you should start considering Azure Batch for some of your Dynamics 365 Business Central cloud workloads

At the Dynamics 365 Business Central 2021 Wave 1 Launch event, in his excellent session about the Administration Center, Dmitry Chadayev (Microsoft PM) described the improvements that Microsoft has made to the tenant database export feature. The improvements involve the underlying technology, providing a more reliable and performant way of exporting the tenant database as it grows in size over time.

Database export was previously handled by using the standard Azure SQL database export feature, but to handle this task more reliably there's now a brand new service under the hood that handles the export of these databases. This new service uses the Azure Batch service to run the export tasks:

As you can see in the above image, Microsoft's datacenters now host instances of the Azure Batch service deployed in different regions to handle the export requests. The data never leaves the region where the customer environment is created.

This service creates a pool of virtual machines (that can scale), and these pools handle the export tasks when needed (and with the needed resources).

That’s really cool I think!

But did you know that the Azure Batch service can also help you and your customers handle your own workloads in the cloud?

Azure Batch lets you run large-scale parallel and high-performance computing (HPC) batch jobs efficiently in Azure. Azure Batch creates and manages a pool of compute nodes (virtual machines), installs the applications you want to run, and schedules jobs to run on the nodes. There's no cluster or job scheduler software to install, manage, or scale. The Azure Batch service can automatically scale the cloud infrastructure/environment from a single node to thousands of virtual machine nodes instantly, on a schedule, or on demand.

If you have customers in the cloud, using Azure Batch to handle your workloads is a great and viable solution in many scenarios. You probably already know that one of my mantras is “leave the ERP doing the ERP”, and I always suggest that partners avoid writing too many external calls or complex integrations in AL and instead use external tools. I want to share here a real-world example that may also be helpful for your projects.

Imagine having a Dynamics 365 Business Central project where you need to handle tasks like:

  • massive data export
  • massive document generation for exporting documents to an external document management system
  • interactions with systems like production machines and so on

Handling these tasks from AL code and then calling an external service can have two main problems:

  • impact on tenant performance
  • reliability of the export

For such tasks, I normally prefer to have an external service that interacts with the Business Central APIs and performs the needed processing. This way, tenant performance and the user experience are not affected too much. But sometimes a single external service is not enough: you need scalability! And this is where Azure Batch can help.

To explain how to use Azure Batch in practice with Dynamics 365 Business Central, imagine that I have an external service (a console application) that handles some tasks. In this example I have an app with methods for:

  • processing Item Ledger Entries
  • processing G/L Entries
  • processing Warehouse Entries

These processes can require different resources to complete (some tasks are small, some are very intensive).

You can start by creating a new Azure Batch account in the Azure Portal:

When you create the Azure Batch account, also add a storage account (in the same Azure region as the associated Batch account).

When the Azure Batch account is provisioned, the Keys section shows the batch account credentials that you need in order to interact with this Azure Batch instance:

You need to take note of:

  • Name of your batch account
  • URL of the batch account
  • Primary access key
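
With those three values you can connect from C# code. Here is a minimal sketch using the `BatchSharedKeyCredentials` and `BatchClient` types from the `Microsoft.Azure.Batch` NuGet package (the placeholder values are obviously yours to fill in):

```csharp
using System;
using Microsoft.Azure.Batch;
using Microsoft.Azure.Batch.Auth;

// Values taken from the Keys section of your Batch account (placeholders here)
string batchAccountName = "<your batch account name>";
string batchAccountUrl  = "<your batch account URL>";
string batchAccountKey  = "<your primary access key>";

var credentials = new BatchSharedKeyCredentials(batchAccountUrl, batchAccountName, batchAccountKey);

using (BatchClient batchClient = BatchClient.Open(credentials))
{
    // Pool, job and task operations go here
}
```

The later snippets in this post assume an open `batchClient` like this one.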

Now, create a ZIP package of your external service application (my console application, with the EXE, DLLs, config files and everything you find in the BIN folder):

When you have the application’s ZIP package, navigate to the Applications section in your Azure Batch account in the Azure Portal and create a new application package (this is the application to install when the nodes join the pool):

To create the application package, you need to:

  • create an application ID
  • select a version
  • upload your application package (ZIP file)

Then, to start using Azure Batch you need to create a pool and then a job that the pool will run. You can create a Batch pool of compute nodes directly from the Azure Portal by clicking on the Pool section:

Here you can select your compute nodes (compute nodes are VMs that execute your tasks) and then you can specify properties for your pool, such as the number and size of the nodes, a Windows or Linux VM image and more.

But the interesting part of Azure Batch is that you can do all of this programmatically, and this is absolutely my preferred option. Directly from code you can:

  • create the batch pool with the needed resources (nodes)
  • create the batch job
  • start and execute the jobs
  • terminate the jobs and delete the pool (to reduce costs!)

To use Azure Batch from C# code, you need to add a reference to the Microsoft.Azure.Batch package:

The code that creates the Batch pool is the following:
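
As a hedged sketch (the pool ID, application ID and version are illustrative names, and `batchClient` is an already-open `BatchClient`), the pool creation with the Batch .NET SDK looks like this:

```csharp
// Marketplace image for the nodes: Windows Server 2012 R2 Datacenter
ImageReference imageReference = new ImageReference(
    publisher: "MicrosoftWindowsServer",
    offer: "WindowsServer",
    sku: "2012-R2-Datacenter",
    version: "latest");

VirtualMachineConfiguration vmConfiguration = new VirtualMachineConfiguration(
    imageReference: imageReference,
    nodeAgentSkuId: "batch.node.windows amd64");

// Two dedicated Standard A1 nodes ("D365BCTasksPool" is an illustrative pool ID)
CloudPool pool = batchClient.PoolOperations.CreatePool(
    poolId: "D365BCTasksPool",
    targetDedicatedComputeNodes: 2,
    virtualMachineSize: "STANDARD_A1",
    virtualMachineConfiguration: vmConfiguration);

// Install the previously uploaded application package on every node of the pool
pool.ApplicationPackageReferences = new List<ApplicationPackageReference>
{
    new ApplicationPackageReference { ApplicationId = "D365BCTasks", Version = "1.0" }
};

pool.Commit();
```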

I’m creating two nodes running Windows Server 2012 R2 Datacenter on Standard A1 VMs. Then I create the batch pool and associate the application package with it.

In the following code, I create the job, and for this job I specify the application to install when the nodes join the pool (%AZ_BATCH_APP_PACKAGE_{appId}#{appVersion}%). Then I execute the task (the taskCommandLine variable contains the command line to execute). Here I’m calling my console application (D365BCTasks.exe) with the required parameters (I need the Dynamics 365 Business Central tenant ID and environment name).
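
Sketched with the same SDK (job ID, task ID and the command-line parameters are illustrative placeholders):

```csharp
// Create the job and bind it to the pool created earlier
CloudJob job = batchClient.JobOperations.CreateJob();
job.Id = "D365BCTasksJob";   // illustrative job ID
job.PoolInformation = new PoolInformation { PoolId = "D365BCTasksPool" };
job.Commit();

// The application package is extracted on each node under this environment variable
string appPath = "%AZ_BATCH_APP_PACKAGE_D365BCTasks#1.0%";

// Command line for the task: tenant ID and environment name are placeholders
string taskCommandLine = $"cmd /c {appPath}\\D365BCTasks.exe <tenantId> <environmentName>";

CloudTask task = new CloudTask("ProcessItemLedgerEntries", taskCommandLine);
batchClient.JobOperations.AddTask("D365BCTasksJob", task);
```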

The tasks are now added to the job and executed by the Azure Batch service.

Then the code waits for the completion of all the tasks and, when they’re done, retrieves each task’s output:
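
A sketch of that monitoring and cleanup logic, using the SDK’s `TaskStateMonitor` (the 30-minute timeout is an illustrative choice):

```csharp
// Wait (up to 30 minutes) for all tasks of the job to reach the Completed state
IEnumerable<CloudTask> tasks = batchClient.JobOperations.ListTasks("D365BCTasksJob");
batchClient.Utilities.CreateTaskStateMonitor()
    .WaitAll(tasks, TaskState.Completed, TimeSpan.FromMinutes(30));

// Print each task's stdout file
foreach (CloudTask task in batchClient.JobOperations.ListTasks("D365BCTasksJob"))
{
    NodeFile output = task.GetNodeFile(Constants.StandardOutFileName);
    Console.WriteLine(output.ReadAsString());
}

// Clean up to stop paying for the nodes
batchClient.JobOperations.TerminateJob("D365BCTasksJob");
batchClient.PoolOperations.DeletePool("D365BCTasksPool");
```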

As you can see, at the end of the execution I delete the pool and I mark the job as completed.

What happens when you execute this code? A batch pool with two nodes is created on Azure (as specified in the code above):

This pool has a task assigned that now is marked as completed:

The output of my task executed in the cloud is the following (in this example my EXE application prints the number of records in Dynamics 365 Business Central affected by the operation):

All is executed in the cloud in a controlled, scalable and reliable way. Nice, isn’t it?

You can also create a recurring job schedule if you want to execute tasks continuously.
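
A hedged sketch of a recurring schedule with the SDK’s `CloudJobSchedule` (the schedule ID and the one-hour interval are illustrative):

```csharp
// Run a new job every hour on the existing pool
CloudJobSchedule jobSchedule = batchClient.JobScheduleOperations.CreateJobSchedule(
    jobScheduleId: "D365BCTasksSchedule",
    schedule: new Schedule { RecurrenceInterval = TimeSpan.FromHours(1) },
    jobSpecification: new JobSpecification
    {
        PoolInformation = new PoolInformation { PoolId = "D365BCTasksPool" }
    });

jobSchedule.Commit();
```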

So, what are the main benefits of using Azure Batch for some cloud tasks? Essentially:

  • Your apps can run more efficiently on separate, parallel compute nodes in the Azure cloud
  • You can scale your resources as needed by adding compute nodes and power
  • You can execute your workloads on Windows or Linux nodes

What about pricing (a common question)? Azure Batch pricing is very flexible. There is no charge for the Batch service itself: you are billed only for the underlying resources consumed while your batch jobs run. These include VM processing time, charged at Azure’s standard rates, and data/application storage. The pricing structure is pay-as-you-go, with no up-front or termination fees.
