How many Azure DevOps build agents do you need?

A key prerequisite for automating your Business Central builds is a build agent, which runs your build pipeline: it creates your container, compiles your app, publishes it, runs the automated tests, etc.

When you use Azure DevOps, you have the option of either using the Microsoft-hosted build agents or creating your own.

Using Microsoft's hosted agents is easy and free up to a certain number of minutes per month. But creating a Business Central container requires downloading many GB of artifacts, and because a hosted agent starts from a clean machine, it has to download them every time you run a build. That makes builds so slow that the option isn't as good as it sounded.

The time it takes to complete a build is very important. As a rule of thumb, a build should never take more than 15 minutes from the moment the developer commits a change (or creates a pull request) until the build is complete and the test results are known. If it takes longer, the developer may already be deep into the next task and lose vital momentum.

Using your own Windows build agents, either on-premises or in Azure, is in my opinion your best option when it comes to Business Central. It is as simple as setting up a Windows virtual machine and installing Docker and the build agent. That can be done via https://aka.ms/getbuildagent, or you can do it yourself (https://docs.microsoft.com/en-us/azure/devops/pipelines/agents/v2-windows?view=azure-devops).
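
If you go the do-it-yourself route, registering the agent comes down to a few commands. Here is a minimal sketch, assuming you have created a personal access token (PAT) and an agent pool in Azure DevOps; the organization name, pool name, token and agent version are placeholders:

    # Download and extract the Azure Pipelines agent
    # (check https://github.com/microsoft/azure-pipelines-agent/releases for the current version)
    mkdir C:\agent; Set-Location C:\agent
    Invoke-WebRequest -Uri "https://vstsagentpackage.azureedge.net/agent/2.210.1/vsts-agent-win-x64-2.210.1.zip" -OutFile agent.zip
    Expand-Archive agent.zip -DestinationPath .

    # Configure the agent unattended and run it as a Windows service
    # ('myorg', 'BCBuildPool' and the PAT are placeholders - use your own values)
    .\config.cmd --unattended `
        --url "https://dev.azure.com/myorg" `
        --auth pat --token "<your-PAT>" `
        --pool "BCBuildPool" `
        --agent "$env:COMPUTERNAME" `
        --runAsService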

How many agents do you need?

Depending on the size of your team, how often they commit, and how fast your build is, one agent may not be enough. If your team has just a few developers, one agent may suffice, especially if your build is fast. But you may also find that this agent quickly becomes a bottleneck: even when the actual build time is acceptable, if you have to wait in the queue for a few builds to complete before yours starts, you may need to set up another build agent.

That was the situation we were in a few months ago. Initially we had one, but to fight the build bottleneck, we kept adding more until we had 5 build agents running in Azure. Most of them were only used during "peak hours", like when the end of a sprint was getting closer! Sure, we could go into Azure and manually turn the agents on and off, but when they came back up, they were mostly outdated (old BC images).

Azure virtual machine scale set agents

That's when we became aware of a new Azure DevOps option: Azure virtual machine scale set agents, released only a few months ago.

A VM scale set is a set of VMs that you can spin up when you need them and remove again when you no longer need them. In Azure DevOps, you can define how many agents to keep on standby and for how long to keep an idle agent alive.


Just as creating a normal build agent is easy, creating a VM scale set is almost as easy, at least if all you need is a standard Windows image with Docker.
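
As a sketch of what that involves: the Azure DevOps documentation has you create a scale set with no load balancer, overprovisioning disabled, and a manual upgrade policy, because Azure DevOps manages the instances itself. With the Azure CLI it could look like this (resource group, names, region, image and VM size are placeholders):

    # Create a resource group and a scale set for Azure DevOps to manage
    az group create --name BCBuildAgents --location westeurope

    az vmss create `
        --resource-group BCBuildAgents `
        --name bc-build-agents `
        --image Win2019Datacenter `
        --vm-sku Standard_D4s_v3 `
        --instance-count 2 `
        --disable-overprovision `
        --upgrade-policy-mode manual `
        --load-balancer '""' `
        --admin-username vmadmin `
        --admin-password '<strong-password>'

After that, you create an agent pool of type "Azure virtual machine scale set" in Azure DevOps and point it at the scale set.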

Creating Business Central VM Scale Set Agents

We needed the image to have Docker preinstalled, so we started out by hijacking the Azure ARM template created by Freddy/Microsoft for the BC build agent (https://aka.ms/getbuildagent).

With this and a few more files, we got our VM scale set agents off the ground. The machines were installed with Docker and BcContainerHelper (just like the BC build agents). But the first build on a fresh agent took 15-20+ minutes longer than normal for each version of BC, because the Docker images had to be built first. This meant that we had to raise the number of machines to keep on standby, and how long to keep them alive, so that we were not hit too severely by the waiting time.
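
For reference, provisioning such an image boils down to enabling the Containers feature and installing Docker and BcContainerHelper. A rough sketch of those steps (my reconstruction; the actual script behind https://aka.ms/getbuildagent does more than this):

    # Enable the Containers feature and install Docker EE on Windows Server
    # (a restart is required before Docker can be used)
    Install-WindowsFeature -Name Containers
    Install-Module -Name DockerMsftProvider -Repository PSGallery -Force
    Install-Package -Name docker -ProviderName DockerMsftProvider -Force

    # Install BcContainerHelper for creating BC images and containers
    Install-Module -Name BcContainerHelper -Force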

To avoid this, we added a PowerShell script that runs the New-BcImage cmdlet from BcContainerHelper. This way, our scale set image is baked with the Docker images for the 3-4 BC versions we are working on in our project. Whenever the versions we use change, we update a JSON file. Changing it automatically triggers the VM scale set build pipeline, which builds ("bakes") the VM scale set image with the Docker images and deploys them to Azure. We also run the build when Windows is updated, or when Microsoft changes something that causes BcContainerHelper to rebuild the images automatically (like when the BC artifact storage URL was changed last week), and as a minimum once a week to get the newest versions of the PowerShell modules etc. The bake takes approx. 1 hour, including the 3-4 versions, and after that all new agents get the new image.
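
A minimal sketch of that bake script could look like this. The versions.json layout is my own illustration; Get-BCArtifactUrl and New-BcImage are the actual BcContainerHelper functions:

    # versions.json is assumed to contain an array like:
    # [ { "version": "16.4", "country": "w1" }, { "version": "17.0", "country": "w1" } ]
    Import-Module BcContainerHelper

    $versions = Get-Content -Path '.\versions.json' -Raw | ConvertFrom-Json

    foreach ($v in $versions) {
        # Resolve the artifact URL for this BC version/country
        $artifactUrl = Get-BCArtifactUrl -type Sandbox -country $v.country -version $v.version -select Closest

        # Build the Docker image for these artifacts, so containers created
        # during a build start without the multi-GB download
        New-BcImage -artifactUrl $artifactUrl
    }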

With this in place, we were able to lower the number of standby machines to one and the standby time to 15 minutes. It now takes only 3 to 6 minutes, from Azure DevOps requesting a new build agent (if none are on standby), until the container on the new agent is ready. What's not to love?

We have quite a large development team, and we keep one agent on standby, which is a huge saving compared to paying for 5 VMs. There is also a fee for storing the scale set images, but the cost is very low. For a smaller team with fewer automated builds, it would not be a problem to lower the number of standby agents to zero and remove the machines again after just 10-15 minutes (or right away). In fact, we are considering the same.

If anyone is interested in learning how to set up your own Azure VM scale set agents, I am considering a follow-up post explaining all the steps you have to take. I may even share the repository we use to build the agents. But please give me a few days. This post is my first technical blog post in years, and my first post at all in a long time. I'm working on the "demo" part, but I didn't want this post to end up at the bottom of the drawer, like so many others, so I'm posting it now.
