It's been a while since I last wrote on NAV and automated tests. In the meantime I have been teaching a lot all over Europe and, yes, advocating test automation in NAV. And what did you do since my last post, How-to: Run Standard Tests against Your Code? Did you dare to try? And did you also have the time/guts to continue with it? I know some that did.
Meanwhile I also succeeded in getting MS to listen and implement a major request I had posed some time ago. We'll get to that below, as it relates to the topic I would like to talk about here: the test fixture.
If you have been testing software, and you probably do being in my audience, you know that a major part of your testing effort is getting data prepared to allow you to execute your tests. Having been a tester, I dare claim it is THE major part. Test fixture is the general term for the setup of your system before you execute your test(s), be it software or hardware.
With respect to NAV test automation I have adopted the three-level approach Bas Graaf used in his NAV TechDays presentation ages ago, which he based on xUnit Patterns. Applying this to NAV: the generic fixture is the data already present in the database the tests run on (CRONUS), the test codeunit specific fixture is what the Initialize function of a test codeunit sets up once for all its tests, and the fresh fixture is the data each individual test creates on the fly.
When I started the project to get the Dynamics NAV Test Toolkit to run on our code as per How-to: Run Standard Tests against Your Code, it gave this result:
23% of a total of 16,124 NAV 2016 tests were successful. Using a statistical approach, tackling the most frequently occurring issues first, I could raise this to 72% in less than a week, mainly by focusing on the lazy setup:
By creating one generic Initialize function of our own:
IF NOT IsInitialized THEN BEGIN
  // one-time, generic setup goes here
  IsInitialized := TRUE;
END;
... that I hooked into the local Initialize function of each test codeunit, as in the following example from COD134126 (ERM Reversal VAT Entries):
IF IsInitialized THEN
  EXIT;
IsInitialized := TRUE;
LibrarySetupStorage.Save(DATABASE::"General Ledger Setup");
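Putting the two together, a modified local Initialize then looks roughly like this; MyGenericInitialize is a made-up name here, standing in for whatever you called your own generic function:

```al
LOCAL PROCEDURE Initialize@1();
BEGIN
  // hypothetical call to our own generic, lazy, cross-suite setup
  MyGenericInitialize;
  IF IsInitialized THEN
    EXIT;
  // suite specific one-time setup goes below the guard
  IsInitialized := TRUE;
  LibrarySetupStorage.Save(DATABASE::"General Ledger Setup");
END;
```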
What the h*ck, lazy setup? Sho-ga-nai, as the Japanese say, as I learned from Mark Brummel: nothing can be done about it. Yes, I had to modify 480 test codeunits (of the total of 628) to get this working! And I told MS that I had to. This could have been prevented by a better design of their Initialize function from the start. And yessss, MS listened, and now with NAV 2018 CU3 a redesign has been implemented in the first 100 test codeunits:
LibraryTestInitialize.OnTestInitialize(CODEUNIT::"ERM Apply Sales/Receivables");
IF isInitialized THEN
  EXIT;
LibraryTestInitialize.OnBeforeTestSuiteInitialize(CODEUNIT::"ERM Apply Sales/Receivables");
isInitialized := TRUE;
LibrarySetupStorage.Save(DATABASE::"General Ledger Setup");
LibraryTestInitialize.OnAfterTestSuiteInitialize(CODEUNIT::"ERM Apply Sales/Receivables");
where LibraryTestInitialize is COD132250 (Library - Test Initialize), which contains these three event publishers to which we can now subscribe:
[External] [IntegrationEvent] OnTestInitialize(CallerCodeunitID : Integer)
[External] [IntegrationEvent] OnBeforeTestSuiteInitialize(CallerCodeunitID : Integer)
[External] [IntegrationEvent] OnAfterTestSuiteInitialize(CallerCodeunitID : Integer)
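With these publishers in place, your own one-time setup can be hooked in from a subscriber codeunit without touching the test codeunits at all. A minimal sketch in AL; the codeunit number/name and the InsertMySetup helper are made up for illustration:

```al
codeunit 50100 "My Test Init Subscribers"
{
    // Fires once per test codeunit, right before its suite specific setup
    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Library - Test Initialize", 'OnBeforeTestSuiteInitialize', '', false, false)]
    local procedure MyOnBeforeTestSuiteInitialize(CallerCodeunitID: Integer)
    begin
        if CallerCodeunitID = 134000 then // COD134000 - ERM Apply Sales/Receivables
            InsertMySetup();
    end;

    local procedure InsertMySetup()
    begin
        // hypothetical helper: create the setup records our own code
        // depends on; left empty in this sketch
    end;
}
```

Using the CallerCodeunitID parameter you can scope your setup to specific test codeunits, or leave the check out to run it for all of them.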
So getting your lazy setup done on standard tests is a no-brainer now. Another hurdle taken to get test automation going.
The other day I wanted to get my local NAV 2018 installation fully working. Fully local, including VSCode. And ... set up an extra database with its own service tier.
In the end it was quite easy, not least thanks to some very helpful resources (see below), but I ran into a couple of other issues I needed to get fixed. It might be of help to you. Knowing I will typically forget the details, it surely will be to me in the near future. So take advantage of it, if needed. And if you have other details to share, be my guest below.
The following steps are in the order in which I tried to get started with VSCode; some of them could be done in a different order.
Of course both of these need to be installed first. Note that the VSCode installation is not part of the Dynamics NAV 2018 installation. Be sure to have the installation option Modern Development Environment selected.
Just for name's sake let's call the database My CRONUS and the service tier MyCRONUS110, with port numbers 7145, 7146, 7147, 7148 and 7149.
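Setting up such an extra service tier can also be scripted with the NAV 2018 administration cmdlets. A sketch, assuming a local SQL Server, the Network Service account, and the names and ports from this post (run from the NAV 2018 Administration Shell; adjust to your environment):

```powershell
# Create a new service tier MyCRONUS110 against the My CRONUS database
New-NAVServerInstance MyCRONUS110 `
    -DatabaseServer localhost `
    -DatabaseName 'My CRONUS' `
    -ManagementServicesPort 7145 `
    -ClientServicesPort 7146 `
    -SOAPServicesPort 7147 `
    -ODataServicesPort 7148 `
    -ServiceAccount NetworkService

# Give the developer service endpoint its own port (7149), enable it, restart
Set-NAVServerConfiguration MyCRONUS110 -KeyName DeveloperServicesPort -KeyValue 7149
Set-NAVServerConfiguration MyCRONUS110 -KeyName DeveloperServicesEnabled -KeyValue true
Restart-NAVServerInstance MyCRONUS110
```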
I.e. get VSCode and NAV connected. Follow Erik Hougaard's post to get that done. Thanx, Erik.
Changing the Enable Developer Service Endpoint needs a service tier restart, but this failed.
A service tier failing to start often has to do with the fact that either (1) the service account is not db_owner on the database it is servicing, or (2) the URLs for the web services ports have not been registered. As the service tier had been running without problems before I enabled the Developer Service Endpoint, it probably needed the URL registered for the developer service endpoint. For this Arend-Jan Kauffmann's August 28, 2014 post came to my rescue, as so many times before.
netsh http show urlacl
... did show it was not registered. So I executed the following command:
netsh http add urlacl url=http://+:7149/MyCRONUS110/ user="NT AUTHORITY\NETWORK SERVICE"
Yoohoo, the service did start.
I am not fully sure, but it seems that the developer service endpoint port URL is often not automatically registered.
Yesss, I opened VSCode and created a new AL project using our future spell: AL: Go!
One of the first things to do is to set up the launch.json. BTW: calling AL: Go! will urge you to fill out a number of things in the launch.json. A very useful help is another post by Arend-Jan Kauffmann, on setting up the launch.json.
A Web client URL? Did I have that? Nope, because I did a standard installation and did not have one set up for my new database My CRONUS and service tier MyCRONUS110. So, let's ...
Based on this msdn post I created a new web server instance with
New-NAVWebServerInstance -WebServerInstance MyCRONUS110 -Server MyLaptop -ServerInstance MyCRONUS110
And to make sure it accesses the right client port I updated the Microsoft.Dynamics.Nav.Client.ClientService.dll.config:
<add key="ClientServicesPort" value="7146" />
and the navsettings.json:
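For reference, the relevant part of navsettings.json would then look roughly like this; a sketch based on the NAV 2018 web server settings file, with the server and instance names as used above:

```json
{
  "NAVWebSettings": {
    "Server": "MyLaptop",
    "ServerInstance": "MyCRONUS110",
    "ClientServicesPort": 7146
  }
}
```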
Well, to be honest, I didn't just do it like that straight away. I did one thing after the other, in hindsight getting closer and closer to the solution. With each of these changes I restarted the IIS service.
... and it worked. The web client opened up showing, when clicking on the Customers link, the message as programmed in HelloWorld.al in the OnOpenPage trigger of page 22.
With a standard (demo) installation that includes the Web Server Components and the Modern Development Environment options, much of the above is already set up.
The other day I had to update the objects I use in my trainings, from 2017 to 2018. Not a lot altogether, thus easy to do manually while having a look at the changes.
I was glad to have to do this as it showed that a number of fields have been stretched. Fields we typically use in our solution, be it the standard fields themselves or clones of them on our own objects.
So next to the code merge from 2016 to 2018 we were starting on, we also set out to update our code according to these findings.
You might recall my post on similar changes in NAV 2013, coming from NAV 2009 R2, showing over 450 fields that had been stretched. This time it seems to be an even bigger operation: the current status of my research shows over 800 fields that have been stretched, mainly code fields stretched from 10 to 20 characters. The following gives a consolidated overview of what I could categorize in some way.
Type of Field        Change
Bank Account Code    10 -> 20
Business Unit Code   10 -> 20
No. Series Code      10 -> 20
Posting Group Code   10 -> 20
Tax Group Code       10 -> 20
VAT Clause Code      10 -> 20
Vendor ... No.       20 -> 35
Note that the Type of Field column contains either the exact field name (e.g. Bank Account Code) or a kind of consolidated name (e.g. Posting Group Code, as there are various fields with different names that refer to a posting group). For the complete overview and details download this Excel file. You will find on the Fields tab some 44 more stretched fields that I haven't categorized yet.
Let me confess before I start: getting standard tests to run against your code is by no means rocket science. However, I dare say many peers out there haven't even touched the Test Toolkit, seemingly looking at it as an unbearable threshold. Too much work. Of no relevance to their code. Unknown = unloved (as my mother used to say).
Clearly the majority of the audience I have been speaking to on the Test Automation Suite in the last year and a half never laid hands on it. I hope I did trigger some of them, though, to get started. And if they, and you, still have this fear of the cold water … this post will show you there's no reason not to jump in. While showing how easy it is, I will also touch on some backgrounds of the Test Toolkit and test automation in general.
Now what do we need?
Needless to say, all resources should be based on the same version. In our case we are still running on NAV 2016: code on RTM (build 42815), executables on CU11 (build 46773). And BTW: no direct need for any fancy hardware, virtualized or not. My first steps were done on my laptop!
Oh yes, this CRONUS database. Why are we using a CRONUS database? Probably like you, many participants in my sessions wondered. Why not run the tests on our own database, containing our data: setup, master data, ledger entries, …?
One of the principles in testing is to have a common baseline when (re)executing tests, to give the best guarantee that a test is reproducible. A in, B out. Every time again. No data created in between two runs of the same test (set) that results in a different outcome. Well, that alone is not a rock-solid argument for using CRONUS, right?
However, another principle is that each test (set) creates its own data needed during execution. And ideally the full Monty, meaning from scratch. Every time again, making a test (set) independent of any data already residing in the system … ideally.
As with many things in life it's not always as ideal as we would like, or as we say. This also applies to the Test Automation Suite.
Yes, the majority of the tests create their own data on the fly: customers, sales documents, ledger entries, etc. But some also rely on the data readily available in CRONUS, like Sales & Receivables Setup or VAT Posting Setup: configuration data that is used as-is in most relevant tests. I guess this also relates to the fact that the Test Automation Suite mainly executes integration tests, and thus needs a complex set of data. To avoid having to recreate each part of this set with each execution of a test (set), MS has, I guess, chosen to make use of some data readily available in CRONUS, which also makes test execution faster.
So on the one hand, as tests create their own data, it is of no importance in what database we execute them. On the other hand, as described, there is some dependency on data in CRONUS, and so we are using CRONUS for our tests.
Enough about this baseline, for now …
Or at least the test tool should be executing the first test codeunit, 134000 - ERM Apply Sales/Receivables. And of course continue with all the others. In our case this very first run took approx. 1 hour, which is quite fast, as I had already run the standard Test Suite on standard CRONUS, taking approx. 2.5 hours. That it only took an hour was directly due to the fact that only a minority, 23%, of the tests passed and therefore the execution of all tests was much shorter.
But more later on the results and how we went about getting the failures to succeed.
That's it for now and next time don't tell me you still haven't tried to "Run Standard Tests against Your Code".
With our upgrade from NAV 2009 R2 to NAV 2016, many new things became available to us at The Learning Network, formerly known as Van Dijk Educatie. Technical improvements like the performance of the service tier, which was one of the major reasons for wanting to leave NAV 2009 behind. But of course, next to a vast number of functional features, also the availability of PowerShell cmdlets, upgrade codeunits and test automation. Those who have been following me one way or another know that the latter subject has my special attention: testing and test automation.
In today's post I would like to make a start in sharing our approach and findings in how we started to use the Test Toolkit as released by MS on each NAV product DVD since NAV 2016. For that I have set up a kind of framework below that allows me to refer back to some of its parts when elaborating on them more in posts to come.
So here we go, fasten your seatbelts and stay tuned. And … do not be afraid to ask.
Our primary goal of this exercise was to set up a test collateral, based on the standard MS application tests, to be used as a regression test suite, by:
The basic plan to achieve this was to:
With any endeavor there are always a number of assumptions. Or should I say loads?
Well, basically we had these two, being that all MS tests …
This is the world we live in at The Learning Network:
To be continued ...