More or less coincidentally I stumbled over the topic of this post a couple of weeks ago, in my next endeavor to raise the success rate of the MS test suite on our solution. A task being picked up every once in a while, next to all the normal work.
The major approach so far in getting as many MS tests as possible working has been a statistical one: look for the most frequently occurring error and get it solved; subsequently take the next most frequent, and so on. In one week the success rate rose from 23% to 72%. One more week got it to almost 80% (79%, to be exact). After six weeks altogether, spread over a couple of months, we hit 90%. As in many other cases the Pareto principle did apply here too, and thus the remaining 10% will take almost as much time to get successful. And you could wonder if that's worth the trouble. Well, this blog post could be seen as proof that taking the trouble still pays off. It has been worthwhile to continue to raise the success rate. And actually I deliberately continue for the very reason we started this project: to learn from the automated tests MS conceived. And in the meanwhile we do get more tests working.
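By the way, this ranking of failures is easy to script yourself. A minimal sketch in Python, assuming you export the test results to (codeunit, function, error) tuples; the sample data below is made up for illustration:

```python
from collections import Counter

def rank_errors(results):
    """Rank failure messages by frequency, most frequent first.

    results: iterable of (codeunit, function, error) tuples, where
    error is None or empty for a passing test.
    """
    counts = Counter(error for _, _, error in results if error)
    return counts.most_common()

# Hypothetical exported test results:
results = [
    (134000, "ApplyPayment", "Unknown Posting Group"),
    (134000, "UnapplyPayment", "Unknown Posting Group"),
    (134001, "PostInvoice", None),
    (134002, "PostOrder", "Dimension mismatch"),
]
print(rank_errors(results))
# [('Unknown Posting Group', 2), ('Dimension mismatch', 1)]
```

Fix whatever tops the list, rerun the suite, and repeat; that is the whole "statistical approach" in a nutshell.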
So how did I "stumble over" it? Instead of using the statistical approach pointed out above ("solving the most frequently occurring error"), I looked at the left side of my percentage graph (that's where you can find the lower-numbered test codeunits, which hold the tests that are most meaningful to us; those on the right are for a bigger part testing production and warehouse management, at this moment of lesser or no use to us):
And I focused on the white ditches between the blue. The deeper the ditch, the less successful the codeunit. And maybe the easier to get successful.
The one that struck my eye most was the one white all the way down. The narrow grand canyon. Or Gorges du Tarn, being a Francophile. Do you see it? Somewhat left of the middle. 0% success in this test codeunit. It turned out to be COD134926 (Table Relation Test). To be honest I didn't look at the name at first, and opened it in design mode to find out there was only one test function. Was I going to put my time into getting this fixed? So I looked at the codeunit name, I looked at the error thrown, and studied the code closer.
Table Relation Test
The type of Table 23 in Field 50012 is incompatible with Table 50026 Field 1.
LOCAL ValidateFieldRelation(VAR TempTableRelationsMetadata : TEMPORARY Record "Table Relations Metadata") : Boolean
Aha, it's checking the technical validity of a table relation between source and target field that make up a table relation:
Or as the in-code explanation says:
// <Summary> Fields must have the exact length of the largest field they relate to.
// <Summary> Fields must have the same type as the fields they relate to. Exception: if it relates to both Code and Text it must be Text
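These two rules are simple enough to sketch outside C/AL as well. A minimal Python rendering, with fields represented as (type, length) tuples; this representation is my own simplification for illustration, not MS's actual implementation:

```python
def validate_field_relation(field, relations):
    """Check a field against the fields it relates to.

    Rule 1: the field must have the exact length of the largest
            field it relates to.
    Rule 2: the field must have the same type as the fields it
            relates to; exception: relating to both Code and Text
            requires Text.
    field and relations hold (type, length) tuples.
    """
    types = {rel_type for rel_type, _ in relations}
    max_length = max(length for _, length in relations)

    # Rule 2, including the Code-and-Text exception.
    if types == {"Code", "Text"}:
        if field[0] != "Text":
            return False
    elif types != {field[0]}:
        return False

    # Rule 1: exact length of the largest related field.
    return field[1] == max_length

# The kind of mismatch COD134926 flags: a Code10 relating to a Code20.
print(validate_field_relation(("Code", 10), [("Code", 20)]))   # False
print(validate_field_relation(("Code", 20), [("Code", 20)]))   # True
print(validate_field_relation(("Text", 30), [("Code", 20), ("Text", 30)]))  # True
```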
Wow. We have been cleaning up our code a lot in the last year, but hadn't been able to get this tackled, and now we did hit upon this nice gift. It pointed out - ouch, do I dare to say this - ... more than 50 incompatible table relations. Needless to say these were injected in our code long, long ago.
If, after all my previous posts on the test toolkit, you're still looking for reasons to start using it, now you surely have one (more), be it a small, yet very useful one.
Tomorrow it will be exactly one month ago that I wrote this post, based on the update of a number of test codeunits in NAV 2018 CU3: Let's talk about Shared Fixture and how to profit from this with the Dynamics NAV Test Toolkit. Just now I finished upgrading our test code to NAV 2018 CU4. It was a lot of work, as I had to check a lot manually, but, wow, MS did an even greater job. The redesign introduced in CU3, and implemented on the first 100 test codeunits, has now been continued on an even larger number of test codeunits. Due to this I was able to revert our customizations substantially:
# Customized Test Codeunits
NAV 2018 RTM
NAV 2018 CU3
NAV 2018 CU4
Keep the good work going ... (as I still have 200 of 720 customized ;-)
The other day I wanted to get my local NAV 2018 installation fully working. Fully local, including VSCode. And ... set up an extra database with its own service tier.
In the end it was quite easy, not in the least thanks to some very helpful resources (see below), but I ran into a couple of other issues I needed to get fixed. It might be of help to you. Knowing I will typically forget the details, it surely will be to me in the near future. But take advantage of it, if needed. And if you have other details to share, be my guest below.
The following steps are in the order in which I tried to get started with VSCode. Some steps could be done in a different order.
Of course both of these need to be installed first. Note that the VSCode installation is not part of the Dynamics NAV 2018 installation. Be sure to have the installation option Modern Development Environment selected.
Just for name's sake let's call the database My CRONUS and the service tier MyCRONUS110, with port numbers 7145, 7146, 7147, 7148 and 7149.
I.e. get VSCode and NAV connected. Follow Erik Hougaard's post to get that done. Thanx, Erik.
Changing the Enable Developer Service Endpoint needs a service tier restart, but this failed.
A service tier failing to start often has to do with the fact that either (1) the service account is not db_owner on the database it is servicing, or (2) the URLs for the web services ports have not been registered. As the service tier had been running with no problem before I enabled the Developer Service Endpoint, it probably needed the URL registered for the Developer Service Endpoint. For this, Arend-Jan Kauffmann's August 28, 2014 post came to rescue me, as so many times before.
netsh http show urlacl
... did show it was not registered. So I executed the following command:
netsh http add urlacl url=http://+:7149/MyCRONUS110/ user="NT AUTHORITY\NETWORK SERVICE" listen=yes delegate=no sddl="D:(A;;GX;;;NS)"
Yoohoo, the service did start.
I am not fully sure, but it seems that the developer service endpoint URL is often not registered automatically.
Yesss, I opened VSCode and created a new AL project using our future spell, AL: Go!
One of the first things to do is to set up the launch.json. BTW: calling AL: Go! will urge you to fill out a number of things in the launch.json. A very useful help is another post by Arend-Jan Kauffmann, on setting up the launch.json.
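For reference, the relevant part of my launch.json ended up looking something like the sketch below. I am quoting this from memory, so treat the exact property names as assumptions and check Arend-Jan's post for the authoritative list; the server instance and port are the ones used in this post:

```json
{
  "version": "0.2.0",
  "configurations": [
    {
      "type": "al",
      "request": "launch",
      "name": "My CRONUS local",
      "server": "http://localhost",
      "serverInstance": "MyCRONUS110",
      "port": 7149,
      "authentication": "Windows"
    }
  ]
}
```

Note that the port here is the developer service endpoint port, not the client services port.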
Web client URL? Did I have that? Nope, because I did a standard installation and did not have one set up for my new database My CRONUS and service tier MyCRONUS110. So, let's ...
Based on this MSDN post I created a new web server instance with:
New-NAVWebServerInstance -WebServerInstance MyCRONUS110 -Server MyLaptop -ServerInstance MyCRONUS110
And to make sure it accesses the right client port I updated the Microsoft.Dynamics.Nav.Client.ClientService.dll.config:
<add key="ClientServicesPort" value="7146" />
and the navsettings.json:
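For navsettings.json it boils down to something like this sketch; the exact key names are from memory, so verify them against your own file:

```json
{
  "NAVWebSettings": {
    "Server": "MyLaptop",
    "ServerInstance": "MyCRONUS110",
    "ClientServicesPort": 7146
  }
}
```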
Well, to be honest, I didn't just do it like that straight away. I did one after the other, in hindsight getting closer and closer to the solution. With each of these changes I restarted the IIS service.
... and it worked. The web client opened up, showing, when clicking on the Customers link, the message as programmed in HelloWorld.al in the OnOpenPage trigger of Page 22.
With a standard (demo) installation that includes the Web Server Components and Modern Development Environment options, much of the above is already set up.
The other day I had to update the objects I use in my training from 2017 to 2018. Not a lot altogether, thus easy to do manually while having a look at the changes.
I was glad to have to do this as it showed that a number of fields have been stretched. Fields we typically use in our solution, be it the standard fields themselves or clones of them on our own objects.
So next to the code merge from 2016 to 2018 we were starting on, we were to update our code according to these findings.
You might recall my post on similar changes in NAV 2013, coming from NAV 2009 R2, showing over 450 fields that had been stretched. This time it seems to be an even bigger operation. The current status of my research shows over 800 fields that have been stretched, mainly code fields stretched from 10 to 20. The following gives a consolidated overview of what I could categorize some way.
Type of Field        Change
Bank Account Code    10 -> 20
Business Unit Code   10 -> 20
No. Series Code      10 -> 20
Posting Group Code   10 -> 20
Tax Group Code       10 -> 20
VAT Clause Code      10 -> 20
Vendor ... No.       20 -> 35
Note that the Type of Field column contains either the exact field name (e.g. Bank Account Code) or a kind of consolidated name (e.g. Posting Group Code as there are various fields with different names that refer to a posting group). For the complete overview and details download this Excel file. You will find on the fields tab some 44 more stretched fields that I didn't categorize yet.
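In case you want to compile such an overview yourself, the comparison can be scripted. A minimal Python sketch, assuming you have text exports of the table objects from both versions; the regex only covers the simplified field lines shown in the sample, not every variation of the real export format:

```python
import re

# Matches field lines in a (simplified) NAV object text export, e.g.:
#   { 5 ;   ;Bank Account Code   ;Code10 }
FIELD = re.compile(r";\s*(?P<name>[^;]+?)\s*;\s*(?P<type>Code|Text)(?P<len>\d+)")

def field_lengths(export_text):
    """Map field name -> (type, length) for Code/Text fields in an export."""
    return {m["name"]: (m["type"], int(m["len"]))
            for m in FIELD.finditer(export_text)}

def stretched_fields(old_export, new_export):
    """List (name, old length, new length) for fields whose length grew."""
    old, new = field_lengths(old_export), field_lengths(new_export)
    return [(name, old[name][1], new[name][1])
            for name in old
            if name in new and new[name][1] > old[name][1]]

old = "{ 5 ;   ;Bank Account Code   ;Code10 }\n{ 6 ;   ;Description ;Text50 }"
new = "{ 5 ;   ;Bank Account Code   ;Code20 }\n{ 6 ;   ;Description ;Text50 }"
print(stretched_fields(old, new))  # [('Bank Account Code', 10, 20)]
```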
Let me confess before I start: getting standard tests to run against your code is by no means rocket science. However, I dare say many peers out there haven't even touched the Test Toolkit, seemingly looking at it as an unbearable threshold. Too much work. Of no relevance to their code. Unknown = unloved (as my mother used to say).
Clearly the majority of the audience I have been speaking to on the Test Automation Suite in the last year and a half never laid hands on it. I hope I did trigger some of them, though, to get started. And if they, and you, still have this cold-water fear of getting started … this post will show you there's no reason not to. While showing how easy it is, I will also touch on some backgrounds, of the Test Toolkit and of test automation in general.
Now what do we need?
Needless to mention that all resources should be based on the same version. In our case we are still running on NAV 2016: code on RTM (build 42815), executables on CU11 (build 46773). And BTW: no direct need for any fancy hardware, virtualized or not. My first steps were done on my laptop!
Oh yes, this CRONUS database. Why are we using a CRONUS database? Probably like you, many participants in my sessions wondered. Why not run it on our own database, containing our data: setup, master data, ledger entries, …?
One of the principles in testing is to have a common baseline when (re)executing tests, to give the best guarantee that a test is reproducible. A in, B out. Every time again. No data created between two runs of the same test (set) that results in a different outcome. Well, that's not a rock-solid argument for using CRONUS, right?
However, another principle is that each test (set) creates its own data needed during execution. And ideally the full Monty, meaning from scratch. Every time again, making a test (set) independent of any data already residing in the system … ideally.
As with many things in life, it's not always as ideal as we would like, or as we say. This also applies to the Test Automation Suite.
Yes, the majority of the tests create their own data on the fly: customers, sales documents, ledger entries, etc. But some also rely on the data readily available in CRONUS, like Sales & Receivables Setup or VAT Posting Setup, configuration data that is used as-is in most relevant tests. I guess this also relates to the fact that the Test Automation Suite mainly executes integration tests, and thus needs a complex set of data. To avoid having to recreate each part of this set with each execution of a test (set), MS has, I guess, chosen to make use of some data readily available in CRONUS, which also makes test execution faster.
So on one hand, having tests create their own data, it is of no importance in what database we execute them. On the other, as described, there is some dependency on data in CRONUS, and so we are using CRONUS for our tests.
Enough about this baseline, for now …
Or at least the test tool should be executing the first test codeunit, 134000 - ERM Apply Sales/Receivables, and of course continue with all the others. In our case this very first run took approx. 1 hour. Which is quite fast, as I had run the standard Test Suite on standard CRONUS before, taking approx. 2.5 hours. That it only took an hour was directly due to the fact that only a minority of the tests, 23%, passed, and therefore the execution of all tests was much shorter.
But more on that later: the results, and how we went about getting the failing tests to succeed.
That's it for now and next time don't tell me you still haven't tried to "Run Standard Tests against Your Code".