Van Vugt's dynamiXs

All around NAV dev and test

How-to: Run Standard Tests against Your Code

Let's confess before I start: getting standard tests to run against your code is by no means rocket science. However, I dare say many peers out there haven't even touched the Test Toolkit, seemingly looking at it as an insurmountable hurdle. Too much work. Of no relevance to their code. Unknown = unloved (as my mother used to say).

Clearly the majority of the audience I have been speaking to about the Test Automation Suite over the last year and a half has never laid hands on it. I hope I did trigger some of them to get started, though. And if they, and you, still have that fear of cold water … this post will show you there's no reason not to dive in. While showing how easy it is, I will also touch on some background, both on the Test Toolkit and on test automation in general.


Now what do we need?

  1. A clean CRONUS database hosted on a SQL Server
  2. A full NAV installation with DE (development environment) and Windows Clients and Service Tier connecting to the CRONUS database
  3. A developer's license to run your service tier and DE with
  4. Test Toolkit .fob files, to be found in the TestToolKit folder on the product DVD
    • CALTestCodeunits.<country>.fob, containing all test codeunits
    • CALTestLibraries.<country>.fob, containing all generic library codeunits used by the various tests
    • CALTestRunner.fob, containing the test tool framework; note that this already resides in CRONUS, so there is actually no need for this one
  5. .fob with your app code

Needless to say, all resources should be based on the same version. In our case we are still running on NAV 2016: code on RTM (build 42815), executables on CU11 (build 46773). And BTW: no direct need for any fancy hardware, virtualized or not. My first steps were done on my laptop!


Oh yes, this CRONUS database. Why are we using a CRONUS database? Many participants in my sessions wondered, and probably you do too. Why not run the tests on our own database, containing our data: setup, master data, ledger entries, …?

One of the principles in testing is to have a common baseline when (re)executing tests, to give the best guarantee that a test is reproducible. A in, B out. Every time again. No data created in between two runs of the same test (set) that results in a different outcome. Well, that's not a very solid argument for using CRONUS specifically, right?

However, another principle is that each test (set) creates its own data needed during execution. And ideally the full Monty, meaning from scratch. Every time again, making a test (set) independent of any data already residing in the system … ideally.

As with many things in life, it's not always as ideal as we would like, or as we say. This also applies to the Test Automation Suite.

Yes, the majority of the tests create their own data on the fly, like customers, sales documents, ledger entries, etc. But some also rely on data readily available in CRONUS, like Sales & Receivables Setup or VAT Posting Setup: configuration data that is used as-is in most relevant tests. I guess this also relates to the fact that the Test Automation Suite mainly executes integration tests, and thus needs a complex set of data. To avoid having to recreate each part of this set with each execution of a test (set), MS has, I guess, chosen to make use of data readily available in CRONUS, which also makes test execution faster.
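To make this tangible, here is a minimal C/AL sketch of the pattern the standard test codeunits follow. The object number and test function name are mine, but the library codeunits ("Library - Sales", "Assert") do ship in CALTestLibraries; exact function signatures may differ per version:

```
OBJECT Codeunit 50100 Demo Sales Test
{
  PROPERTIES
  {
    Subtype=Test;
  }
  CODE
  {
    VAR
      LibrarySales : Codeunit "Library - Sales";
      Assert : Codeunit Assert;

    [Test]
    PROCEDURE CustomerOnSalesInvoice();
    VAR
      Customer : Record Customer;
      SalesHeader : Record "Sales Header";
    BEGIN
      // The test creates the data it needs on the fly ...
      LibrarySales.CreateCustomer(Customer);
      LibrarySales.CreateSalesHeader(
        SalesHeader,SalesHeader."Document Type"::Invoice,Customer."No.");

      // ... while implicitly relying on configuration data already
      // present in CRONUS, e.g. VAT Posting Setup.
      Assert.AreEqual(
        Customer."No.",SalesHeader."Sell-to Customer No.",
        'Wrong customer on sales header');
    END;
  }
}
```

Note how the test itself never inserts a customer or a sales header directly; it delegates that to the library codeunits, which is what keeps the standard tests (mostly) independent of the data already in the database.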

So on the one hand, having tests create their own data, it is of no importance in which database we execute them. On the other hand, as described, there is some dependency on data in CRONUS, and so we are using CRONUS for our tests.

Note that …

  • … we also ran the tests on a copy of our own database, but this seemed a never ending exercise; it took ages for the first tests to complete, which seemed related to the huge number of item ledger entries in our database. However, we never performed a deep analysis of the cause, as the above shows this is actually not relevant.
  • … the Test Automation Suite has been set up in such a way that after the execution of a test codeunit (a test set!) NAV automatically rolls back all changes made to the database. In this way each test codeunit starts on the same data baseline.
  • ... on Jan Hoek's request I did elaborate on this here and here.
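For the curious: this automatic rollback is governed by the TestIsolation property of the test runner codeunit that CALTestRunner.fob provides. A bare-bones illustration of the mechanism; the object number and name are mine, not the toolkit's:

```
OBJECT Codeunit 50101 Demo Test Runner
{
  PROPERTIES
  {
    Subtype=TestRunner;
    // Codeunit = all database changes made by a test codeunit are
    // rolled back once that codeunit has finished running
    TestIsolation=Codeunit;
  }
  CODE
  {
    PROCEDURE RunApplySales();
    BEGIN
      CODEUNIT.RUN(134000);  // ERM Apply Sales/Receivables
    END;
  }
}
```

Any test codeunit executed through a test runner with TestIsolation=Codeunit gets this rollback behaviour for free, which is exactly what the Test Tool does under the hood.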

Enough about this baseline, for now …

Ready … GO!

  1. Connect to this CRONUS database with your DE
  2. Import the first two Test Toolkit .fob files as mentioned above
  3. Import the .fob with your app code
  4. Recompile all objects to be sure that everything is OK
    In our case not everything appeared to be OK, due to
    • Missing .dll's
    • Calls to customized standard functions by standard test code
  5. If not OK, fix the issues
    BTW: this wasn't a big task to do, but of course might be totally different in your own case.
  6. Open Windows Client and open the Test Tool (Departments/Administration/Application Tools)
  7. Assuming you have never used this feature in this CRONUS database, it should look like this:

    A worksheet page with no records, which in this case means that no test codeunits have been added to the DEFAULT test suite
  8. To add test codeunits to the DEFAULT test suite click on the Get Test Codeunits action:

    As we want to run all available tests, select the 2nd option, All Test Codeunits, and click OK.
  9. Now all test codeunits (and functions) will be added to the DEFAULT test suite. This might take some time, but once ready …
  10. GO, by clicking on the Run action, …

    … selecting the 3rd option, All, and pressing OK.
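BTW: if you prefer scripting over clicking, steps 1 to 4 can also be performed from the NAV 2016 Development Shell, using the Import-NAVApplicationObject and Compile-NAVApplicationObject cmdlets. A sketch; the paths and database name below are examples of mine, so adjust them to your setup:

```powershell
# Run from the NAV 2016 Development Shell; paths and DB name are examples.
$Db = 'CRONUS-Test'
Import-NAVApplicationObject -Path 'C:\DVD\TestToolKit\CALTestLibraries.W1.fob' `
    -DatabaseName $Db -ImportAction Overwrite -SynchronizeSchemaChanges Yes
Import-NAVApplicationObject -Path 'C:\DVD\TestToolKit\CALTestCodeunits.W1.fob' `
    -DatabaseName $Db -ImportAction Overwrite -SynchronizeSchemaChanges Yes
Import-NAVApplicationObject -Path 'C:\MyApp\MyAppCode.fob' `
    -DatabaseName $Db -ImportAction Overwrite -SynchronizeSchemaChanges Yes

# Recompile all objects to surface missing .dll's or calls to
# customized standard functions (step 4 above)
Compile-NAVApplicationObject -DatabaseName $Db -Recompile
```

Handy once you want to repeat this exercise on every new build of your app.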

We're in business!

Or at least the test tool should now be executing the first test codeunit, 134000 - ERM Apply Sales/Receivables, and of course continue with all the others. In our case this very first run took approx. 1 hour. That's quite fast, as I had already run the standard test suite on a standard CRONUS database, which took approx. 2.5 hours. That it only took an hour was directly due to the fact that only a minority, 23%, of the tests passed; a failing test stops at the point of failure, skipping the rest of its work, so the execution of all tests was much shorter.

But more on that later: the results, and how we went about getting the failing tests to succeed.

That's it for now and next time don't tell me you still haven't tried to "Run Standard Tests against Your Code".