For those of you who have been following me for a good number of years: you might have stumbled across my testimonial. It clearly marked a threshold in my professional career, and in hindsight it was well worth taking the time for it. And now I am on the verge of adding another 7 years. Next week Thursday, to be precise, when I will be down south in California, just as conference season has taken off. My conference season, and a whole new training season too. May I invite you to join me in my sessions?
Yep, I will be there once again. It's already three years since I last attended, way back in 2015 in Orlando, Florida. This time, San Diego, I couldn't let it just pass by. With the great memories of the 2014 conference. With the tremendous changes the NAV world is facing, not in the least the name change to Business Central, and the role automated testing is finally going to play in our daily development practice. Did you have a look at the program already? Nope? Go there and count the number of sessions relating to automated testing. At least five, more than ever before. Let me seize the chance to invite you to at least two of them. And yes, I'll be the speaker. ;-)
This is going to be a whole new experience in my professional life. The first time to join this biggest Dynamics end-user event around the world. And it will be in Phoenix. It has been ages since a Greyhound bus brought me and my youngest brother out there. Youngsters, criss-crossing the States, off to the Grand Canyon. The other day someone remarked: this is not your target audience. Heck, he might be right, but why should I care. It might be, it might not be. I am going to experience it. Going to teach some pre-conference classes. And, oh yeah, do one exciting session at the conference itself. Working at a Dynamics NAV end-user and haven't registered yet? Go there [url=] and do so. At least have a look at my classes and presentation:
The 3rd time this will be run at Housing.si. For both technical and non-technical Dynamics NAV pros. It's always a pleasure getting out there. Rob Gabriels and I are looking forward to conducting this academy once again.
A 2-day workshop that will cover the same topics as my NAV TechDays workshops. For those that could not get a seat in Antwerp. And my first time out to the Baltics. Really looking forward to it.
Near home, but unfortunately not for me this time. At least, no response to my last proposal, so far.
You might have noticed that we have announced this a couple of times before. And even though there was big interest from various partners and end-user companies, our initiative suffered from the tight labor market: big interest, but too low a number of attendees. But finally things got moving, and now it's really going to take off. A 5-week course spread over three months, to get both functional and technical Dynamics NAV newbies up and running. And yes, it's held in the Netherlands, but it's open to anyone around the world. Kerkrade is almost as far south as you can get in this country, next to Aachen and Maastricht, and just a stone's throw away from Luik/Liège. Still seats available for both the technical and functional track.
I have been going out to Slovenia for almost 4 years now, and a large number of Dynamics NAV pros have joined the various courses and workshops I have run at Housing in those years. Consider yourself invited to this event: an informal gathering - a reunion ;-) - with good food and good news.
It might be the last one called NAV TechDays, but it will probably always remain the best-you-can-get conference on technical matters regarding Dynamics NAV and D365 BC. Long sessions, relevant subjects, and some great speakers. No, no, no, not me, surely not this year at the conference itself, but, heck, yeah, automated testing workshops. You need to get yourself started on this:
Or, freely translated from an old Dutch saying (Doe de deur dicht! Je bent toch niet in de kerk geboren? - literally: Close the door! You weren't born in church, were you?): Close that FastTab! Were you born in church? ... meaning: keep the door closed, don't let the heat escape.
And that's what this small post is about: getting better performance by collapsing the FastTabs on your pages.
After our upgrade to NAV 2018 we experienced a number of performance issues. One I blogged about some months ago. Another concerned the loading time of the Item Card: simply opening the Item Card could take up to 11 seconds! So I played around with removing various parts from the card page. Removing various FlowFields: no real improvement, it seemed. Moving sub-pages, added as customization, to actions: wow, that was much better, 3 seconds loading time. Another second was gained by removing FactBoxes. So I was quite pleased with what I had achieved and deployed it to our acceptance environment for my customer-colleagues to test.
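To give an idea of what "moving sub-pages to actions" boils down to, here is a minimal C/AL sketch; the action, page and table names are hypothetical, not our actual customization. Instead of an embedded page part whose data is fetched every time the card opens, the former sub-page is only run when the user explicitly asks for it:

ItemExtraInfo - OnAction()
// ItemExtraInfoLine is a global Record variable on the hypothetical custom table behind the former sub-page.
ItemExtraInfoLine.SETRANGE("Item No.", "No."); // limit to the item currently shown on the card
PAGE.RUN(PAGE::"Item Extra Info Lines", ItemExtraInfoLine); // open on demand instead of loading it with the card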
To my dismay they reported back that they did not experience a real improvement. Loading still took almost as long as before. Well, Luc, you should not lie: it did improve! Loading time went down, but not as dramatically as my tests had shown.
Ouch, that was a disappointment. Or even more: a riddle. So we ran some more tests. On my local environment. On our test environment. Our product owner did his tests. Etc. And every time I loaded the Item Card it was significantly faster than in his case. Even on the same environment.
So, what the h*** made the difference? We even joked about my privileges as an MVP. And then we saw a stupefying difference: the number of FastTabs that were collapsed! The more that were collapsed, the faster the loading. Wow, I never knew about that. A great feature and a major difference from the olden-days forms, where all data was loaded even when tabs were not displayed (yet).
But how to help my colleagues easily close all FastTabs and only open the ones they really need? It would be great to have two buttons on each page with FastTabs that allow you to collapse and expand all FastTabs in one touch. Being in contact with MS on this issue, I immediately filed a request for it. Let's hope we get it soon.
So ... collapse those FastTabs and don't spoil performance. You weren't born in church!
At the start of this year we upgraded our NAV 2016 installation to NAV 2018 RTM, and on February 26 we went live. It was a relatively smooth and fast upgrade and we were quite pleased with it. We did, however, run into a couple of performance issues, one of which exposed itself immediately after go-live. We were pointed to it by one of our sister teams, which consumes a number of NAV web services in their application. But unfortunately we couldn't get a hold on what exactly was happening, so it lingered until a month or so later, when we were confronted with a seemingly different issue, but with a similar pattern.
Our sister team reported that, while sending data to NAV through a web service, the following exception occurred frequently:
For search engines' sake, let me spell out the essential part of this event viewer screenshot:
The sql connection doesn’t have a database instance attached. This must not happen if the connection comes from the pool.
We had no idea whatsoever where to find the cause (and solution!) and we let it be for the time being. It was not a showstopper in the operation, as the major part of the web service calls were successful. Until some 6 weeks later, when a new project had started with an external partner connecting to a new web service, and they experienced the same kind of issue. Roughly every 10th call to NAV failed, resulting in a waiting time of over 20 seconds instead of the usual ~280 milliseconds, ending in a timeout:
Apr 25 07:33:01 Update box: Time: 273 ms
Apr 25 07:34:02 Update box: Time: 278 ms
Apr 25 07:35:01 Update box: Time: 285 ms
Apr 25 07:36:01 Update box: Time: 274 ms
Apr 25 07:37:02 Update box: Time: 282 ms
Apr 25 07:38:01 Update box: Time: 279 ms
Apr 25 07:39:01 Update box: Time: 288 ms
Apr 25 07:40:02 Update box: Time: 355 ms
Apr 25 07:41:01 Update box: Time: 272 ms
Apr 25 07:42:02 Update box: Time: 272 ms
Apr 25 07:43:01 Update box: Time: 266 ms
Apr 25 07:44:22 Update box: Time: 21.40 sec
Apr 25 07:45:02 Update box: Time: 291 ms
Apr 25 07:46:01 Update box: Time: 280 ms
Apr 25 07:47:02 Update box: Time: 285 ms
Apr 25 07:48:01 Update box: Time: 274 ms
Apr 25 07:49:01 Update box: Time: 268 ms
Apr 25 07:50:02 Update box: Time: 290 ms
Apr 25 07:51:01 Update box: Time: 291 ms
Apr 25 07:52:01 Update box: Time: 272 ms
Apr 25 07:53:02 Update box: Time: 288 ms
Apr 25 07:54:01 Update box: Time: 283 ms
Apr 25 07:55:01 Update box: Time: 284 ms
Apr 25 07:56:02 Update box: Time: 276 ms
Apr 25 07:57:01 Update box: Time: 282 ms
Apr 25 07:58:02 Update box: Time: 270 ms
Apr 25 07:59:23 Update box: Time: 22.29 sec
Our request for help on this issue to MS was answered very fast, resulting in the following answer:
Our team investigated the exception
The sql connection doesn’t have a database instance attached. This must not happen if the connection comes from the pool.
and discovered that we have fixed a bug for Dynamics 365 Business Central titled
Connection pool - concurrency issues when disposing NavSqlConnectionScope
that has not been backported to NAV2018 (and the exception trace that you provided looks similar to the one we saw when fixing this bug)
We have created a backport bug to have this fixed in an upcoming CU for NAV2018, preferably CU 06
And indeed it was released in CU06 for NAV 2018 as hotfix 267998. Once installed, it fixed the issue: no more hiccups when disposing NavSqlConnectionScope.
Thanx to the MS team that helped us out.
More or less coincidentally I stumbled upon the topic of this post a couple of weeks ago, during my next endeavor to raise the success rate of the MS test suite on our solution. A task that gets picked up every once in a while, next to all the normal work.
The major approach so far in getting as many MS tests working as possible has been a statistical one: look for the most frequently occurring error and get it solved; subsequently take the next most frequent, and so on. In 1 week the success rate rose from 23% to 72%. One more week got it to almost 80%, i.e. 79%. After six weeks altogether, spread over a couple of months, we hit 90%. As in many other cases the Pareto principle applied here too, and thus the remaining 10% will take almost as much time to get successful. And you could wonder whether that's worth the trouble. Well, this blog post could be seen as proof that it still is. It has been worthwhile to continue raising the success rate. And actually I deliberately continue for the very reason we started this project: to learn from the automated tests MS conceived. And in the meantime we do get more tests working.
So how did I "stumble upon" it? Instead of using the statistical approach pointed out above ("solve the most frequent error first"), I looked at the left side of my percentage graph (that's where you find the lower-numbered test codeunits, which hold the tests that are most meaningful to us; those on the right for the bigger part test production and warehouse management, at this moment of lesser or no use to us):
And I focused on the white ditches between the blue: the deeper, the less successful. And maybe the easier to get successful.
The one that struck my eye most was the one that was white all the way down. The narrow Grand Canyon. Or the Gorges du Tarn, me being a Francophile. Do you see it? Somewhat left of the middle: 0% success in this test codeunit. It turned out to be COD134926 (Table Relation Test). To be honest I didn't look at the name at first, and opened it in design mode to find out there was only one test function. Was I going to spend my time on getting this fixed? So I looked at the codeunit name, looked at the error thrown, and studied the code more closely.
Table Relation Test
The type of Table 23 in Field 50012 is incompatible with Table 50026 Field 1.
LOCAL ValidateFieldRelation(VAR TempTableRelationsMetadata : TEMPORARY Record "Table Relations Metadata") : Boolean
Aha, it's checking the technical validity of the source and target fields that make up a table relation:
Or as the in-code explanation says:
// <Summary> Fields must have the exact length of the largest field they relate to.
// <Summary> Fields must have the same type as the fields they relate to. Exception: if it relates to both Code and Text it must be Text
Wow. We have been cleaning up our code a lot in the last year, but hadn't been able to get this tackled, and now we hit upon this nice gift. It pointed out - ouch, do I dare say this - ... more than 50 incompatible table relations. Needless to say, these were injected into our code long, long ago.
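To give an idea of the kind of thing this test flags, here is a made-up example; all object names and numbers are invented and do not refer to our actual code. Suppose a custom setup table has a Code20 primary key, and a custom field on another table relates to it but was created as Code10. In exported object text that would look roughly like this (the // comments are mine, not part of the export format):

{ 1    ;   ;Code          ;Code20 }                                       // primary key of the hypothetical "My Setup Table": the target of the relation
{ 50000;   ;My Setup Code ;Code10 ;TableRelation="My Setup Table".Code }  // field on another table: fails the test, Code10 vs Code20
{ 50000;   ;My Setup Code ;Code20 ;TableRelation="My Setup Table".Code }  // the fix: same type and the exact length of the related field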
If, after all my previous posts on the test toolkit, you're still looking for reasons to start using it: now you surely have one (more), be it a small, yet very useful one.
Tomorrow it will be exactly one month ago that I wrote this post, based on the update of a number of test codeunits in NAV 2018 CU3: Let's talk about Shared Fixture and how to profit from it with the Dynamics NAV Test Toolkit (a small sketch of that pattern follows below). Just now I finished upgrading our test code to NAV 2018 CU4. It was a lot of work as I had to check a lot manually, but, wow, MS did an even greater job. The redesign introduced in CU3, and implemented on the first 100 test codeunits, has now been continued on an even larger number of test codeunits. Due to this I was able to revert our customizations substantially:
# Customized Test Codeunits
NAV 2018 RTM
NAV 2018 CU3
NAV 2018 CU4
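For those who haven't read that earlier post: the Shared Fixture idea behind MS's redesigned Initialize functions boils down to the pattern sketched below. This is a minimal C/AL sketch, not MS's literal code; CreateSharedSetupData is a hypothetical helper standing in for whatever expensive setup the tests in a codeunit share, and IsInitialized is a global Boolean.

[Test]
MyFirstTest()
Initialize;
// ... exercise the functionality under test and verify the outcome ...

LOCAL Initialize()
// Fresh fixture part: cheap cleanup that runs before every single test.
CLEARLASTERROR;
IF IsInitialized THEN
  EXIT; // the shared fixture was already built by an earlier test in this codeunit

// Shared fixture part: expensive setup that runs only once per test codeunit.
CreateSharedSetupData; // hypothetical helper: posting setup, number series, master data, ...
IsInitialized := TRUE;
COMMIT; // persist the shared fixture so a failing (and thus rolled-back) test cannot undo it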
Keep the good work going ... (as I still have 200 of 720 customized ;-)