NAV Performance is often discussed in technical terms like index tuning and hardware sizing.
I would like to share some thoughts on how settings in the application can also affect performance. I hope this post can be of interest to some of the non-technical people who are engaged in system setup.
Performance as a shared responsibility in a project
The way we set up the NAV application can have a huge impact on performance. My experience is that performance is often seen as something that just needs to be fixed by the technicians, no matter what functionality is used or how the parameters are set in the application.
Lately I have been involved in the early stages of some projects, talking to customers and consultants about performance and about how the way they set up the application can have an impact on it. This has been great, since all parties involved start to recognise performance as something that needs to be considered during a project.
Size vs. Features
NAV is used in everything from really small installations to ones with hundreds of users. The features that You can/should set up in the application depend on the size of the implementation. In a database with 10 users on a decent server You do not need to worry so much about performance. You can go ahead and use dimensions, analysis views, automatic cost posting etc. Your budget probably doesn't allow a true BI platform either, so You use NAV as Your reporting and analysis tool and need that kind of setup.
But when You are going to use NAV with 50+ or maybe 100/200+ users You need to start thinking carefully about how to set up NAV and what functionality You plan to use. I would recommend You to look at some of the pre-built BI packages that are out on the market. In a big implementation You should be careful about setting up NAV to support multidimensional analysis, splitting Your income statement too much etc. There are great BI tools out there that will do that analysis for You without the need for a lot of dimensions and a huge chart of accounts.
OLTP vs. OLAP
The bottom line is that the bigger the solution, the more important it is to focus on the OLTP capabilities in NAV and not use so much of the BI capabilities. You can't have both OLTP and OLAP in the same database in a large installation.
Which features in NAV do we need to think twice about when it comes to performance? Well, I've tried to compile a list below with some of the most important areas. Please remember that this is written strictly from a performance perspective and applies to bigger installations with typically 50+ users and a 50+ GB database.
Dimensions
From version 3.X we have the possibility to use more dimensions than just the two global dimensions. It is common practice to use dimensions for geographical area, customer group etc. These kinds of dimensions are typical in a BI solution, where they are built up from information on the customer card instead of being stored together with the transactions.
You should try to keep the use of dimensions to a minimum. If possible I would recommend You to only use the two global dimensions and instead put some money into a BI solution to take care of the multidimensional analysis. A BI solution does that a lot better than NAV. Think NAV = OLTP.
I wish Microsoft would give us an option to choose whether or not we want to use dimension entries (it's probably just a matter of disabling the DocDim.INSERT in the function UpdateDocDefaultDim in codeunit 408). That would of course affect functionality in NAV that is based on dimensions, but could be a great thing in a large implementation where the BI is done outside the NAV box. Anyone who has tried this?
Change Log
The change log was created to enable an audit trail of application setup. As such the change log is great. But often people are a bit too creative. I often see the change log activated on hot tables like Sales Header/Line. The problem is that all logging ends up in the single Change Log Entry table, so if You set up the change log on many tables, updates in each table will affect locking for all other tables that are being logged.
So use the change log with caution. Do NOT use it for hot tables and avoid using it for master data. If possible, only use the change log for setup tables like Chart of Accounts, Posting Setup etc.
Analysis Views
I think it's quite obvious already what I think about loading NAV with a lot of functionality for reporting and analysis. In a small database it might be nice to have, but in a larger company I strongly advise looking at a BI solution instead and optimizing the settings in NAV for transaction crunching. If You are going to use analysis views, they should at least not be updated when posting. Use the batch job instead to update them.
Automatic Cost Posting
In the Inventory Setup You can choose whether or not You want automatic cost posting. If enabled, it means that for every inventory posting the general ledger will be updated. If You don't need the COGS and inventory updated instantly, You can post inventory cost to the general ledger with the batch job instead.
Automatic Cost Adjustment
This option makes it possible to automatically adjust cost on inventory decreases, and to control how far back NAV should look for inbound entries to get the cost from. This is the same logic that's executed by the batch job for cost adjustment, and it comes with a performance penalty. So my advice is to set this option to "Never" and run the batch job for cost adjustment instead.
Expected Cost Posting
Expected cost posting means that NAV posts to the general ledger when a shipment/receipt is posted and reverses that entry when the invoice is posted. This creates extra postings and should be avoided if possible.
Chart of Accounts and Posting Setup
The chart of accounts and the posting setup might seem trivial, and it's easy to just set up more accounts and posting setups. But You must think of the consequences when e.g. posting an invoice. The more combinations You have in the posting setup, the more G/L entries You will get. Together with dimensions it can be a large number of entries in the database for each posting.
A small example:
Let's say You have a sales order with 10 lines and 10 different prod. posting groups on them. Let's also assume that You have 4 dimensions on each line and automatic cost posting enabled. When You ship and invoice, You will get a total of over 300 dimension entries. If You go down to 1 posting group for items, use only 2 dimensions and disable automatic cost posting, You will have fewer than 100 dimension entries. That's a factor of 3. This is no exact math I have done here, but the choice of how to set up posting and dimensions does have an impact on the database load.
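To make the example concrete, here is a back-of-the-envelope model in Python. This is my own rough approximation, not NAV's actual posting logic; the entry counts per posting are simplified assumptions, but it shows how posting groups, dimensions and automatic cost posting multiply the number of dimension entries per order:

```python
# Rough sketch, NOT NAV's exact posting logic: assume every ledger
# entry (item, value, customer, G/L) and every document line carries
# one dimension entry per dimension in use.
def estimate_dimension_entries(lines, posting_groups, dimensions,
                               automatic_cost_posting):
    item_ledger = lines                  # one item ledger entry per line
    value_entries = 2 * lines            # shipment + invoice value entry
    sales_gl = posting_groups            # one sales G/L entry per combination
    vat_gl = posting_groups              # one VAT G/L entry per combination
    cost_gl = 2 * posting_groups if automatic_cost_posting else 0  # COGS + inventory
    customer_ledger = 1
    document_lines = lines               # default dimensions on the order lines
    ledger_entries = (item_ledger + value_entries + sales_gl + vat_gl
                      + cost_gl + customer_ledger + document_lines)
    return ledger_entries * dimensions

heavy = estimate_dimension_entries(10, 10, 4, True)    # 324, over 300
light = estimate_dimension_entries(10, 1, 2, False)    # 86, under 100
```

The exact numbers depend on Your posting setup, but the ratio between the heavy and the light configuration is in line with the factor of 3 above.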
In the sales setup You can choose whether or not You want separate G/L entries for line discounts. It's not unusual to see this option enabled but, at the same time, the same accounts set up for sales and sales line discount in the posting setup. If You have line discounts on all Your order lines, You can double the number of G/L entries from the sales/purchase lines with this setting.
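The arithmetic is simple. In this sketch (a hypothetical helper of mine, not a NAV function), enabling separate line-discount entries means each discounted line posts one entry for the sales amount and one for the discount amount:

```python
def sales_gl_entries(order_lines, separate_line_discount_entries):
    # With the option enabled, each discounted line posts one entry for
    # the sales amount and one for the line discount amount.
    entries_per_line = 2 if separate_line_discount_entries else 1
    return order_lines * entries_per_line

sales_gl_entries(10, False)  # 10 G/L entries from the lines
sales_gl_entries(10, True)   # 20 G/L entries - doubled
```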
Dynamic Low-Level Code
This option tells NAV to assign and calculate low-level codes for each component in a product structure. If You create a lot of production structures this can slow Your system down drastically. I was engaged in a case where a customer had this enabled together with a lot of make-to-order, and it was a disaster. Just unchecking this check box really made a huge difference!
If You disable this option You must run the batch job for calculating low-level codes manually before You calculate a plan. But it's a lot better to do it just then instead of every time You release a structure.
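For readers who wonder what is actually being computed: an item's low-level code is the deepest level at which the item occurs in any product structure. A simplified Python sketch of that calculation (my own illustration with invented item numbers, not NAV code):

```python
def calc_low_level_codes(boms):
    """boms maps a parent item no. to its list of component item nos."""
    codes = {}

    def visit(item, level, ancestors):
        if level > codes.get(item, -1):
            codes[item] = level            # keep the deepest level seen
        for component in boms.get(item, []):
            if component in ancestors:
                raise ValueError("circular product structure")
            visit(component, level + 1, ancestors | {item})

    for item in boms:
        visit(item, 0, set())
    return codes

# BIKE is a top-level item; WHEEL is both a parent and a component
calc_low_level_codes({"BIKE": ["WHEEL", "FRAME"], "WHEEL": ["SPOKE"]})
# → {"BIKE": 0, "WHEEL": 1, "SPOKE": 2, "FRAME": 1}
```

The point is that this is a full traversal of the product structures, which is why recalculating it on every release of a structure gets expensive when You have many deep BOMs.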
If You use cubes in Analysis Services You usually build Your dimension hierarchies based on fields on e.g. the customer card.
On the customer card You have fields for country, customer group and customer no. and name. That's a typical dimension hierarchy for the customer object in a cube. The transactional data is linked to this hierarchy based on the customer no.
When You try to achieve something similar in NAV with dimensions, You usually set up default dimensions on the customer for country and customer group. Then for each transaction there are dimension entries posted for these dimension levels together with the transactional data. But how often does the customer change country or customer group? And my experience is that when that happens, our customer (the one running the NAV system) often wants the historical data updated (which is not needed when You build the dimension hierarchy based on data on the customer card).
So by using an OLAP solution and data from customer, item etc. to build hierarchies You do not have to use dimensions to the same extent in NAV.
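As an illustration (with invented sample data), here is the cube-style approach in miniature: the group attribute lives only on the customer record and is joined in at analysis time, so reclassifying a customer restates all of history without any reposting:

```python
customers = {
    "C001": {"country": "SE", "customer_group": "RETAIL"},
    "C002": {"country": "NO", "customer_group": "WHOLESALE"},
}
# Transactions store only the customer no. - no group/country dimensions
transactions = [
    {"customer_no": "C001", "amount": 100},
    {"customer_no": "C002", "amount": 250},
    {"customer_no": "C001", "amount": 50},
]

def sales_by_group(transactions, customers):
    totals = {}
    for t in transactions:
        group = customers[t["customer_no"]]["customer_group"]
        totals[group] = totals.get(group, 0) + t["amount"]
    return totals

sales_by_group(transactions, customers)   # {"RETAIL": 150, "WHOLESALE": 250}

# Move C001 to another group: all history follows, nothing is reposted
customers["C001"]["customer_group"] = "WHOLESALE"
sales_by_group(transactions, customers)   # {"WHOLESALE": 400}
```

Had the group been stored as a dimension entry on each transaction, the reclassification would instead have required updating every historical entry.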