Microsoft Dynamics NAV/SQL Server Configuration Recommendations

The Microsoft Dynamics NAV team has published recommendations for how to set up SQL Server for NAV. This is a great initiative, since this kind of documentation has been lacking for a long time.

http://blogs.msdn.com/b/nav/archive/2010/09/28/microsoft-dynamics-nav-sql-server-configuration-recommendations.aspx

Here are some comments on the content:

  • We are recommended to leave auto create/maintain stats enabled.
    On the contrary, I have seen terrible performance (mostly in drill-downs from the chart of accounts) caused by auto create/maintain stats. Once I had used Jörg Stryk's excellent script for deleting the automatically created statistics, the problems disappeared. Of course, if auto create/update is disabled you need to set up a job for nightly maintenance of statistics with "indexonly" (see the first sketch after this list).
  • Number of Tempdb files
    It's interesting to see that we are still advised to use one tempdb data file per CPU. There is a note that this is due to SGAM page contention. This was addressed in SQL Server 2005, but I found this when googling it, and since it seems to be a reliable source the advice still appears to be valid (a sketch of adding tempdb files follows after this list).
  • Default Isolation Level
    There are some interesting notes about changing the isolation level from SERIALIZABLE to REPEATABLEREAD. I'm not that deep into isolation levels and would like to hear from anyone with practical experience of changing the default behaviour (the difference is sketched after this list).
  • Index Defragmentation
    I'm glad to see that Microsoft has published a script for index defragmentation (the general pattern is sketched after this list).
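
For the statistics point, here is a minimal T-SQL sketch of the approach - it is not Jörg Stryk's script, just an illustration. The database name NAVDB is made up, and the auto-created statistics are identified via the auto_created flag in sys.stats (they also carry the _WA_Sys_ name prefix):

    -- Sketch only: disable automatic statistics and drop the ones the optimizer
    -- has already created. [NAVDB] is a placeholder for the NAV database name.
    ALTER DATABASE [NAVDB] SET AUTO_CREATE_STATISTICS OFF;
    ALTER DATABASE [NAVDB] SET AUTO_UPDATE_STATISTICS OFF;
    GO

    USE [NAVDB];
    GO

    DECLARE @sql nvarchar(max) = N'';

    SELECT @sql += N'DROP STATISTICS '
                 + QUOTENAME(SCHEMA_NAME(o.schema_id)) + N'.'
                 + QUOTENAME(o.name) + N'.'
                 + QUOTENAME(s.name) + N';' + CHAR(13)
    FROM sys.stats   AS s
    JOIN sys.objects AS o ON o.object_id = s.object_id
    WHERE s.auto_created = 1
      AND o.type = 'U';

    EXEC sys.sp_executesql @sql;
    GO

    -- Nightly job step matching the "indexonly" advice: refresh index statistics
    -- only, table by table (sp_MSforeachtable is undocumented but widely used).
    EXEC sys.sp_MSforeachtable N'UPDATE STATISTICS ? WITH FULLSCAN, INDEX;';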
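
For the tempdb point, this is roughly what evening out the file count looks like. The example assumes a 4-core box; the file names, the T:\tempdb path and the sizes are made up - the point is that all data files are equally sized with the same growth setting:

    -- Sketch only: bring tempdb up to four equally sized data files.
    ALTER DATABASE tempdb
        MODIFY FILE (NAME = tempdev, SIZE = 2048MB, FILEGROWTH = 512MB);

    ALTER DATABASE tempdb
        ADD FILE (NAME = tempdev2, FILENAME = N'T:\tempdb\tempdev2.ndf',
                  SIZE = 2048MB, FILEGROWTH = 512MB);

    ALTER DATABASE tempdb
        ADD FILE (NAME = tempdev3, FILENAME = N'T:\tempdb\tempdev3.ndf',
                  SIZE = 2048MB, FILEGROWTH = 512MB);

    ALTER DATABASE tempdb
        ADD FILE (NAME = tempdev4, FILENAME = N'T:\tempdb\tempdev4.ndf',
                  SIZE = 2048MB, FILEGROWTH = 512MB);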
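
To show what the isolation level difference means at the plain T-SQL level (this is not how NAV itself issues the change, and the table name is made up): under SERIALIZABLE the scanned key range is locked, so other sessions cannot insert rows into that range until the transaction commits; under REPEATABLE READ only the rows actually read are locked, so phantom inserts remain possible.

    -- Sketch only: the same read under the two isolation levels.
    SET TRANSACTION ISOLATION LEVEL REPEATABLE READ;   -- or SERIALIZABLE
    BEGIN TRANSACTION;

    SELECT [No_], [Name]
    FROM dbo.[Cronus$Customer]                         -- made-up NAV-style table name
    WHERE [No_] BETWEEN '10000' AND '20000';

    -- A later re-read of the same rows within this transaction sees the same values
    -- under both levels; only SERIALIZABLE also blocks new rows in the scanned range.

    COMMIT TRANSACTION;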
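
For reference, the general pattern such a defragmentation script follows looks like this - it is not the script Microsoft published, just a sketch using the common 5%/30% reorganize/rebuild thresholds:

    -- Sketch only: reorganize moderately fragmented indexes, rebuild heavily
    -- fragmented ones. The thresholds and 1000-page cutoff are conventional choices.
    DECLARE @sql nvarchar(max) = N'';

    SELECT @sql += N'ALTER INDEX ' + QUOTENAME(i.name)
                 + N' ON ' + QUOTENAME(SCHEMA_NAME(o.schema_id)) + N'.' + QUOTENAME(o.name)
                 + CASE WHEN ps.avg_fragmentation_in_percent >= 30
                        THEN N' REBUILD'
                        ELSE N' REORGANIZE'
                   END + N';' + CHAR(13)
    FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ps
    JOIN sys.indexes AS i ON i.object_id = ps.object_id AND i.index_id = ps.index_id
    JOIN sys.objects AS o ON o.object_id = ps.object_id
    WHERE ps.avg_fragmentation_in_percent >= 5
      AND ps.index_id > 0            -- skip heaps
      AND ps.page_count > 1000       -- ignore tiny indexes
      AND o.type = 'U';

    EXEC sys.sp_executesql @sql;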

/Lars


Comment List
  • I also read the recommendation to leave auto create/maintain stats enabled. We also prefer to turn auto create/maintain stats off on solutions with heavy loads.

    It would be nice to see some authoritative testing on this!

    As a side note, I have experienced that it is necessary to update statistics more than once a night when processing huge invoicing batch jobs; otherwise performance turned out to be very poor (a sketch of such a targeted update follows below).

    But those stat updates complete in under five minutes when the update statistics job is run.

    I think the cost of having auto create/maintain stats on would be far more than those five minutes needed to run the statistics update - but again, it would be nice to see some testing on this.

    But then again, the need for statistics updates depends on the amount of data processed, the number of tables/indexes used and how they interact, which makes measuring this fairly complex.

  • That index defragmentation script is nothing new. It's been around for years.
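
The first comment above mentions refreshing statistics between heavy invoicing batch runs; a sketch of what such a targeted update could look like (the company prefix and table names are made-up NAV-style placeholders):

    -- Sketch only: refresh statistics on the tables a large invoicing run hits.
    UPDATE STATISTICS dbo.[Cronus$Sales Invoice Header] WITH FULLSCAN;
    UPDATE STATISTICS dbo.[Cronus$Sales Invoice Line]   WITH FULLSCAN;
    UPDATE STATISTICS dbo.[Cronus$Cust_ Ledger Entry]   WITH FULLSCAN;
    UPDATE STATISTICS dbo.[Cronus$G_L Entry]            WITH FULLSCAN;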
