What I found and what I know about Microsoft Dynamics NAV
After a long time, I have finally found some time to write this article.
The last few months were full of “Crete” testing, including the PowerShell scripts for merging NAV objects. And I must say, I have started to love PowerShell. Things you needed to do manually, or needed some “hack” to do automatically (like using ROT), you can now do with PowerShell. And I have started to create scripts for the things I actually need to do. You can find the scripts in the NAVScripts CodePlex project, which is open to you, and if you want, you can contribute. I want this project to store different scripts “made by NAV developers for NAV developers”.
Currently you can find these scripts there:
this script takes two branches in a Git repository and merges the objects in them. You can solve the conflicts in KDiff3 or Araxis Merge (or any other tool if you want). This way you can “work around” the standard Git merge mechanism. I merged Rollup Update 9 into our add-on database in just 5 minutes! Merging the updated add-on into our customer's customized database took another 3 minutes of work! In this way, you get a nice “train station plan” (my internal name for it) like this:
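To give you an idea, the core of such a merge can look roughly like this. This is a minimal sketch, not the actual NAVScripts code: it assumes the NAV model tools module is installed, and all paths, the module location, the conflict subfolder layout and the KDiff3 call are example assumptions:

```powershell
# Sketch only - assumes the NAV model tools are installed (module path is an example)
Import-Module 'C:\Program Files (x86)\Microsoft Dynamics NAV\71\RoleTailored Client\Microsoft.Dynamics.Nav.Model.Tools.psd1'

# The three versions exported from the two Git branches and their common ancestor
$Original = 'C:\Merge\Original'   # common ancestor of both branches
$Modified = 'C:\Merge\Modified'   # e.g. the branch with Rollup Update 9
$Target   = 'C:\Merge\Target'     # e.g. our customized branch
$Result   = 'C:\Merge\Result'     # merged object files go here

# Three-way merge of all object files in one go
$mergeResult = Merge-NAVApplicationObject -OriginalPath $Original -ModifiedPath $Modified `
    -TargetPath $Target -ResultPath $Result -Force

# Anything the cmdlet could not merge is left as a conflict for a human brain;
# open the three versions in KDiff3 (paths are examples)
Get-ChildItem -Path $Result -Filter '*.CONFLICT' | ForEach-Object {
    $name = [IO.Path]::ChangeExtension($_.Name, '.TXT')
    & 'C:\Program Files\KDiff3\kdiff3.exe' `
        (Join-Path $Result "ConflictOriginal\$name") `
        (Join-Path $Result "ConflictModified\$name") `
        (Join-Path $Result "ConflictTarget\$name") `
        -o (Join-Path $Result $name)
}
```

The point is that the whole branch content is merged in one cmdlet call, and only the conflicting objects need interactive attention.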
another script, which you can use after you merge the objects. This script does these steps for you automatically:
As you can see, the result should be a FOB file with compiled objects, ready for import into the target database. It is something like an “automatic build process”. You could easily add a part to run automatic testing…
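The steps above could be sketched like this. Again a sketch under stated assumptions: it presumes the development shell cmdlets (or equivalent finsql wrappers) are available, and the database name and paths are examples:

```powershell
# Sketch of the "automatic build" steps; database name and paths are examples.
$Db = 'BUILD_DB'

# 1. Join the merged per-object files into one import file
Join-NAVApplicationObjectFile -Source 'C:\Merge\Result\*.txt' -Destination 'C:\Build\AllObjects.txt'

# 2. Import into the build database and compile everything
Import-NAVApplicationObject -Path 'C:\Build\AllObjects.txt' -DatabaseName $Db `
    -ImportAction Overwrite -Confirm:$false
Compile-NAVApplicationObject -DatabaseName $Db

# 3. Export the compiled objects as a FOB, ready for the target database
Export-NAVApplicationObject -DatabaseName $Db -Path 'C:\Build\Result.fob'
```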
The scripts contain many functions you can use in your own scripts. You can export/import/compile objects to, e.g., update the files in the repository, update your database from the repository, merge version lists, etc.
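For example, refreshing the per-object files in a repository working copy could be sketched like this (database name, filter and paths are example assumptions):

```powershell
# Sketch: export the changed objects and split them into one file per object,
# so the repository working copy can be updated and committed.
Export-NAVApplicationObject -DatabaseName 'ADDON_DB' -Path 'C:\Temp\Objects.txt' -Filter 'Modified=1'
Split-NAVApplicationObjectFile -Source 'C:\Temp\Objects.txt' -Destination 'C:\Repo\Objects' -Force
```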
I hope that you will find these scripts useful and that you will be able to extend them as you need. All is available from CodePlex here:
You can even fork/clone the Git repository and contribute…
Every NAV developer should learn PowerShell, because it can save us a big chunk of time. And with the emphasis on keeping customers up to date, we will need to push the upgrade cost down. You cannot do that without automating the upgrade process. Correctly written scripts are a good tool for doing your work. They will never replace you, because there are still exceptions which need a human brain to solve, but that is no longer 100% of the process; now it is maybe 10%.
Merging a new version of our add-on into a customer database took 1-2 days in some cases. Now, with script support, it is just 2-3 hours, of which 1.5-2.5 hours are automated and only about half an hour is "conflict solving".
You can find many sources for getting started with PowerShell, but one for all: https://www.youtube.com/watch?v=MnWKPdkGFSU
Interesting post, thank you. One question, though: do you let Git handle the initial merge and only use the cmdlet when there is a conflict, or do you always use the cmdlet? My impression with SVN and Mercurial (also Git, but to be frank, I haven't quite grokked Git yet) was that they use revisions or changesets for the merge. Meaning, when you have a development tree and merge the head (or a revision) into another tree, all revisions/changesets down to the common ancestor will be considered in the merge. This is a PITA in SVN; with Mercurial I'm quite satisfied... if not for the usual header conflicts and some other issues related to new functions and the comments section. With Mercurial (or Git) you will get the conflict only once, so I can live with it.
I think it's also noteworthy that these days you absolutely require a version control system with a base version tree in it. I would prefer to have a repository online from which I could clone/fetch. To date, I spend some time each month integrating the Cumulative Updates into my trees. Also, there is no nice solution for language layers :( I keep a DE and a W1 tree for this reason, and merge/link them for the same build levels.
with best regards
@jglathe> I am using the cmdlet to merge instead of the Git tools. The cmdlet does the merge and then calls git merge, but with the "ours" merge strategy, thus keeping what the script prepared as the result. The script solves the header info merge too (the default cmdlet just takes the header from one selected version; the script really "merges" the version list...). Because I am always merging same-language versions, I have no problems with languages, but I know that Microsoft is working on it, and I think it will be solved in some future version...
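The "ours" trick can be illustrated like this (a sketch; branch names, paths and the commit message are example assumptions):

```powershell
# Sketch: record the merge in Git without letting Git touch the content.
# The cmdlet has already produced the merged files in C:\Merge\Result.
git merge --no-commit -s ours update-branch          # example branch name

# Overwrite the working copy with what the script prepared, then commit,
# so the merge commit contains the cmdlet's result, not Git's.
Copy-Item 'C:\Merge\Result\*.txt' 'C:\Repo\Objects' -Force
git add -A
git commit -m 'Merged update-branch via Merge-NAVApplicationObject'
```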
I have tried to create my own diff and merge driver for Git, but I had problems with calling PowerShell scripts as Git custom drivers (it is designed for command-line commands etc., but how do you easily use PowerShell without importing the PowerShell modules again and again for each triggered command...). And the performance of that is terrible, because it merges files one by one, and the cmdlets are not "happy" with that. That is why I take copies of the revisions into a temp folder, merge them in one go, solve the conflicts and return the result back to the repository... It is not an ideal, but a quite well-working solution.
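For completeness, a custom merge driver would be wired up roughly like this (a config sketch; the script path `C:/Scripts/Merge-NAVObject.ps1` is hypothetical, and the per-file PowerShell start-up is exactly what makes this approach slow):

```ini
# .git/config (sketch) - %O/%A/%B are the ancestor, ours and theirs files
[merge "nav"]
    name = NAV object merge
    driver = powershell.exe -NoProfile -File C:/Scripts/Merge-NAVObject.ps1 %O %A %B

# .gitattributes - route object files to the custom driver
# *.txt merge=nav
```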
Regarding the repository: yes, I agree, but I understand why it is not like this. A few years ago I was thinking about creating such a repo, e.g. on GitHub, but I am not sure whether there would be legal problems with that.