I’m not going to blog about each version of this module, but for this update, I felt obligated to share a few words. Primarily because everything has changed :-). Not one function is the same: I renamed all of the functions to more convenient names, as they were quite long (not to say: ridiculous ;-)).
For that reason, I have edited my previous blogpost, where I introduced the new module. Still, since I (think I) made it a little bit easier for you to get started with it, let me share a few short paragraphs on the tool again (some of it is repeated, but you know – Repetitio Mater Studiorum Est ;-)).
This tool has been created from my own development experience, where I work with multiple machines. It might be all on my laptop or partly on Azure – in any case, it’s 3 machines, which need to be managed in the same way: by running PowerShell scripts remotely. And THAT’s the purpose of this module – to make that somewhat easier ;-).
To get started with this module, there are two things you need to do first.
Enable Remote PowerShell on the Docker Host
You need to make sure that your docker host is remotely accessible by PowerShell. I’m not the expert in that, but what helped me to set that up were AJ’s blog and also this one.
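In a nutshell – and do check those blogs, because the details (firewalls, HTTPS, credentials) matter – it comes down to something like this (the host name is just an example):

```powershell
# On the Docker host (elevated prompt): enable PowerShell remoting.
Enable-PSRemoting -Force

# On your own machine: trust the host when it's not domain-joined.
# 'waldocorevm' is just an example host name.
Set-Item WSMan:\localhost\Client\TrustedHosts -Value 'waldocorevm' -Force
```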
Install the dependent Modules
The module depends on some other modules, depending on the functionality you are going to use. Making these modules available is quite easy: I created a function for that, which is part of the module: Install-RDHDependentModules. Just run it like this, and all is set up:
Install-RDHDependentModules `
    -DockerHost $DockerHost `
    -DockerHostCredentials $DockerHostCredentials `
    -DockerHostSessionOption $DockerHostSessionOption
In a future version, I will probably include this in some other functions, just to prep the container if it isn’t already prepped.
You’re good to go!
So, when the above is done, you’re good to go. For each remote environment, I tend to make a separate folder, like you see here. Each folder holds a set of scripts, one of which is the settings-file, usually with specific sessions for this docker host. This settings-script is usually run first in any other script. It makes it easy: just figure out the remote execution settings once, and you’re good to go for all your specific scripts. Just take these folders as an example.
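As an illustration, such a settings-file could look something like this (a hypothetical sketch – the variable names simply follow the parameters of Install-RDHDependentModules, and the host name is made up):

```powershell
# _Settings.ps1 - hypothetical example of per-host connection settings.
# Variable names follow the Install-RDHDependentModules parameters; values are made up.
$DockerHost = 'waldocorevm'
$DockerHostCredentials = Get-Credential -Message "Credentials for $DockerHost"
$DockerHostSessionOption = New-PSSessionOption -SkipCACheck -SkipCNCheck
```

Dot-source it at the top of every other script in the folder, and you never have to think about the connection details again.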
Also useful for upgrades?
You see a folder “UpgradesOnWaldoCoreVM” – this is now how I actually perform my upgrades – so yes, definitely useful. But how that works – that’s for a future blogpost – because I still need to clean it up a bit ;-).
I have been struggling to get my head around managing “stuff” on Docker. You know – the new kid in town that will see its use in many fields. We implemented a Docker-based environment for our build server, for local dev, and now also for test servers for customers. And as you know – you can use it for much more!
In any case, you need docker “somewhere”. Some people install it on their Windows 10 – but that is something I would never recommend, because in my opinion, it belongs on a “server”, simply because the memory management is much better there.
You are probably familiar with AJ’s blogpost on how to setup a nice local Docker Environment based on a Windows Core in a Hyper-V. I find it the ideal way to work with docker images locally, to be used for local development, testing, previewing, … you know .. a flexible and clean way to manage many different builds of Dynamics NAV.
You are also probably quite familiar with the navcontainerhelper. I absolutely love that PowerShell module that Freddy created. It basically solves the problem of calling typical Dynamics NAV cmdlets remotely. When you look at the code (it’s all available on GitHub, by the way), it’s full of “Invoke-Command”, meaning that a function in the navcontainerhelper-module executes a number of NAV cmdlets remotely on the container. It’s much easier to work with, because you avoid having to enter the container session and execute the cmdlets one-by-one.
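To illustrate that pattern, here is a simplified, hypothetical sketch (NOT the actual navcontainerhelper source – the function name is made up) of what such an Invoke-Command wrapper looks like, assuming PowerShell 5.1+ on the Docker host, run as administrator:

```powershell
# Simplified sketch of the Invoke-Command pattern (NOT the actual navcontainerhelper source).
function Get-NavServerInstanceFromContainer {
    param([string] $ContainerName)

    # Resolve the container id and open a session directly into the container
    $containerId = docker inspect --format '{{.Id}}' $ContainerName
    $session = New-PSSession -ContainerId $containerId -RunAsAdministrator

    try {
        # Run the NAV cmdlet inside the container, instead of entering it manually
        Invoke-Command -Session $session -ScriptBlock {
            Get-NAVServerInstance
        }
    }
    finally {
        Remove-PSSession $session
    }
}
```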
So, what is this new module all about then?
Well, the downside of having a windows core docker host – or any “remote” docker host for that matter – is that you still have to enter remote sessions, and execute PowerShell from there. Just consider the following scenarios:
So, in both cases – you need to log into another machine. And all I want to do is run repetitive tasks time and time again.
That’s where my new module comes into play. It’s definitely not a replacement for the navcontainerhelper – on the contrary – it depends on it for some tasks. But it goes one step further, for the cases where you are not working on the same machine where you have installed Docker – which, in my case, appears to be most of the time.
I created two examples. Basically for me to be able to test the module, but also to illustrate in what way you are able to use it. All scripts where I use the module can be found on my github in this folder: https://github.com/waldo1001/Cloud.Ready.Software.PowerShell/tree/master/PSScripts/NAV%20Docker/RemoteDocker
You see it contains a number of folders. Each folder represents a scenario. So, it contains scripts to be run against an AzureVM (where you enabled Remote PowerShell), and the other folder “waldocorevm” works against my local VM that contains docker.
Both folders contain a _Settings.ps1, which basically contains the connection details for that specific scenario. All the other scripts in the folder work with these settings and with the CRS.RemoteNAVDockerHostHelper-module, to do what it’s intended for.
For example, if you want to test upgrading an app, just look at UpgradeApp.ps1 and you’ll see it’s quite simple. No remoting, nothing – just calling a “simple” PowerShell function from the module, and all will be done for you in the background. Here is an example of the clean-script being executed from my laptop, which is going to remove all apps on a container of my VM on Azure:
I know, nothing much exciting, but if you know it is actually “remoting” into two sessions – well – quite cool, if I may say so myself ;-). This is the clean-function from the module:
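Since the actual code is a screenshot, here is a simplified, hypothetical sketch of what such a clean-function does (function and parameter names are illustrative, not the actual module source): it remotes into the Docker host, and from there uses navcontainerhelper to work on the container – the “two sessions” I mentioned.

```powershell
# Hypothetical sketch - NOT the actual module source.
function Clean-NAVAppsOnRemoteDockerHost {
    param(
        [string] $DockerHost,
        [pscredential] $DockerHostCredentials,
        [string] $ContainerName
    )

    # First hop: into the Docker host ...
    Invoke-Command -ComputerName $DockerHost -Credential $DockerHostCredentials -ScriptBlock {
        param($ContainerName)

        # ... where navcontainerhelper (assumed to be installed there) makes the
        # second hop into the container, unpublishing every app it finds.
        Get-NavContainerAppInfo -containerName $ContainerName |
            ForEach-Object {
                UnPublish-NavContainerApp -containerName $ContainerName -appName $_.Name
            }
    } -ArgumentList $ContainerName
}
```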
One thing worth mentioning is that it’s also going to take care of the App-file that you have on your local machine, but might want to publish way up there in the cloud. Just look at Copy-NAVAppToDockerHost, which is used by Install-NAVContainerAppOnDockerHost for example :-). Simple, and efficient!
Where can I get this module?
Well, the easiest is to just download it from the PowerShell Gallery, by executing:
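Presumably (the actual command is shown as an image in the post), with the module name from above, that boils down to:

```powershell
# Module name as mentioned earlier in the post
Install-Module -Name CRS.RemoteNAVDockerHostHelper
```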
But it is also available on my Github alongside my other modules: https://github.com/waldo1001/Cloud.Ready.Software.PowerShell
Any feedback is useful! If you want me to address specific things, the most convenient way is to provide feedback on github. But on this blog is fine as well ;-). You are always welcome to fork the project, like many others already have done, by the way :-).
If you would like to read up on remote powershell, well, there is a lot of info out there. Just a few examples which were useful for me:
It’s the week of “undocumented features”, apparently. Today: code analysis. Apparently, the al-language extension comes with built-in analysis for your al-code. Thing is: it’s disabled by default. Just go to settings (File/Preferences/Settings) and enable it.
You enable it by adding a line to the settings.json of your environment (the right pane). Here is my settings.json-file:
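The original post shows mine as a screenshot; the setting in question is al.enableCodeAnalysis, so a minimal settings.json would look something like:

```json
{
    "al.enableCodeAnalysis": true
}
```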
You will get quite a number of “topics” it will analyse your code for. On the points it has issues with, it will show you a green line in your code, and you can see the exact “problem” as a tooltip when you hover over it, or in the “problems”-pane (CTRL+j / Problems). Here are just a few examples:
In this piece of code, I have 3 problems, which you can read below:
When hovering over the symbol with green line, you can see the problem. In this case, an unused variable:
What do you think of this one: you risk an overflow when you use this code:
Or one of my favorites: code that will never get executed:
Can I create my own analysis?
You can, but it’s a little bit cumbersome. You can create your own analysis-dll, drop it in the extension-folder, and go from there. I haven’t had a look at it, because I’ll wait until this undocumented-but-already-cool feature gets documented-and-even-cooler and supported, so we are sure that the work we put into it is worth it. In the meantime: thumbs up, Microsoft!
I was told it does have a performance impact, so I guess that’s why it’s disabled by default. I haven’t noticed too much of an impact myself, to be honest – but you are warned ;-).
I recently discovered what appears to be an undocumented feature in the current version of the al language extension (aka the “Modern Dev”, aka DynDev365, aka ExtensionV2, aka VSCode Development Tools for Microsoft Dynamics NAV). And that is “Automatic Object Numbering”.
On any object type, when invoking IntelliSense, you get the lowest available number within the range defined in the app.json.
Consider this example, where I have this config in the app.json:
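The post shows the actual config as a screenshot; for illustration (these numbers are made up, and the exact schema depends on your al extension version), an id range in app.json looks something like:

```json
{
    "idRange": {
        "from": 50100,
        "to": 50149
    }
}
```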
When creating some tables, it gives me the lowest available number in intellisense:
It’s just a matter of engaging IntelliSense in the right context (CTRL+Space). It obviously also works when putting your objects in individual files.
Does it also work for fieldnumbers?
What ya think? Of course it does :-). Here is proof:
From which version is it available?
This is obviously part of the functionality of the VSCode extension, which is still under heavy development (although already usable for NAV2018 development …). I’m personally using version 0.13.15836, which I think is the anniversary update:
Good stuff! Keep it comin’
Today, I was prepping for my 2-day Masterclass about developing Extensions in Visual Studio Code. And finally, I made some time to make a first version of a function that I have been wanting for quite some time: a function to remove all the custom apps from a docker image.
What is the challenge?
Well, if you’re used to Extensions V1 – then you’ll notice things have changed on many levels :-). Let’s just say that it’s darn difficult to lose your data. And in my case here – I want to do exactly that. I want to completely clean my custom apps from the Docker container, including data.
By default, when you uninstall or unpublish the app, the data will not be removed. It will simply stay there in the companion tables or dedicated tables for the App – waiting for you to reinstall the app, or upgrade the app.
Why would I want to remove the data?
Well, just imagine development scenarios, where you are working on multiple apps (dependent or not) on one system, and you want to start from scratch again.
Or – in my case – when preparing for demos, and you want to – again – start from a clean sheet.
OK, show me how!
Well, as said, finally I made some time to put it into a function. And for your convenience, I already put it on my github, you can find it here.
And you see it’s simple. It assumes you’re already using the navcontainerhelper – which you should use when using NAV on Docker (in my opinion). More info on Freddy’s blog.
It is going to search for all apps which are not Microsoft’s, loop over them, and uninstall, clean and unpublish these apps one-by-one.
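Those steps can be sketched with navcontainerhelper cmdlets like this – an illustrative sketch only, not necessarily identical to the actual function on github (and I’m assuming the -Mode Clean sync is what drops the app’s data):

```powershell
# Illustrative sketch - see github for the actual function.
function Clean-CustomNAVAppsOnDocker {
    param([string] $ContainerName)

    # All apps that are not published by Microsoft ...
    Get-NavContainerAppInfo -containerName $ContainerName |
        Where-Object { $_.Publisher -ne 'Microsoft' } |
        ForEach-Object {
            # ... get uninstalled, cleaned (this is what removes the data!) and unpublished
            UnInstall-NavContainerApp -containerName $ContainerName -appName $_.Name
            Sync-NavContainerApp -containerName $ContainerName -appName $_.Name -Mode Clean
            UnPublish-NavContainerApp -containerName $ContainerName -appName $_.Name
        }
}
```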
You can simply call this function like:
Clean-CustomNAVAppsOnDocker -ContainerName navserver
Are there other ways?
Well, you could obviously just ignore this entire blogpost, and simply replace the docker container by re-installing it. But in that case, you might lose data that you might depend on, or settings, or accessibility, or … .
If Docker still looks a little bit like this to you:
You might just want to reconsider re-installing the container, and just go for the function I talked about ;-).