This week I conducted a pre-conference workshop at NAV TechDays 2015 in Antwerp with the subject:
Implementing the RoleTailored client with success
The participants spent eight hours learning how to turn a Microsoft Dynamics NAV implementation into a successful customer story.
The workshop was based on my new book:
RoleTailoring Microsoft Dynamics NAV
It is my pleasure to announce that the book is now available for purchase and download from my webshop:
and not only that, the book is also available as a FREE preview. You can find the preview at:
If you would rather purchase the book as paperback, it is possible to preorder the paperback for delivery December 15th.
Enjoy the book.
My Jobs functionality
For companies working a lot with jobs, it could be beneficial to have the same functionality as My Customers or My Items, and it is quite fast to make:
1) Find the My Customers table and page
2) Copy them to My Job table and page
3) Change Customer to Job all over in the objects
4) Add the My Jobs page to the Project Manager Role Center
5) Add a new custom filter in codeunit 41
The Role Center page:
Here is the result:
Now all that is needed is to create a new custom filter in codeunit 41 as described next.
Creating new custom filters
In Dynamics NAV it is possible to use a number of custom filters:
%ME Returns the user ID of the user that is logged on.
%USER Does the same.
%COMPANY Returns the company name as in the company identifier from Companies, not the company name from Company Information.
%MYCUSTOMERS Returns a filter with all the customer numbers from the customers in the My Customers page.
%MYVENDORS Returns a filter with all the vendor numbers from the vendors in the My Vendors page.
%MYITEMS Returns a filter with all the item numbers from the items in the My Items page.
How is it possible to create new custom filters?
In codeunit 41 TextManagement, we can find this function:
So if I want to create %MYJOBS, %MYRESOURCENO or %MYSALESPERSONCODE I can add this code:
And change the first line in the GetMyFilter function:
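The code screenshots are not reproduced here, but a sketch of the addition could look like the following. It mirrors the existing handling of %MYCUSTOMERS in codeunit 41; the function and field names are assumptions based on the standard objects and may differ in your version.

```
// New function in codeunit 41 TextManagement (sketch):
PROCEDURE GetMyJobFilter() FilterText : Text
VAR
  MyJob : Record "My Job";
BEGIN
  WITH MyJob DO BEGIN
    SETRANGE("User ID",USERID);
    IF FINDSET THEN
      REPEAT
        IF FilterText <> '' THEN
          FilterText += '|';
        FilterText += "Job No.";
      UNTIL NEXT = 0;
  END;
END;

// ...and in GetMyFilter, extend the list of recognized tokens on the
// first line so that %MYJOBS is resolved through the new function,
// just like %MYCUSTOMERS, %MYVENDORS and %MYITEMS are today.
```

%MYRESOURCENO and %MYSALESPERSONCODE would follow the same pattern, reading from whichever table holds the values for the current user.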
Now it is possible to use the new filter in saved filtered views:
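For example, a saved view on the Job List could filter on the new token like this (a sketch; the No. field is from the standard Job table):

```
Job List - saved view "My Jobs"
  No.: %MYJOBS
```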
If you liked this, you are going to love my new book: RoleTailoring Microsoft Dynamics NAV, which will be in your favourite webshop soon.
Until then, there are always the old ones:
Developing custom reports in Dynamics NAV is a costly affair. Therefore, I recommend all my customers to consider which reports could be replaced with either Cues on the Activity page of the RoleCenter or with a filtered view.
One of the frequent examples is the “Slow Moving Items” report.
Here is a quick example of making this report as a filtered view. First, I go to the Item list page:
Then I add a couple of filters:
The filters are:
· The date filter is set to be within the latest 12 months of the work date.
· No items sold
· No items purchased
· Some items on inventory
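Expressed as concrete filter values on the Item list (the flowfield names are from the standard Item table; the date range is just an example relative to a work date late in 2015):

```
Date Filter:       011114..311015   // the last 12 months up to the work date
Sales (Qty.):      0                // no items sold
Purchases (Qty.):  0                // no items purchased
Inventory:         >0               // some items on inventory
```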
Now save it as a filtered view:
When this is done in configuration mode, the list will be available to all users assigned to this profile.
If you liked this, you are going to love my new book: RoleTailoring Microsoft Dynamics NAV, which will be in your favourite webshop soon.
Until then, there are always the old ones:
After four years with my Mibuso blog I have moved to DynamicsUser.net.
Just a few statistics from the four years with Mibuso:
- 30 Blog posts
- 412,789 hits
- 39,545 unique reads
- More than a thousand followers
In the same period, I have:
- Published three books (The next is on its way to the printer)
- Authored twenty-six “How-to-Videos” for Microsoft
- Reviewed the course material for C/Side Introduction 2013
- Conducted approximately 80 courses about Microsoft Dynamics NAV
Thank you Mibuso for offering a relevant and necessary platform for us bloggers and I’m sorry the upgrade didn’t go as planned.
However, thank you DynamicsUser.net for offering me a new home for my blog.
The old blog posts will be transferred along the way.
Did I mention that my new book, “RoleTailoring Microsoft Dynamics NAV”, is on its way and will be ready in November?
Until then, there are always the old ones:
Have you ever wondered why a posting lands on a specific account?
An order is posted, and all of a sudden a posting lands on a new account that is not one of the usual accounts.
I just posted a purchase order with one of the computer items:
Normally, these are the only postings I should see:
1) Purchase Account 10000
2) VAT Account 2500
3) Vendor -12500
…and after the Inventory Batch:
4) Inventory Account 10000
5) Direct Unit Cost Account -10000
However, looking at the transactions in the G/L Registers I see the following:
This is fine, but the inventory batch has posted this:
Some might already know what has happened, but if I want to know what G/L account 7192 is used for, I can go to the G/L account card:
Again, the name of the account here might give a clue, but that might not always be the case.
Therefore, I use the Where-Used-List to see in which setups the account is being used.
The list is quite extensive:
Now we can go to the General Posting Setup and get an explanation:
Using the online Help, it is possible to find out that the posting is due to the setup on the Item card:
Another very important use of this function is to ensure that a G/L account is not included in any setup before it is deleted.
Now it is here, and it is FREE.
There have been quite a few requests for a successor to my previous book: Manufacturing with Microsoft Dynamics NAV. I have been deep into the code and decided that there are not enough improvements to justify rewriting the whole book, making new screenshots, proofreading and so on.
I have therefore made a whitepaper that is an amendment to my previous book “Manufacturing with Microsoft Dynamics NAV”, and it fills the gap between the manufacturing module of Dynamics NAV 2013, which the book was based on, and the newest Dynamics NAV 2015. The whitepaper is a mix of hardcore functional changes in the Dynamics NAV 2015 application, inspirational uses of the new user experience enhancements, and a couple of “aha experiences” I have had working with the manufacturing module in real life.
Download the whitepaper free here:
· http://b-a.dk: My own home page, where you can also buy a copy of the book Manufacturing with Microsoft Dynamics NAV.
· http://mibuso.com: In the download section; please allow some time for the download to be approved.
· http://dynamicsuser.net: In the download section; please allow some time for the download to be approved.
This was my very provocative opening slide at an internal seminar at Columbus NSC, held for all developers and consultants not too long ago. It might be a bit exaggerated, but not much.
Developers and consultants do not like the RoleCenters because the profiles never contain the information they need. Therefore, they are automatically drawn to the Departments page or to the search bar in the upper right corner of the RoleCenter.
The problem is that the developers and the consultants are not the ones to use the system later. When the implementation is finished and the dust settles, the users have to find their own way around. If that implies searching for functions in the Departments menu or trying to remember the name of a report only used once every year, then it is no wonder that the users find the system difficult to access and understand.
I have conducted more than 150 courses over time with no less than 800 consultants and developers, and in most courses I usually include RoleTailoring as a bonus topic. The opinions vary, but they are usually less than positive.
Some of the courses I have conducted have been at end-user companies after they have been running Dynamics NAV for a while. Here I teach the super users to use the RoleCenters and to create Profiles for the different groups of users. Then we usually end by making a number of two-hour workshops in the different departments, teaching the users how to change their own profile.
It is amazing what a difference it makes in the attitude towards the system. The initial comments are “***-system”, “inaccessible” and “difficult”, but after only a two-hour workshop, I get totally different and much more positive comments.
How did that happen?
Well, for starters, the course materials for Dynamics NAV 2009 included very little material on the RoleTailored client and explained it as if it were just an upgraded Classic client, and apart from a few partners, there has been no real focus on RoleTailoring. The course material on RoleTailoring for Dynamics NAV 2013 was not much better; it was almost a copy of the 2009 material, although the 2013 material did explain the “Configuration” function.
Secondly, the instructors are also consultants, and I have heard more than one of my colleagues state that the FactBoxes take up too much space on the screen and that the first thing they do is disable them. The instructors' use of the Search bar during courses is another example of bad habits transferred from instructors to participants.
What is wrong with the Search bar, you might ask?
The Search bar is a super tool for consultants, developers, administrators or super users who tend to jump between all the different parts of the application every day and, more importantly, who know the names of the functionalities they use. However, for end users performing the same tasks every day or periodically, the RoleCenter must be the portal for all the functionalities they need in their job.
Then how should it have been done?
Allocate the time
First of all: RoleTailoring takes time, and time in an implementation phase costs money. Therefore, it must be recognized as a separate task in the sales and implementation process - usually in the same area as the setup of security groups. The time consumption is not necessarily consultant time but more likely time spent by the super user or administrator at the customer. This way it is not necessarily an extra cost. However, the extra implementation time must be calculated into the time estimated for employees at the customer.
Prepare the profiles up front
The RoleCenters and profiles must be prepared and allocated so that the end users see their future profile from the initial meeting with Dynamics NAV. If the users are forced to work with a different RoleCenter from the beginning, they will develop bad habits, which will be very hard to change later. The worst thing, however, is that the users will not see the benefits of the RoleCenter from the beginning. Imagine a bookkeeper who is forced to use an Order Processor profile because it was the default. They will find it almost impossible to relate to the tasks and functions shown on the RoleCenter and will automatically consider the system to be more focused on sales than finance.
The first task is to clean up the RoleCenter and thereby remove all functionality that is not necessary:
· Remove all unnecessary fields and actions; this will also help the users to focus on the relevant functionality.
· Hide fields not commonly used
· Remove FactBoxes that do not add value
· Add FactBoxes that will add value (develop new ones if needed)
· Clean up the fields in the FactBoxes
· Add all needed functionality to the RoleCenter
· Use “Filtered lists”
· Use Charts
· Use Quick Entry
Where do we go from here?
To spread the word, Columbus NSC has arranged internal seminars and online meetings focusing on the issue. Likewise, we are arranging seminars for customers and end users. For the consultants and developers, I will conduct a pre-conference workshop at NAV TechDays 2015 in Antwerp:
In addition to that, I am finalizing my second book: “RoleTailoring Microsoft Dynamics NAV”, which will be included in the pre-conference fee and given to the participants as an eBook. A preview of the book will be available soon.
Lastly, this will probably not be the last post I write on the subject. So happy RoleTailoring – and I will see you in Antwerp.
One of the topics I address on nearly all my courses is RoleTailoring: how to set up profiles and how to assign them to users. The process of configuring and personalizing the profiles is also a hot topic, and during the implementation process it is recommended to create special shortcuts for each active profile in the company in order to maintain each profile.
Each of these shortcuts points to a specific profile and it can also point to a configuration file containing information on:
· Server name
· Instance name
· Port number
· Credentials type.
An example of the destination of the shortcut could look like this:
"C:\Program Files (x86)\Microsoft Dynamics NAV\70\RoleTailored Client\Microsoft.Dynamics.Nav.Client.exe" -profile:"MACHINE OPERATOR" -settings:"C:\Users\Administrator\Desktop\Manufacturing\ClientUserSettings.config"
The -profile parameter refers to the desired profile name, and adding the -configure parameter will start the client in configuration mode.
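For example, the same shortcut opening the MACHINE OPERATOR profile in configuration mode could look like this (same paths as in the example above):

```
"C:\Program Files (x86)\Microsoft Dynamics NAV\70\RoleTailored Client\Microsoft.Dynamics.Nav.Client.exe" -profile:"MACHINE OPERATOR" -configure -settings:"C:\Users\Administrator\Desktop\Manufacturing\ClientUserSettings.config"
```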
When starting the client with the -profile parameter, it is the permissions of the user running the shortcut that count. It is possible to run the shortcut as another user by right-clicking the shortcut with the Shift key pressed and selecting “Run as a different user”.
In Dynamics NAV 2013, the user running the shortcut must be the owner of the profile in order to start a profile in configuration mode.
This has also been the usual way for me to open profiles, but it becomes a bit tedious to have to create shortcuts every time it is necessary to configure a new profile.
On one of my courses in Jutland, a creative student (I’m sorry I don’t remember your name) provided a simple solution for this:
Why not open the profile directly from the profile list?
So he designed this simple solution:
In the Development Environment, go to the Profile List page (9171) and perform the following:
1) Design the page
2) Click View/C/AL Globals
3) Add a new function called OpenClient
4) Click Locals and create one parameter
5) And one local variable
The full Subtype must be (can be copy/pasted):
System.Diagnostics.Process.'System, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'
6) Now Go back to the function and press F9 (Design) and type the following:
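The function body from the screenshot is not reproduced here. A sketch of what it could contain, assuming the parameter from step 4 is a Boolean named Configure and the local variable from step 5 is the DotNet Process variable with the Subtype shown above:

```
PROCEDURE OpenClient(Configure : Boolean)
VAR
  Process : DotNet Process;  // Subtype: System.Diagnostics.Process, as above
  Arguments : Text;
BEGIN
  // "Profile ID" is the field on the Profile record the page is based on
  Arguments := STRSUBSTNO('-profile:"%1"',"Profile ID");
  IF Configure THEN
    Arguments := Arguments + ' -configure';
  // Use the full path to the executable if it cannot be resolved otherwise
  Process.Start('Microsoft.Dynamics.Nav.Client.exe',Arguments);
END;
```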
7) Now it is necessary to create two new actions on the Profile List:
a. Go back to the Profile List in design mode:
b. Click View/Page Actions:
c. Add two new actions:
d. On the “Open Profile” action, press F9 and type:
e. Go back to the actions and switch to the “Configure Profile” action press F9 and type:
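The code behind the two actions is then just a call to the function, with the assumed Boolean parameter deciding whether to add -configure:

```
// Open Profile - OnAction()
OpenClient(FALSE);

// Configure Profile - OnAction()
OpenClient(TRUE);
```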
With a little documentation, it could look like this:
8) Press Esc until prompted to save the object
From now on, all you need to do to open or configure a profile is to open the Profile List, find the DISPATCHER profile and click Open Profile:
…and the DISPATCHER profile opens:
…or to open it in configuration mode:
I used the Image property WorkCenter for the Open Profile action and the Image property Setup for the Configure Profile action.
Believe me – Your life will not be the same again.
Have I been awfully quiet lately?
That might be because I have been preparing a one-day pre-conference workshop I am going to conduct on this year’s NAV TechDays in Antwerp, Belgium.
You can read more about it on the NAV TechDays webpage.
The topic is:
RoleTailoring Microsoft Dynamics NAV
Strangely enough, that is also the title of my new book. It is almost done and will be available in your favorite web-shop this summer. I will release a free preview version in November.
The content of the book and the workshop will be something like this:
You can sign up for the pre-conference workshop at NAV TechDays already in May. Remember to keep an eye out for the new book. You can start with the free preview, which will keep you going until the final book comes out.
Going Fishing in the MiniApp
As I described in my previous blog post, it is possible to reuse functionality from the MiniApp in the “real” Dynamics NAV.
Here is another example:
In Page 1302 Mini Item Card, there is a sweet little function to save an existing item as a template:
Ok - Go steal
First, go to the development environment and design page 1302 Mini Item Card.
Go to View/Page Actions:
Copy – Exit and do not save if asked.
Then into page 30 Item Card and into page actions:
Paste it next to the Apply Template action – document the change – save and exit.
Now open the item card on an existing item:
Save as template:
Make a new item:
Enter – and apply a description
It’s Sooooo Easy – 3½ minutes including documentation.
The function exists on the Mini Customer Card and the Mini Vendor Card as well.
Microsoft launched the MiniApp with Dynamics NAV 2013 R2.
The MiniApp was not advertised widely in the Dynamics NAV community, mostly because the MiniApp was the essential part of the Microsoft Dynamics C5 2014 rollout. Dynamics C5 is a local Danish product and therefore not very interesting for NAV people in the rest of the world. Later, the MiniApp has been the topic of conference sessions around the world for its design patterns, but it is still not very well known.
The MiniApp consists of a separate set of objects, and there are many interesting functionalities to steal – oops, sorry – to be inspired by. One of the cool things is that the RoleCenter has been expanded with a page part showing financial data.
To see it, first find the SMALL BUSINESS profile and make it the default or assign it to yourself:
Then restart the RoleTailored Client to show the Small Business RoleCenter:
The interesting part is the Trial Balance in the lower right corner. Using F6 to change focus to the Trial Balance window, Ctrl+Alt+F1 can be used to find the object number of the page (some systems already use the Ctrl+Alt+F1 key; in that case, Ctrl+Windows+Alt+F1 can be used instead):
Bingo: page 1393.
Now, let us see if we can figure out how to fly this thing.
Here we have two interesting menu items:
Hmmmm, not enough, so let us try the Customize menu item:
Nothing here – time to dive into the code:
Following the Code into the MiniTrialBalanceMgt codeunit:
OK, so this is actually a page and a codeunit showing an Account Schedule on the RoleCenter. But which Account Schedule? There is no official page to change the MiniTrialBalanceSetup data, but that can soon be fixed:
So the Account Schedule is I_MINITRIAL. Let us check it out:
And the Periods:
So now we are in familiar waters. All we need now is to implement page 1393 on our own RoleCenter and make an Account Schedule that fits our needs. I am going to implement it on a copy of the President profile:
Assign a profile to the new object number and assign it to the president:
With a little imagination, there are dozens of uses for this little functionality:
- Key numbers for Management
- Department balance for the Department Head
- Sales reporting for the sales people
- Purchase reporting for the purchasers
The setup could be attached to the RoleCenter showing different Account Schedules for different roles or even users.
Looking at the Small Business RoleCenter, one could also consider using the Key Performance Indicators page part in the same way.
As I see it, the MiniApp can make a Maximum impact.
Many consultants have more than one version of Dynamics NAV installed. End users in a migration phase will also often have more than one NAV version installed. In that case, the “installed” version can be different from the executed version, and an error like “The client version does not match the server version” occurs.
In this case, it is because I am trying to run the Sessions list from the RoleTailored Client in Dynamics NAV 2013 R2, but the registered client is pointing to the newly installed Dynamics NAV 2015 version. This can be changed relatively simply, but even more importantly, it is possible to make a set of files to switch the desired RoleTailored Client back and forth between the different versions.
The change must be made in RegEdit, and the simplest way to make the file is to go to RegEdit and search for the key that needs changing:
Inside Regedit, locate the key:
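The key name is not shown above; the dynamicsnav:// protocol handler is normally registered here (an assumption based on a standard client installation):

```
HKEY_CLASSES_ROOT\dynamicsnav\Shell\Open\Command
```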
The content of the Key is something like this:
C:\Program Files (x86)\Microsoft Dynamics NAV\80\RoleTailored Client\Microsoft.Dynamics.Nav.Client.exe -protocolhandler "%1"
Now right-click on the command key and choose export:
Make a couple of copies of the files:
Now right-click each file and change the 80 to 71 or 70, depending on the version:
Windows Registry Editor Version 5.00
@="C:\\Program Files (x86)\\Microsoft Dynamics NAV\\71\\RoleTailored Client\\Microsoft.Dynamics.Nav.Client.exe -protocolhandler \"%1\""
Hereafter, all it takes to switch is to double-click a file and answer Yes to the warning:
Now debugging is possible in all installed versions after preparing the environment a little.
Dynamics NAV 2013 introduced a new debugger to replace the old and worn-out debugger in the Classic client. The classic debugger in Dynamics NAV 2009 did not support the RoleTailored Client, and that caused a lot of swearing and a lot of workarounds.
The new debugger has some new functionality that has been on the wish list for many years:
· Watching variables at all levels of the call stack
· Debugging NAS services
· Debugging client sessions
· Avoiding the annoying Caption Class translation in Codeunit 1
· Creating conditional breakpoints
· Everything is coded in C/AL, which means that it is possible to change the existing functionality and to add extra functionality.
Enabling/Disabling the Debugger
First of all, the debugger is no longer connected to the Classic client or the development environment. It is connected to a session. A session can be:
· RoleTailored Clients
· Web Clients
· Tablet Clients (Only Dynamics NAV 2015)
· NAS Services
· Web Services
· OData Services
This means that it is possible to debug any of the above. In the settings of each session, it is also possible to prevent debugging of a session.
Two settings seem to affect the debugger:
Debugging Allowed: If the field is not selected, it is not possible to debug the session.
Enable Debugging: When the client first connects, all C# files for the application are generated. It therefore has nothing to do with the debugger we are going to work with.
To prevent debugging in the live environment, it is recommended to keep Debugging Allowed cleared in all live sessions and to create special sessions for debugging purposes.
Starting the Debugger
There are three ways to start the debugger:
Start from Development Environment:
The menu item Tools/Debugger/Debug Session will bring up a list of the active instances/sessions available to debug. It will only show instances/sessions connected to the database that has been opened in the development environment.
It is possible to “hard code” which instance the development environment is connected to by going to Tools/Options and typing in a different server/instance name:
The session list will show all the different sessions and client types currently connected to the instance.
Start from the RoleTailored Client:
In the RoleTailored Client, it is first necessary to connect to the instance that needs debugging. Once in, go to the Sessions menu item located at Departments/Administration/IT Administration/General/Sessions (or cheat by pressing Ctrl+F3 and typing “sessions”).
Start from the Run command:
Lastly, it is possible to start from the run command:
Notice that it is not possible to execute the command from the search field in Windows 7; it must be the Run command.
The sessions list
From the sessions list it is possible to activate the debugger depending on what to debug:
An error is occurring in the selected session, which by the way could be operated by another user.
A breakpoint has been set in the code in the development environment or previously in the debugger, and the debugger will stop on the breakpoint. There are also other breakpoint options but starting up it is only possible to set the breakpoint in the code in the development environment.
Stop on the first error/breakpoint in ANY of the active sessions. This is a little dangerous, since any error made by any unsuspecting user will trigger the debugger, and for that user the session will suddenly “hang” while you take over it. Later we will go through some of the uses of the Debug Next function.
Start Full SQL Tracing:
This is used when tracing performance in the SQL Profiler. That could be the topic of another blog post.
Setting the breakpoint in the Development Environment
From the Development Environment it is possible to set a breakpoint somewhere in the code. This, however, is only possible if the license includes Solution Developer. Just go to the object that needs debugging, find a place in the code and set a breakpoint by pressing F9. There is also a menu item: Tools/Debugging/Toggle Breakpoint.
The first time F9 is pressed, a red dot will appear next to the line:
Pressing it twice leaves the mark as an inactive breakpoint, just to keep track of where the breakpoint used to be, but the debugger will not stop here.
How to activate the debugger if the license does not include Solution Developer
It is still possible to activate the debugger without a license for Solution Developer; it is just a little more complicated. When the debugger is activated, the debug page is shown; however, no breakpoint has been set, and the debugger will therefore not be triggered unless an error occurs. If, however, the Break button is pushed right after the debugger is activated, the debugger will stop at the next code that is executed.
This is also the way to stop the code in the middle of a user action, for example when the warning for exceeded credit limit and overdue payments is shown:
Click Break, click Yes in the warning – and “Got you”:
Now the debugger has stopped right after the Credit limit warning.
The Debugging Window
The debugging window looks more or less like the old one:
The page consists of three sections:
· The Code page
· The Watch FactBox
· And the Call Stack FactBox
In the code page, it is possible to see the code around the breakpoint and the actual position of the cursor. What is new is that it is possible to see the value of a variable just by hovering the mouse over it:
If there is more than one instance of the variable, or if it exists in different scopes, all instances will be shown.
The little + and the glasses enable you to add the variable to the Watch List FactBox. After clicking the glasses, the variable appears in the Watch List until it is removed again. If something else is debugged, it will be shown as out of scope.
Now the value of Posting Date is visible to me throughout the debug session.
The Call Stack FactBox shows the path which the system has followed to end up at the breakpoint.
Clicking one of the lines below will show the function that the system executed to get to the present position. The green arrow in the code page indicates that the position is not the actual one. Combined with a third window, the Variables window, it is possible to see the state and values of all variables at the time of entering the function.
A second way to add a variable to the Watch List is to click the variable in the Variables window and then the glasses in the upper left corner.
A new functionality is the conditional breakpoint. In the “Old Days” a little code snippet was inserted in the code:
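The snippet itself is not shown above, but it typically looked something like this (the condition is purely illustrative):

```
IF "Document No." = 'SO1001' THEN
  MESSAGE('Hi');
```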
Then the breakpoint would be set on the MESSAGE('Hi') line.
This is no longer necessary.
The conditional breakpoint can be created like an ordinary breakpoint. It will be shown as a breakpoint with a + inside:
Most simple data types can be used in conditional breakpoints; exceptions are date, time and datetime fields. The only catch is that the conditional breakpoint can only be set in the debugger window. Therefore, a normal breakpoint is usually set first to activate the debugger, and then the conditional breakpoint can be set to replace the normal breakpoint.
Disable All Breakpoints
It is easy to get carried away and set lots of breakpoints at many different levels. Therefore, it is also good to be able to remove some or all of them. The Disable All function will disable all breakpoints, including conditional breakpoints. A disabled breakpoint can be enabled again by pressing F9 on the breakpoint.
Another function is the Breakpoints function, which will open a new window with a list of all breakpoints:
Here the state of each breakpoint can be seen, and breakpoints can be enabled, disabled or deleted.
In the Break Rules window a number of great new properties appear:
If the debugger should not stop on errors, the check mark can be removed here. Now the debugger will only stop if a breakpoint has been set.
A check mark in the Break On Record Changes field will cause the debugger to stop the first time a record changes.
Lastly, the debugger is set by default to skip all code in codeunit 1. This prevents the problems from previous versions with circular debugging due to the caption class functionality. If the debugging should include codeunit 1, the check mark must be removed here.
Starting/Stepping/Stopping the execution of the code
The functions to control the debugger are almost the same as in previous versions, but with a number of welcome additions:
Step Into (F11)
Executes the next line of code. If the line includes one or more functions, the debugger will step into the function, showing and debugging the code in the function.
If there is more than one function in the line, the functions will be executed according to the precedence rules.
Step Over (F10)
Executes the next line of code. If the line includes one or more functions, the debugger will execute the code in the function as normal but remain on the line.
Step Out (Shift+F11)
If the debugger stepped into the function by mistake, or it is decided that there is no code in the function relevant to the problem being debugged, the Step Out button will execute the remaining code in the function as normal and then return to the line that originally called the function.
Continue
This will quit the debug session and execute the remaining code as normal. All breakpoints will remain, and the debugger window will remain open, ready for the next debugging.
Break
No matter where the cursor is, the debugger will stop at the next line of code that is executed.
Stop
Quits the debug session, fires an error and performs a full rollback. All breakpoints will remain, and the debugger window will remain open, ready for the next debugging. This is particularly important if the test scenario is difficult to recreate, e.g. the first time postings are made to a new item. By stopping the debugger, the test data will remain intact for future debugging.
Show Last Error
Will show the error that triggered the debugger.
Debugging Web services
Debugging web services is a little trickier because of the relatively short time the web service session exists. When the session only exists for a few seconds, it is not possible to click Debug on the session in the Sessions window. In this case, a separate instance just for web services and the Run command will come in handy:
dynamicsnav://localhost/Webservice80/CRONUS Danmark A/S/runpage?page=9506
Now the session window will start up empty:
Now press Debug Next, and the debugging window will start up as normal. Next, it is possible either to press Break or to wait for the debugger to break on a breakpoint previously set in the code the web service is supposed to execute. If the web services instance is the same as the instance for the NAS and the Windows Client, it is not recommended to press Break, since any code executed by any client will trigger the debugger. In that case, a breakpoint in the web service code is the only option. It is also recommended to disable break on errors, since any error made by any client will also trigger the debugger.
The best solution though, is to create a separate instance only for web services.
Debugging NAS services
In previous versions, it was a little difficult to debug the NAS. From version 2013 onwards, all it takes is to log on to the NAS service instance with the Windows client (or to run the Sessions window with the NAS instance name as described previously), start the debugger and use the same procedure as described above.
We all know the situation: A customer needs a copy of the live environment to run a number of tests, or it could be that a new test/development environment must be made periodically. Often it is necessary to update the test company with the latest data. Therefore, it is necessary to create a copy of the live database and restore it with a different name. Having done that, it must be prepared so that data from the test company cannot be confused with data from the live company.
The preparation can involve quite a number of things:
The company must be renamed in Companies.
Rename the Company Name
The company name must also be changed in Company Information so that all sender addresses on external documents clearly reflect that they originate from a test company.
Change the System Indicator
I also change the System Indicator to tell the users that they are in a test environment: the text shown in the corner clearly displays the word TEST along with the date the test company was based on. This will help the users when they later run tests in the system.
Giving this signature in the right corner of each page in the Windows client:
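The System Indicator change can be sketched in C/AL; the field and option names below follow the NAV 2013 R2 Company Information table (table 79) and should be verified against your version:

```cal
// Hedged sketch: mark the company as TEST via the System Indicator.
// "Custom System Indicator Text" and the style value are assumptions
// based on the NAV 2013 R2 Company Information table.
CompanyInformation.GET;
CompanyInformation."System Indicator" :=
  CompanyInformation."System Indicator"::"Custom Information";
CompanyInformation."Custom System Indicator Text" :=
  STRSUBSTNO('TEST - data per %1',WORKDATE);
CompanyInformation."System Indicator Style" :=
  CompanyInformation."System Indicator Style"::Accent4;
CompanyInformation.MODIFY;
```

The style value is a matter of taste; the point is that the colored corner and the text make the test environment impossible to miss.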
It is also necessary to redirect the printers so that picking lists from the test system are not suddenly printed in the warehouse, or production orders in the production department. Even worse, external documents like purchase orders, sales quotes or orders could accidentally be printed and sent.
Redirect Freight integration
If integration with the shipping company is automated, then it is advisable to make sure that it is disabled in the test company. Otherwise, you will face some very angry truck drivers.
Stop Job Queues
There is no need to keep all the job queue entries in the Job Queues running, such as automatic posting of invoices or printing of invoices. So stop all the jobs that would cause problems in a test environment.
Stop EDI flow
If the company communicates with external partners with EDI or the like, it is necessary to redirect all documents to a test folder.
Prevent Electronic Invoices
Many companies send invoices electronically. The paths used for sending and receiving invoices and credit memos must be changed so that nothing is sent or received by mistake.
Stop Electronic Payments
Payments sent and received are usually initiated manually, so the chances for mistakes are few. However, it is best practice to change all paths so the payments are not imported or exported to the bank by mistake.
Redirect Captured Documents
Capturing purchase invoices is common in companies, and the paths used for the captured documents must be changed.
Stop or redirect Inter-Company Flow
If the company is part of a larger organization with an automated IC flow, it is very often desirable to have two IC flows, one for the live transactions and one for the test environment; otherwise it can be difficult to test the Inter-Company functionality. In any case, the functionality must be redirected so that the flow either stops or runs against the test environment instead.
Stop or redirect Master Data Replication
If the company is part of a larger organization with an automated Master Data Replication flow, it is, just like the IC flow, desirable to have two replication flows, one for the live replications and one for the test environment. The functionality must be redirected so that the flow either stops or runs against the test environment instead.
Automate the tasks
These are only some of the changes that need to be made. Others could be integrations with production facilities, machines, BI databases or CRM systems. The list can be endless.
The customers will not be happy if they have to call a consultant to perform all these tasks every time they need a new test environment, not to mention run the actual backup/restore, and even worse, remember all the different tasks that need to be done after the restore. Furthermore, the number of databases that must be copied in a large organization can make the task so complex that errors will eventually happen.
Therefore, I have automated the creation of the test environment for my customers. It comes in two or three parts, depending on which Dynamics NAV version the customer runs:
1) The backup of the live database and restore to the test environment
This is not covered here in detail, but it can be performed in many ways:
- Manually from SSMS
- Automatically with a SQL job
- Automatically with PowerShell (SQL Server 2012 & SQL Server 2014)
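As a sketch of the SSMS/SQL-job variant, the backup and restore could look like this; database names, logical file names and paths are examples only:

```sql
-- Hedged sketch of the backup/restore step. Adjust names and paths
-- to your own environment before use.
BACKUP DATABASE [NAV_LIVE] TO DISK = N'D:\Backup\NAV_LIVE.bak' WITH INIT;

RESTORE DATABASE [NAV_TEST] FROM DISK = N'D:\Backup\NAV_LIVE.bak'
  WITH MOVE N'NAV_LIVE_Data' TO N'D:\Data\NAV_TEST.mdf',
       MOVE N'NAV_LIVE_Log'  TO N'D:\Data\NAV_TEST.ldf',
       REPLACE;
```

The MOVE clauses are what let the restored copy live side by side with the live database on the same server.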
2) The company must be renamed in the test database. If the customer runs Dynamics NAV 2009 or 2013, this must be done manually, but in Dynamics NAV 2013 R2 or 2015 there is a PowerShell cmdlet for exactly that
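The rename cmdlet for NAV 2013 R2/2015 can be used like this; instance and company names are examples, and the module path depends on your installation:

```powershell
# Load the NAV administration cmdlets (path is an example).
Import-Module 'C:\Program Files\Microsoft Dynamics NAV\80\Service\NavAdminTool.ps1'

# Rename the restored company so it is clearly a test company.
Rename-NAVCompany -ServerInstance NAV_TEST `
    -CompanyName 'CRONUS Danmark A/S' `
    -NewCompanyName 'CRONUS Danmark A/S TEST'
```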
3) All the rest must be performed inside the company in NAV. Build a codeunit to handle all the changes. If the customer runs Dynamics NAV 2009 or 2013, the codeunit must be run manually, but in Dynamics NAV 2013 R2 or 2015 there is also a PowerShell cmdlet to execute it
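The cmdlet for running the codeunit could be used like this; codeunit 50000 is a placeholder for your own preparation codeunit, and the instance and company names are examples:

```powershell
# Run the preparation codeunit in the test company.
Invoke-NAVCodeunit -ServerInstance NAV_TEST `
    -CompanyName 'CRONUS Danmark A/S TEST' -CodeunitId 50000
```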
If the test company is in the same database as the live company, a simple PowerShell script can be run to ensure that everything is done properly. This works in Dynamics NAV 2013 R2 and 2015.
This is the script for real men; it removes the company NO QUESTIONS ASKED. A gentler version could omit the -Force parameter on the Remove-NAVCompany command, which will politely ask before removing the company. Of course, this could be done manually, but if we face 30 or 50 companies, the script is THE solution.
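A minimal sketch of such a script, assuming the NAV 2013 R2/2015 administration cmdlets are already loaded; instance, company and codeunit names are examples:

```powershell
# Remove the old test company - NO QUESTIONS ASKED because of -Force.
Remove-NAVCompany -ServerInstance NAV80 -CompanyName 'CRONUS TEST' -Force

# Copy the live company to a fresh test company in the same database.
Copy-NAVCompany -ServerInstance NAV80 `
    -SourceCompanyName 'CRONUS Danmark A/S' `
    -DestinationCompanyName 'CRONUS TEST'

# Run the preparation codeunit (50000 is a placeholder).
Invoke-NAVCodeunit -ServerInstance NAV80 -CompanyName 'CRONUS TEST' -CodeunitId 50000
```

Wrap the three lines in a loop over company names, and 30 or 50 companies are no harder than one.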
Creating the codeunit usually involves a lot of hard-coding. Here is an example:
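A hedged C/AL sketch of what such a codeunit might contain; the real one is customer specific, and the table and field names below are from the standard application but should be verified in your version:

```cal
// Hedged sketch of a customer-specific test-preparation codeunit.
OnRun()
BEGIN
  // Mark the company name as a test company.
  CompanyInformation.GET;
  CompanyInformation.Name :=
    COPYSTR(CompanyInformation.Name + ' TEST',1,MAXSTRLEN(CompanyInformation.Name));
  CompanyInformation.MODIFY;

  // Put all job queue entries on hold.
  JobQueueEntry.MODIFYALL(Status,JobQueueEntry.Status::"On Hold");

  // Redirecting printers, EDI folders, electronic invoice paths etc.
  // would follow here, all hard-coded for the specific customer.
END;
```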
“Why can’t I post charges directly to a production order without performing special operations and creating the vendor as a subcontractor on a work center?” is a question I often hear.
The answer is: You can do that, it just takes a little preparation and design access to the tables.
On a purchase order, it is possible to see which production order number and production order line generated the purchase order. This is normally used for subcontracting, and the information is set automatically when the purchase order is created from the subcontractor worksheet.
However, sometimes charges, e.g. freight, are billed on a separate invoice from a vendor different from the actual subcontractor. In that case, it is convenient to be able to post a purchase order that is charged directly to the production order, because Item Charges cannot be used on purchase orders from subcontractors.
The three fields on the purchase order line defining the subcontractor are the Production Order No., Production Order Line No. and Work Center No.
By default, the three fields are not editable in the page.
To enable it, perform the following simple change:
1) Go to the development environment
2) Open the database, click the Table menu item
3) Navigate to table number 39
4) Locate the three fields numbered 5401, 99000752, and 99000754
5) On each field go to the menu item View > Properties
6) Find the property Editable and change it to “Yes”
7) Save and exit.
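The steps above amount to this property change on each of the three fields; the fragment below is only an illustration of the resulting setting:

```cal
// Table 39 "Purchase Line" - set on each of the three fields
// (5401, 99000752 and 99000754), as stated in the steps above:
Editable = Yes
```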
Other fields are also available on the purchase order line and can be included, but the three mentioned are the only ones necessary.
Now the three fields are editable on the purchase order line. Then I need to create one new work center for charges. The vendor chosen for the work center can be any random vendor; the field just has to be filled. An existing work center can also be used, as long as it is a subcontractor.
Now I am ready to post charges directly to production orders.
The item number used on the purchase order must be the produced item from the production order.
So I create a purchase order with the vendor number from my freight vendor, the item number from the produced item and the new work center.
Amazing how versatile London Postmaster is.
Then I fill these three fields:
Post the purchase order and Navigate tells the story:
No Item Ledger Entries, only Value Entries. No items will be received. The cost is treated as a service charged directly to the production Work in Progress (WIP).
From the production order, it looks like this:
In Finance it looks like this after the Inventory batch has run:
Here the cost ends up on account: 2140, which is the WIP account.
No sorcery, just sleight of hand (Keine Hexerei, nur Behändigkeit).