As you might know, security-related topics are among my favorites. It is not only about the technical side of configuring roles; it is also about topics like segregation of duties to prevent possible fraud. It is always fun to work together with a customer on the best possible setup that is still manageable for the application support department. One of the tasks is moving new security configurations between environments.
In this post I will tell you how to do this, plus a neat trick for renaming newly created security objects, as the Security configuration form does not support this.
There are several ways to make changes related to security:
There is a slight difference between Dynamics 365 and Dynamics AX 2012 when it comes to managing security. In AX 2012, all security changes ended up as security objects in the Application Object Tree (AOT). So, using one of several code deployment options, the changes were applied to e.g. the production environment. Development changes in Dynamics 365 need to be treated similarly: using a deployable package, you can move the objects to a live system.
New in Dynamics 365 is configuration at runtime. Changes are kept in the database, and when you publish the security changes, objects are changed at runtime only. There is no link with the development environment. This does not mean that you should change the security configuration on your production environment directly; changes need to be tested first. As there might be a lot of changes requiring several test cycles and reconfigurations, the logging will be dirty and hard to read if there are too many of them. So, like other development work, it is recommended to start in a dev/test environment. The production environment will then only get a cumulative, approved security configuration which is logged only once.
When you have security configuration changes that need to be moved to another environment, you can simply use the action buttons on the Security configuration form. I will show the steps in detail.
To start with, I created a new security role: the Sales Assistant. This person is able to view customer details and is allowed to maintain sales orders. When this role is tested and working as expected, we want to move it to another environment. To do so, click Data > Export. The security changes will be exported in XML format, which can be downloaded. You can store the configuration file in your own library.
Now you can open another environment where you want to import the security configuration. Open the Security configuration form, then click Data > Import. You will be prompted to specify a file name; provide the file you just saved and click OK.
The file will be processed and the changes will be visible on the form. On the Unpublished objects tab page, you will find all modifications.
In this case, I used standard duties. If there were more changes, you would see more objects and object types. Note that exporting the security configuration will export all configurations, not only the one you are focusing on. To make the imported configuration active, you have to Publish all, or select one or more records and Publish selected.
When you have made a mistake in typing the friendly name of a new security role, duty or privilege, one option is to create a new object with the correct name or spelling. However, if you are live, the references are based on internal IDs, and there is no option within the Security configuration form to change the name once the object has been created.
There is one simple trick which I tried once, and it works like a charm.
As described above, you can export the security settings and save the file on your computer. Then use an (XML) editor to open the file. If you don't have such software, you can even use Notepad.
Note that the name of the role is a GUID. This ID is the primary key that lets the import know whether a new role should be created or an existing one modified. All security objects configured at runtime have such a GUID, whereas objects created in Visual Studio have a readable object name.
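If you prefer scripting over hand-editing, the rename can be sketched with Python's standard library. Note that the element and attribute names below (`Role`, `Identifier`, `Name`) are assumptions for illustration only; inspect your own exported file to find the actual structure. The key point is to change only the friendly name and leave the GUID untouched, so the import updates the existing object instead of creating a new one.

```python
import xml.etree.ElementTree as ET

def rename_security_object(xml_text, guid, new_name):
    """Change the friendly name of the security object identified by `guid`.

    The tag/attribute names used here are illustrative, not the real
    export schema; adapt them to what your own export file contains.
    """
    root = ET.fromstring(xml_text)
    for elem in root.iter():
        # The GUID stays as-is: it is the key the import matches on.
        if elem.get("Identifier") == guid:
            name = elem.find("Name")
            if name is not None:
                name.text = new_name
    return ET.tostring(root, encoding="unicode")
```

After saving the corrected file, the normal import and publish steps apply.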
When you have saved the file, you can import it and publish the changes to correct the name(s).
That’s all for now. Till next time!
The post How to: Move security configurations across Dynamics 365 environments appeared first on Kaya Consulting.
The Microsoft team released platform update 13 for Microsoft Dynamics 365 for Finance and Operations, Enterprise edition. One of the new features is custom fields. This feature lets a power user or system administrator configure new fields on existing tables without using the development environment, and deploy them to other environments. It is a neat feature which can be of interest, but you also have to be aware of all the pitfalls and downsides. In this post, I will tell you more about custom fields.
When you need some additional details on a customer, vendor or worker, usually you would ask a developer to create fields in the tables and add them to the form using extensions. A deployable package is then used to move the changes to your production environment. With the new feature in platform update 13, you can now configure the fields instead. Microsoft added this feature to provide an easy and cost-effective way to add simple details to existing tables and use them within Dynamics 365 for Finance and Operations. Using the configuration option, you could save on development, testing and code life cycle management.
The process is very easy. Of course, not all users can do it themselves: you have to be a system administrator or have the role Runtime customization power user assigned. It is described on the following documentation page: Custom fields
The next screenshots show some of the steps to create a new field on the customer table which represents a legacy/old customer number. As mentioned in the documentation, start with personalizing the form:
There are options to define your custom field:
Once you have completed the steps and reloaded the form, you can maintain the new field:
When configuring fields, you cannot control things like indexes or add code to validate or automate entry. These options are available when a developer creates a field in the metadata. When a field is created in the metadata and deployed to all environments, it is also possible to include it in new or customized reports. A configured field, by contrast, is not visible in the development environment for reporting. It is possible to add a configured field to data entities at runtime.
Within an implementation, you have to consider carefully whether you want to configure a field, or need more control options and should let a developer create an extension. Fields in the application metadata are more sustainable. You should know the purpose of the field in advance.
Some important limitations of configured custom fields are:
It is possible to include the newly configured fields on existing entities using the Custom fields maintenance form. When enabled, you can use the field in the Data management features and in Open in Excel. Using data management, you could then use reporting tools like Power BI to expose your custom field.
When you do configure a custom field, Dynamics 365 will create runtime table extensions and also a physical field on the table in the database.
There are some limitations defined by Microsoft. These constraints were added to prevent issues like performance degradation. The next rules are valid for tables (taken from the documentation):
When creating a picklist field, a list of possible values is maintained in a table which is used as a lookup when entering data.
I have tested the custom fields on a table of our own add-on which was also working correctly. Technically, Microsoft did a very good job here.
There is also a maximum of 20 fields that can be added to a single table using the configurable custom fields option. When creating fields, be careful with lengthy text fields: too many long text fields will grow the record size, which could lead to performance penalties. It is a bit strange, though, that 20 fields are allowed, while the best practice check in Visual Studio gives a warning if you create more than 10 in one extension.
As stated, there are limitations, and you should not give every user access to create their own fields. I think configuring custom fields should be considered very carefully. Custom fields are presented to the user as a form personalization. If a system administrator pushes a new “form layout” to all users, earlier created personalizations will be overwritten. Also, a user could easily remove the field himself or reset the personalization without knowing he is losing the new field. So development may be more expensive, but it comes with better governance and fewer negative side effects. If you run into problems, troubleshooting also costs money…
As mentioned above, searching on non-indexed fields and having too many large text fields can cause performance issues.
A fellow MVP, Ievgen Miroshnikov, found another danger. A developer could add a table extension ending with _SysCustomFields and add a field with exactly the same name (in my example OldNumber_Custom), then deploy it to an environment where the custom field was created. If someone then assumes the custom field record is no longer required and deletes the field from the configuration, the system will not know the field was also included in a deployable package, and the field will be physically removed from the database. So, as a developer, never, ever use field names ending with ‘_Custom’ and don’t use table extension names ending with ‘_SysCustomFields’. Usually we would not use these names, but trying to “move” the configuration to development-based extensions could lead to exactly such an attempt. So, you are alerted now! Read below for a workaround…
As stated above, it is not possible to move the custom field one-to-one to Visual Studio. The best approach is to develop a completely new field as an extension (with a different naming convention) and also extend the standard data entity with the new field. Once these changes are deployed, you have an environment with two fields for the same purpose. Export the data using data management, then import it into the new field. Once you have checked that this completed successfully, you can delete the custom field.
It is not documented by Microsoft, but you can enable database logging on the custom fields. Just like any ordinary field, you can select the custom field in the list for updates logging.
If you want to learn more about this feature, I will host a MVP session for the AXUG on this topic on Monday, February 12 at 9 AM EST (3PM CET).
The post Custom fields: handle with care! appeared first on Kaya Consulting.
For several versions now, Microsoft Dynamics AX has had a feature to set up and use temporary vouchers for journal entry, and this feature still exists in Microsoft Dynamics 365 for Finance and Operations, Enterprise edition. In this blog post I will tell you about the temporary voucher number and, in particular, how you can benefit from this feature.
There are countries/regions where it is mandatory to use continuous voucher series. And continuous should mean really continuous: even when you set up a number sequence to be continuous, it is still possible to get gaps in posted vouchers. A lot of accountants have complained about these gaps. A possible cause for them is the following situation.
Assume users A and B are entering general journals. Voucher GJV0050 was the last posted voucher number. User A creates lines in a journal using the next voucher numbers: GJV0051, GJV0052, GJV0053 and GJV0054.
User B also creates a journal with voucher numbers: GJV0055, GJV0056 and GJV0057.
User B checks his journal and posts it. User A decides to delete his rows, as he discovered the transactions were already posted before. Now we have a gap in the posted voucher numbers.
For some countries/regions this isn’t a problem, as it is allowed to reuse the ‘gaps’. Other countries have regulations that the numbers should not have any gaps per period, so using numbers GJV0051 through GJV0054 in a period later than the already posted numbers is not allowed.
To overcome this issue, you can work with temporary vouchers.
When using temporary vouchers, the journal lines get a voucher from the temporary voucher series. In the example above, the users would create the following lines:
User A: TMP0078, TMP0079, TMP0080 and TMP0081.
User B: TMP0082, TMP0083 and TMP0084.
During posting, the temporary voucher numbers are replaced with numbers from the correct voucher series for the journal. So, when user B posts his journal, the vouchers will be replaced with GJV0051, GJV0052 and GJV0053. For each temporary voucher number, a new voucher from the series is retrieved. As you will notice, there are no gaps anymore in the real voucher series.
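Conceptually, the replacement at posting is a simple mapping from temporary vouchers to final vouchers, assigned in posting order. The Python sketch below mirrors that idea only; the function name, prefix and number format are my own illustration, not the actual posting logic.

```python
from itertools import count

def replace_temp_vouchers(temp_vouchers, next_number, prefix="GJV", width=4):
    """Replace each unique temporary voucher, in posting order, with the
    next number from the final voucher series. Illustrative sketch only."""
    mapping = {}
    seq = count(next_number)
    result = []
    for temp in temp_vouchers:
        if temp not in mapping:
            # first time we see this temporary voucher: draw a final number
            mapping[temp] = f"{prefix}{next(seq):0{width}d}"
        result.append(mapping[temp])
    return result
```

Running this for user B's journal (TMP0082 through TMP0084) with the series standing at 51 yields GJV0051 through GJV0053, matching the example above; lines sharing a temporary voucher keep sharing the final one.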
There are two settings required to be able to work with temporary voucher numbers. The first setting is the Temporary voucher number sequence on the General Ledger parameters form.
The second setting is Number allocation at posting on the Journal names form. Per journal, you can decide whether to enable the use of temporary vouchers.
When you create journal lines, they will initially use the temporary voucher number, as you can see in the next screenshot.
During posting of the journal, it will replace the temporary numbers with the actual voucher numbers.
When you have to do a lot of journal line imports, you could use e.g. the Data Import Export Framework; custom import scripts can also be used. Either way, you have to take care of the correct voucher numbers, and for these scenarios the temporary voucher is also a great aid.
You can provide manually assigned numbers in your source file, which may in fact even conflict with the setup of the temporary voucher number sequence. Per journal, you can start over with voucher number 1 if you want to. Whatever you provide, during posting each unique provided voucher number will be replaced with a final voucher number retrieved from the journal setting.
If you imported lines manually and try to delete them because they were incorrect, there is a check whether the temporary voucher number matches the format of the temporary voucher sequence. If it doesn’t match, an error is raised.
This error sounds odd, as you only want to delete unposted journal lines. The reason is that, if you set up the temporary voucher number sequence to be continuous, the system should put the number back in the pending number sequence list, which is not possible if the format is different.
For this reason, it is probably best to avoid using a format for the temporary voucher series, so that the numbers are plain natural numbers like 1, 2, 3, etc.
There is one disadvantage to mention. If there is an error during posting, the message will refer to the replaced voucher number, but the error also causes the journal lines to revert to their old values, i.e. the temporary voucher numbers. For this reason, it can be hard to find the row(s) containing the error in larger journals.
The post What is the temporary voucher functionality in Dynamics 365? appeared first on Kaya Consulting.
This post will inform you about the entity execution parameters, which can be found on the Data import/export framework parameters form. They give you an option for gaining performance when importing large numbers of records using the data management features.
The entity execution parameters let you specify how to divide the workload when performing a data import using the batch framework. In Microsoft Dynamics AX 2012, we used to specify the number of tasks when creating the processing group. In the current product, it can be defined upfront as a parameter.
Use the next path to open the entity execution parameters: System administration > Workspaces > Data management; then click the Framework parameters tile. The Data management workspace can also be started from your default dashboard.
On the Framework parameters form, select the Entity settings tab page and click the Configure entity execution parameters button to open the next form.
As you can see in the picture above, you can specify, per entity, a threshold at which the tasks need to be split and how many tasks will then be created. You can also make variations for the same entity; for example, 4 tasks for smaller and 8 tasks for larger files.
When you specify these parameters with a task count, the framework will create multiple threads among which the workload is divided and run in parallel. From my experience, this can make a huge difference in import performance.
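As a rough illustration of what such a configuration does, the sketch below picks a task count from threshold rules and splits a record count over those tasks. The function name and rule shape are my own assumptions; the actual batch framework logic is of course more involved.

```python
def plan_import_tasks(record_count, rules):
    """Choose a task count from (threshold, task_count) rules and split the
    records as evenly as possible. Illustrative sketch, not framework code."""
    tasks = 1
    # apply the rule with the highest threshold the record count reaches
    for threshold, task_count in sorted(rules):
        if record_count >= threshold:
            tasks = task_count
    base, extra = divmod(record_count, tasks)
    return [base + 1 if i < extra else base for i in range(tasks)]
```

With the "4 tasks for smaller, 8 tasks for larger files" setup from the text, a 10,000-record file would be split into 4 roughly equal chunks, while a file above the higher threshold would get 8.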
To actually use these settings, you have to run the import using the batch framework. The Import button on the Import project form starts a single thread on the server; use the Import in batch button on the Import options menu instead.
Then complete the parameters on the batch slider to have the batch scheduled. When the batch job is executing, you can monitor how many tasks were created on the View tasks form. From the Execution details, there is a link to the batch job, and on that form you can browse deeper into the tasks for this job.
Some data entities, like Fixed assets, were not built to support the use of multiple tasks. When you try to set up these entities, you get an error like the one shown in the screenshot below.
The post Speed up data import with (data) entity execution parameters appeared first on Kaya Consulting.
It’s been a while since I posted a blog post. I have been extremely busy and barely had time to sit down and finish some posts. During my work I collected some experiences and made some drafts, so I will certainly be publishing more blog posts. One of the things that took up my time was testing platform update 11 before it could be released. This platform update was released on October 6, 2017 and mainly contains technical improvements; the majority are related to supporting more options for extending the application to prevent overlayering. One new item is a functional improvement: a document count indicator.
In the earlier days of Microsoft Dynamics AX, there was a small indication of whether a record contained attachments (notes, files, etcetera): the document button was highlighted or appeared slightly sunken. This pattern didn’t fit the user experience of Dynamics 365 and was not implemented. Now, in the latest platform update 11, there is a new document count indicator which shows the presence of documents.
When you now open e.g. the vendor list page, you can see, per record, the number of documents attached to it.
The document count is a good addition, in my opinion. Now you can see not only that there are attachments, but also how many. Somehow, this experience makes me think of a web shop basket :). When you add a new attachment, the count increases accordingly, showing the new total:
When there are more than 9 documents, the count will show only ‘9+’, as e.g. ‘325’ would consume too much space. Of course that makes sense, but hovering over the button will still show ‘9+’, even though there is enough space there to display the real count. Personally, I think the real count would give this feature more value. Still, I’m confident this feature will satisfy the needs of many customers.
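The capping rule itself is trivial to express. A one-liner like the following (the function name is mine, purely illustrative) captures what the indicator does:

```python
def attachment_badge(count, cap=9):
    """Show the real count up to `cap`, then cap it off as e.g. '9+'."""
    return str(count) if count <= cap else f"{cap}+"
```

So 3 attachments render as the plain number, while 325 collapses to the capped form.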
When you want to learn more about platform update 11, you can visit the page What’s new or changed in Dynamics 365 for Finance and Operations, Enterprise edition platform update 11 (October 2017).
The post New document count indicator in platform update 11 appeared first on Kaya Consulting.