My last post was about how I got the customized data out of the tenant database into Xml files. That tenant database was from a NAV 2016 application.
I have updated the tenant database to Business Central and I need to bring in some of the data from these Xml files.
My first issue was making these Xml files available to Business Central. I have been using Azure Blob to store files for some years now. I had both AL and C/AL code that could connect to the Azure Blob REST API, but that code used DotNet variables, which are no longer an option.
I did some preparation last year, when I asked Microsoft to add some functionality to the BaseApp. Using that BaseApp functionality I was able to redo my Azure Blob AL code as a clean extension.
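To give an idea of what calling the Azure Blob REST API looks like in AL without any DotNet variables, here is a minimal sketch using the built-in HttpClient type. The codeunit number, procedure name, and the SAS token approach are my assumptions for illustration only; the actual extension uses the BaseApp helpers mentioned above.

```al
// Minimal sketch: list the blobs in a container using the built-in AL
// HttpClient type (no DotNet required). The object number and the SAS
// token approach are illustrative assumptions, not the actual app code.
codeunit 50100 "Blob List Sketch"
{
    procedure ListBlobs(AccountName: Text; ContainerName: Text; SasToken: Text): Text
    var
        Client: HttpClient;
        Response: HttpResponseMessage;
        ResponseText: Text;
        Url: Text;
    begin
        // List Blobs operation of the Azure Blob service REST API
        Url := StrSubstNo('https://%1.blob.core.windows.net/%2?restype=container&comp=list&%3',
            AccountName, ContainerName, SasToken);
        if not Client.Get(Url, Response) then
            Error('Unable to reach the storage account.');
        if not Response.IsSuccessStatusCode() then
            Error('Azure Blob request failed with status %1.', Response.HttpStatusCode());
        Response.Content.ReadAs(ResponseText);
        exit(ResponseText); // Xml document describing the blobs in the container
    end;
}
```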
I also wanted to put the AL code somewhere in a public place for everyone to see. And GitHub is the default code storage place. I created a project for Business Central AL.
I am hoping that this place can be the place where code examples for our Business Central community are shared and maintained. If you want to contribute then I can add you to this project, or I can approve your pull request.
I need to write another blog post about that Azure Blob app and the other repositories I have created there. Hope to find time soon.
There is another repository in this project for the Import Tenant Data App. This app has an Azure Blob Connect functionality to utilize the Azure Blob app for data import.
I start by opening the Import Data Source page.
Here I find the Azure Blob Connector that self-registered in the Import Data Source table.
I need to go to Process -> Setup to configure my Azure Blob container access.
The information required can be found in the Azure Portal.
Specify the container where you have uploaded all the Xml files.
Then I searched for Import Project List and created a new import project for the General Ledger. The Import Source for Azure Blob was automatically selected, since that is the only one available.
Now it is time to import the related Xml files into this project.
I get a list of files from the Azure Blob and select the one I need.
The file list will open again if I have more files to import. Close the file list when finished. Back on the Import Project we should now see information from the Xml file.
For each file I need to configure the destination mapping.
If the table exists in my Business Central App then it will be automatically selected.
And I can map fields from the Xml file to the Business Central Table.
There are options to handle different data structures. One is that we can add a transformation rule directly to each field. The other one is using our own custom data upgrade app that subscribes to the events published in this app.
Four events are published: two for each field in the mapping, and two before the database record is updated or inserted.
Based on the information in the publishers we can do any manual data modification required. In my example the creation time was added to each G/L Entry in NAV, but is added to the G/L Register in Business Central.
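A custom data upgrade app subscribing to one of these events could look something like the sketch below. The codeunit numbers, the publisher codeunit name, and the event signature are illustrative assumptions; check the GitHub repository for the actual publishers.

```al
// Sketch of a custom data upgrade app subscribing to one of the four
// published events. Codeunit numbers, the publisher name, and the event
// signature are illustrative assumptions; see the GitHub repository for
// the actual event publishers.
codeunit 50101 "G/L Import Data Upgrade"
{
    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Import Project Data", 'OnBeforeInsertRecord', '', false, false)]
    local procedure MoveCreationTime(var DestinationRecRef: RecordRef)
    begin
        // Example: the creation time was stored on each G/L Entry in NAV,
        // but belongs to the G/L Register in Business Central, so it must
        // be moved before the record is inserted.
        if DestinationRecRef.Number <> Database::"G/L Entry" then
            exit;
        // ...read the source value and update the related G/L Register...
    end;
}
```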
From the list of tables we are able to start the data transfer. First we need to make sure that we have the correct configuration for the import. Do we want to commit during the import? Do we want to create missing records in our database?
I select to commit after each 1000 records. If my data transfer stops, then I can resume from that position when I start the data transfer again.
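The batching behind that setting is the classic commit-counter pattern. This is a simplified sketch under my own assumptions, not the app's actual transfer code; the helper procedure is hypothetical.

```al
// Simplified sketch of committing after every 1000 records so a stopped
// transfer can resume from the last committed position. Object number,
// variable names, and the helper procedure are illustrative.
codeunit 50102 "Batched Transfer Sketch"
{
    procedure TransferRecords(var SourceRecRef: RecordRef)
    var
        CommitCounter: Integer;
    begin
        if SourceRecRef.FindSet() then
            repeat
                InsertDestinationRecord(SourceRecRef);
                CommitCounter += 1;
                if CommitCounter >= 1000 then begin
                    Commit(); // saves progress and releases locks
                    CommitCounter := 0;
                end;
            until SourceRecRef.Next() = 0;
    end;

    local procedure InsertDestinationRecord(var SourceRecRef: RecordRef)
    begin
        // hypothetical helper: map and insert one destination record
    end;
}
```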
We have the option to create a task in the job queue to handle the data transfer.
The job queue can handle multiple concurrent transfers, so the import should not take too much time. Looking into the Destination Mapping, we can see the status of the data import.
I will add a few more pictures to give you a better idea of what can be done with this Import Tenant Data app. The AL code is on GitHub for you to browse, improve, and fix.