Data lake Dynamics 365 Finance & Operations

 

D365FO Data lake with Azure Data Factory


If you are planning to implement the Azure Data Lake feature in D365FO and do not want to use Azure Synapse for any reason, this post will give you a quick start in implementing the solution.

The solution supports both full and incremental loads with multi-threading, using a modular approach.

In the future, you can reuse the same ADF solution in Azure Synapse.

In the following solution, we load the data from the Azure Data Lake into Azure SQL (physically) using Azure Data Factory. We also built a customization that generates the tables' constraints, so that we get the same performance in Azure SQL.





The following steps will help you incorporate the provided solution into your implementation.

The solution has 3 parts including its source code.

  • D365FO 
  • SQL Server 
  • Azure Data Factory 

D365FO


As a first step, you need to configure the Azure Data Lake integration with Dynamics 365 Finance & Operations (Link).

Once the data lake is configured successfully, import our customization package. You can download the file using this link and the additional link.


After a successful import, create a menu item for the [AVIBlobStorageAccount] form and place it in the navigation wherever it suits you.

As a prerequisite, navigate to that form and provide the following information.
  1. Data Lake storage account name.
  2. Account key for access.
  3. Enable the data lake checkbox.

Reference screenshot



All of the above information is used by our customization to generate and upload the files to the Azure Data Lake container.
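For illustration only, here is a minimal X++ sketch of how such an upload could be done with the Azure Storage SDK available in D365FO; the container name, file name, and method name are assumptions, and the actual class in the customization package may differ.

// Minimal sketch: upload a generated script file to the data lake container,
// using the storage account name and key captured on the AVIBlobStorageAccount form.
public void uploadToDataLake(str _accountName, str _accountKey, str _fileName, str _content)
{
    Microsoft.WindowsAzure.Storage.Auth.StorageCredentials credentials =
        new Microsoft.WindowsAzure.Storage.Auth.StorageCredentials(_accountName, _accountKey);

    Microsoft.WindowsAzure.Storage.CloudStorageAccount storageAccount =
        new Microsoft.WindowsAzure.Storage.CloudStorageAccount(credentials, true);

    Microsoft.WindowsAzure.Storage.Blob.CloudBlobClient blobClient =
        storageAccount.CreateCloudBlobClient();

    // 'sqlscript' is a placeholder container name; the customization writes into the SQLScript folder.
    Microsoft.WindowsAzure.Storage.Blob.CloudBlobContainer container =
        blobClient.GetContainerReference('sqlscript');

    Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob blob =
        container.GetBlockBlobReference(_fileName);

    blob.UploadText(_content, null, null, null, null);
}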

Once all the prerequisites are completed, activate your required tables.

Reference screenshot. 

  







Once all tables are in the Running state, select all of them and click on the highlighted menu item.

This is the customization I built to meet my requirement of getting the complete table schema, including primary keys and indexes.
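For context, here is a minimal X++ sketch of how index metadata can be read through the Dict* reflection API so it can be turned into key and index scripts; the method name and output format are assumptions, and the actual customization may be structured differently.

// Minimal sketch: list the indexes of a table with their fields and uniqueness,
// which is the information needed to script keys and indexes on the SQL side.
public str buildIndexInfo(TableId _tableId)
{
    DictTable dictTable = new DictTable(_tableId);
    DictIndex dictIndex;
    str result;
    str fieldList;
    int i;
    int f;

    for (i = 1; i <= dictTable.indexCnt(); i++)
    {
        dictIndex = new DictIndex(_tableId, dictTable.indexCnt2Id(i));
        fieldList = '';

        for (f = 1; f <= dictIndex.numberOfFields(); f++)
        {
            fieldList += (fieldList == '' ? '' : ', ') + fieldId2Name(_tableId, dictIndex.field(f));
        }

        result += strFmt('%1 (%2), unique: %3\n',
            dictIndex.name(), fieldList, dictIndex.allowDuplicates() ? 'No' : 'Yes');
    }

    return result;
}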

The second step of the activity generates the CSV file that is used in Azure Data Factory and contains the complete folder hierarchy path for every table.

Reference screenshot





Before executing our customization, you will see the following OOTB folders in your data lake. Please run the customization in batch mode.

Reference screenshot
After executing our customization, you will see two additional folders.

  • SQLScript.
  • EntityMapping
Reference screenshot.

SQLScript

As mentioned earlier, this folder contains the SQL table schemas that Azure Data Factory uses to create the tables at runtime if they do not exist.

File preview




EntityMapping


The entity mapping file is also used by Azure Data Factory to read the CSV files through the Common Data Model connector.

File preview




Now the Dynamics 365 Finance & Operations side activities are completed.

Let's jump to the next step.

SQL Server 

On the SQL Server side, there are only the following three steps.
  • Create an Azure SQL database.
  • Create a new schema named [DL], as it is used throughout the complete solution.
  • Create the following stored procedures in your database. You can download the stored procedures using this link.
Important
  • Because we are using a data flow with the Common Data Model connector, which doesn't support the self-hosted integration runtime, please use the Azure integration runtime.
  • Some dimension-related tables have an index on the hash key, and Microsoft doesn't export the hash information, so you need to disable those indexes manually; otherwise, you may face index violation issues.

I would recommend reviewing the stored procedures for technical understanding; you can modify them according to your requirements.

Azure Data Factory 

I would recommend reviewing the prerequisites and completing them before importing our Azure Data Factory source code as a template using this link.

Prerequisites


Linked Services.

Create two linked services in your Azure Data Factory: one for Azure SQL communication and the second for Azure Data Lake communication.

Now you can import the Azure Data Factory template into your data factory.

Process Flow Diagram

Below is the Azure Data Factory flow diagram that is used for both types of data load (full and incremental).


Steps

In the first step, the pipeline sets the current execution time into a variable.
In the second step, a separate activity "SequencePipeLine" is called within the main activity.

SequencePipeLine

In the sequence pipeline activity, Azure Data Factory loads the file of sequences (from the SQLScript folder in the data lake) that are used by the schema tables and creates them in the SQL database.



In the third step, another separate activity "SQLDatabaseSchema" is called within the main activity.

SQLDatabaseSchema

In the database schema pipeline activity, Azure Data Factory loads the table schema files (from the SQLScript folder in the data lake) and creates the tables, keys, and indexes in the database if they do not exist.

The [EntitiesDictionary] part calls a stored procedure that creates two additional tables in your database and loads one of them with all the table names, using a default last sync date of 01-01-1900.

The last part of this pipeline loads the manifest parameter file, which we generated during the D365FO activities, from the Azure Data Lake into your database.




In the next step, the main activity executes two parallel pipelines: [FullLoadPipeLine] and [IncrementalPipeLine].

FullLoadPipeLine & IncrementalPipeLine

In both pipelines, three parallel activities are called; all of them call the stored procedure with different parameter values and process different sets of tables.




A data flow activity is called in every ForEach loop activity. The data flow has a few parameters, shown below, that need to be provided; all of this information is available as an object in each ForEach loop iteration.

After the data flow, the next step is to mark the last execution date in our dictionary table.





Let's discuss the data flow here.

We have two separate data flows for full load and for incremental.

In both data flows, there are two additional transformation steps (a derived column and a select of records), but the incremental data flow has three more steps to manage change tracking and avoid duplicate records.

I would suggest looking at both data flows for a better understanding.

Reference screenshot of full load data flow



Reference screenshot of incremental load data flow





Important

The following features are not included in this Azure Data Factory solution.

  • Retry logic 
    • The retry logic is important because you can have a connectivity issue for any technical reason.
  • Email notification 

Downloadable links



Please feel free to connect with me if you run into an issue during the implementation of the solution, or if you want to suggest an improvement.

Support Faryal's Cusine


Default User for SharePoint Document Management

 

 SharePoint Document Management 



Problem statement

In this article, we will cover implementing a default user to upload documents to the SharePoint repository.

In OOTB Dynamics 365 Finance & Operations, you need to grant upload rights on the SharePoint folder to every user so that he/she can upload documents to SharePoint.

What if your documents are sensitive and you want to hide one user's documents from another?

You can implement XDS security on the D365FO side, but what about SharePoint? If a user can log in to SharePoint directly, then he/she can see the documents of all other users.

Solution

In D365FO, I introduced a feature to set a default user that uploads documents to the SharePoint repository on behalf of all D365FO users.

At the end of the article, you will find the GitHub link to download the project.

Reference Screenshot


Before jumping directly to the solution, we first need to understand how the application makes the proxy connection.


D365FO makes the proxy connection using your external identity, and the external identity is populated on the user record only after the user has logged in to D365FO at least once.
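To illustrate the idea (this is a sketch, not the exact code in the downloadable project), the default user could be resolved roughly as shown below; the field name AVIDefaultSharePointUser and the method name are assumptions made for this example.

// Minimal sketch: resolve the SharePoint proxy user from the document management
// parameters, falling back to the current user when no default user is configured.
// AVIDefaultSharePointUser is an assumed custom field added by the DocuParameters
// table extension; the real project may name it differently.
[ExtensionOf(classStr(DocGlobalHelper))]
final class DocGlobalHelper_AVI_Extension
{
    public static UserId aviSharePointProxyUser()
    {
        DocuParameters parameters = DocuParameters::find();

        if (parameters.AVIDefaultSharePointUser != '')
        {
            return parameters.AVIDefaultSharePointUser;
        }

        return curUserId();
    }
}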

Which classes and forms does D365FO use to upload documents to SharePoint?

The following form and classes are used by Dynamics 365 Finance & Operations to upload documents to SharePoint.

Form
  • DocuParameters
Classes
  • DocGlobalHelper
  • ExportToExcelStorageHelper
  • OfficeSharePointFolderSelection
  • Docu
  • CollabsitesharepointonlineProxy
  • CollabsiteSPonlineTemplateManager
  • DocDocuActionFileWithMetadata
  • DocSharePointHelper
  • DocSharePointHelperV2

What are the Implementation steps?

First, download the solution from this Download link.
Import the project into your solution.
The solution contains the following elements.
Form
  • Extension of DocuParameters
  • Extension of DocuView
Table
  • Extension of DocuParameter(Table)
Classes
  • Extension of xUserInfo 
  • Extension of DocGlobalHelper
  • Extension of DocuValue
  • New File - DownlaodSharePointFile
Menu Item
  • Download SharePoint
Now perform a build and synchronize.
Now create a separate user for document management, e.g. D365SharePointDoc@Yourdomain.com.
Create the SharePoint folder hierarchy for document management.
Grant permissions and rights to the newly created user on SharePoint and the target folder.
Revoke the rights of other users from the SharePoint document folder.
Set the user as the default in the document management parameters form (reference image available at the top).
Check the SharePoint connection.

When a user uploads a document to SharePoint, you will see the name of your default user.

Reference screenshot




But at the database level, you will see the name of the user who actually uploaded the document.

Reference screenshot





Important 
If you are using any custom solution for attaching and uploading documents, then perform proper testing at your end.

I have developed and tested the solution on 10.0.24.

Support Faryal's Cusine


File attachment virus security

 

File attachment virus security Dynamics 365 Finance & Operations

 

It is a normal scenario in every implementation that business users are allowed to attach any document to any record using the existing attachment feature.

 

What if a business user, intentionally or unintentionally, uploads a virus-infected file, or a virus disguised as a document?

It could have a heavy impact if another user opens the infected attachment and the network or machine is not properly secured with antivirus software.

 

How can you reduce the chances of being impacted by an attachment?

Options

  1. Allow only a limited set of file types.
  2. Scan the document while uploading.

 

 

Only a limited set of document types should be allowed to be uploaded or attached.

Go to

Organization administration > Document management > Document management parameters > File types

Please do not allow files with the "EXE" extension to be uploaded or attached.



Scanning uploaded files for viruses and malicious code

 

The second option is scanning the file as a prerequisite of the file upload.

A delegate has been available in Dynamics 365 Finance & Operations since version 10.0.12, which you can subscribe to in order to implement your own file scanning security.


You can subscribe to any online antivirus service and implement your logic in the code below.


public final class ScanDocuments
{
    [SubscribesTo(classStr(FileUploadResultBase), staticDelegateStr(FileUploadResultBase, delegateScanStream))]
    public static void FileUploadResultBase_delegateScanStream(System.IO.Stream _stream, EventHandlerRejectResult _validationResult)
    {
        if (!ScanDocuments::scanStream(_stream))
        {
            _validationResult.reject();
        }
    }

    private static boolean scanStream(System.IO.Stream _stream)
    {
        /*
        Custom implementation is required for connecting to a scanning service.
        If the document scanning process finds an issue, return false; otherwise, return true.
        */
        return true;
    }
}

Support Faryal's Cusine


RecordInsertList Dynamics 365 Finance & Operations

 

RecordInsertList Dynamics 365 Finance & Operations



A short code example of how you can create records in a table in a very performant way using RecordInsertList. If you do not know RecordInsertList, you can learn more about it here.

static void HowToUseRecordInsertList(Args _args)
{
    DMOPerfTest DMOPerfTest;
    RecordInsertList RecordInsertList;
    Counter c;
    FromTime fromTime = timeNow();
    
    RecordInsertList = new RecordInsertList(tableNum(DMOPerfTest));
    
    for (c=1;c<=10000;c++)
    {
        DMOPerfTest.clear();    
        DMOPerfTest.AccountNum = int2str(c);
        
        if(DMOPerfTest.validateWrite())
        {
            RecordInsertList.add(DMOPerfTest);
        }
    }
    
    RecordInsertList.insertDatabase();
    
    info(strFmt("Total time consumed: %1", timeConsumed(fromTime, timeNow())));
}
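As a side note, the RecordInsertList constructor also accepts parameters that let you skip insert-time logic for even better throughput; the call below is a sketch based on the standard constructor signature, so verify it against your version before relying on it.

// Skip insert() methods, the database log, and events for maximum throughput.
// Only do this when you are certain no insert-time logic is required.
RecordInsertList fastList = new RecordInsertList(
    tableNum(DMOPerfTest),
    true,   // skipInsertMethod
    true,   // skipDatabaseLog
    true,   // skipEvents
    false,  // skipAosValidation
    false); // skipDataMethods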

Support Faryal's Cusine


Data Management Project Using X++

 

Data Management Project Using X++


Using the following code, you can import a file and execute a data management project from your X++ code.

In the code, I am creating the file at runtime and uploading it into temporary storage for demo purposes, but you can use the file uploader as per your requirements.


Code

class AD_Test
{
    /// <summary>
    /// Runs the class with the specified arguments.
    /// </summary>
    /// <param name = "_args">The specified arguments.</param>
    public static void main(Args _args)
    {
        FileUploadTemporaryStorageStrategy fileUploadStrategyInstance;
        DMFDefinitionGroupEntity    definitionGroupEntityBuffer;
        DMFEntity                   dmfEntitybuffer;
        FileUploadTemporaryStorageResult result;
        SysDictClass uploadStrategyDictClassInstance;
        str entityName = 'VendVendorGroupEntity';
        System.IO.StreamWriter swriter;
        System.IO.MemoryStream   fileStream;
        DMFExecutionId executionId;
        #File



        fileStream = new System.IO.MemoryStream();
        swriter = new System.IO.StreamWriter(fileStream);
        swriter.WriteLine("VENDORGROUPID,DESCRIPTION");
        swriter.WriteLine(strFmt("%1,%2",'Ex-VendGroup','Example Vendor group'));
        swriter.Flush();

        uploadStrategyDictClassInstance = new SysDictClass(className2Id('FileUploadTemporaryStorageStrategy'));
        fileUploadStrategyInstance = uploadStrategyDictClassInstance.makeObject() as FileUploadTemporaryStorageStrategy;
        // Rewind the stream before uploading so the upload starts at the beginning of the content.
        fileStream.Seek(0, System.IO.SeekOrigin::Begin);
        result = fileUploadStrategyInstance.uploadFile(fileStream, 'VendorsGroups.csv');
     
        fileStream.Dispose();

        CodeAccessPermission::revertAssert();

        executionId = DMFUtil::setupNewExecution('ImportVendorGroup');

        select firstonly  Entity from definitionGroupEntityBuffer exists join dmfEntitybuffer
            where definitionGroupEntityBuffer.DefinitionGroup == 'ImportVendorGroup' &&
                dmfEntitybuffer.EntityName == definitionGroupEntityBuffer.Entity &&
            dmfEntitybuffer.TargetEntity == entityName ;

       
        DMFDefinitionGroupExecution executionBuffer = DMFDefinitionGroupExecution::find('ImportVendorGroup', definitionGroupEntityBuffer.Entity, executionId, true);
        ttsbegin;
        executionBuffer.FilePath = result.getFileId();
        executionBuffer.IsTransformed = NoYes::Yes;
        executionBuffer.update();
        ttscommit;

        DMFQuickImportExport::doPGImport('ImportVendorGroup', executionId, true);
    }

}

Support Faryal's Cusine


Existing Index Changes D365FO

 

Change existing Index D365FO

 

We have all had the requirement in the past to change an existing index, but since customization (overlayering) of OOTB elements is not allowed, we couldn't achieve this directly.

 

How can we achieve it?

 

In D365FO, we cannot delete or disable an existing OOTB index in metadata, but there is one way to achieve this.

We can disable the index at the SQL level by using a SQL script, even in production.

 

Yes, you read that correctly: we can disable the index, and it will not be enabled again during package deployment. Please only disable the index, do not drop it; otherwise, the Microsoft deployment classes will re-create it during deployment.

Now you can create the new index as per your requirements; it will work properly and will not conflict with the existing index, because that one is already disabled.

How can you disable the index?

 

public void disableIndex(str _tableName, str _indexName)
{
    str sql;
    Connection conn;
    SqlStatementExecutePermission permission;

    sql = strFmt("ALTER INDEX %1 ON %2 DISABLE;", _indexName, _tableName);

    permission = new SqlStatementExecutePermission(sql);
    permission.assert();

    conn = new Connection();
    conn.createStatement().executeUpdate(sql);

    // The permissions need to be reverted back to the original condition.
    CodeAccessPermission::revertAssert();
}
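For example, called from another method of the same class, the usage could look like this; both the table name and the index name below are placeholders, not real recommendations.

// Hypothetical call: disable an assumed obsolete index so a replacement index can be added in an extension.
this.disableIndex(tableStr(CustTable), 'MyObsoleteIdx');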


Important

Before applying the new index, please analyze your existing data, as it is possible that you will face issues (for example, existing data that violates a new unique index).

Support Faryal's Cusine


Mandatory Field with user Personalization


Mandatory Field with user Personalization

This feature is available in version 10.0.12, and you can enable it from Feature management.



We often receive requests from customers who, for internal control purposes, want to make a few fields mandatory on a particular form.

Earlier we had to involve the technical team to make a field mandatory, but that is no longer required. End users themselves can make a field mandatory based on their requirements.


Here are the steps you can follow to achieve this requirement.

Step-1

Go to Feature management and enable the feature as shown below.


Step-2 

Go to the form on which you want to make the field mandatory.



Step-3 Right-click > Personalize field > select the Required checkbox option




Step-4 

You will get a message like the one below when you try to save the record without entering a value in the field.






Support Faryal's Cusine


Virtual Fields Vs Computed Fields

  Virtual Field: A virtual field in D365FO is a field that doesn't have a direct representation in the database. It's a field that y...