
Data Task Automation In Simple Steps

  

Data Task Automation

Data task automation lets you easily repeat many types of data tasks and validate the outcome of each task. It is especially useful for projects in the implementation phase. For example, you can automate the creation and configuration of data projects.

 

Scenario: Moving a legal entity from a golden configuration environment to production.

A customer of one of my previous companies was already live with several legal entities, and every time they needed a new company on production (or any other environment) they spent a lot of time configuring it manually.

 

So I suggested they use Data task automation. The feature is easy to use: you identify the data entities and their execution sequence once, and then you can run the same package against every environment to configure a new legal entity.

Instructions

 

Identify the Data entities.

Identify their execution sequence

Data packages

Manifest configuration

Package Execution on the Source environment

Package Data Cleansing

Package Execution on Target environment

 

 

Identify the Data Entities

            We begin by identifying all the data entities involved in our out-of-the-box and custom setup configuration, incorporating any missing fields, and creating new data entities as required.

Identify their execution sequence

            You need to make sure the execution sequence of the data entities is correct; otherwise you will run into data dependency issues during package execution. For example, the Chart of accounts entity must be imported before Main account.

Data packages

On your development server, navigate to the Data management framework (DMF) and create the export data packages. The grouping is entirely up to you: you can create a single package with all data entities, or segregate them into separate packages per module, or however you prefer.

Download the packages you just created and upload them to LCS under Asset library > Data packages.

Manifest configuration

The next step is to configure the manifest. In this article we will create two manifests: one to export the data and a second to import it.

Sample manifest

<?xml version='1.0' encoding='utf-8'?>
<TestManifest name='Data management demo data set up'>
    <SharedSetup>
        <JobDefinition ID='ImportJobDefinition_1' />
        <EntitySetup ID='Generic' />
    </SharedSetup>
    <TestGroup />
</TestManifest>

 

Manifest Structure

Shared setup

The shared setup section defines general task parameters and behaviors for all tasks in the manifest.

Data files

Here we define the information for the packages we created earlier and uploaded to the LCS project asset library; we can also pick the data package from the Shared asset library.

Shared asset library XML sample

<DataFile ID='SharedLibrary' name='Demo data-7.3-100-System and Shared' assetType='Data package' lcsProjectId=''/>

Project asset library XML sample

<DataFile ID='ProjectSetup' name='Your Package Name Shared' assetType='Data package' lcsProjectId='X1234654'/>

The lcsProjectId value is the only difference; it tells Data task automation whether to pick the file from the Shared asset library (blank) or from the project asset library (a project ID).

 

We can have multiple data files in the same manifest, each targeting a different package.
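As a sketch, two data files targeting different packages in one shared setup could look like this (the IDs and package names here are placeholders, not taken from the original setup):

```xml
<SharedSetup>
    <!-- Picked from the Shared asset library (blank lcsProjectId) -->
    <DataFile ID='FinanceSetup' name='Package_GeneralJournal' assetType='Data package' lcsProjectId=''/>
    <!-- Picked from the project asset library (lcsProjectId set) -->
    <DataFile ID='BankSetup' name='Package_CashBank' assetType='Data package' lcsProjectId='X1234654'/>
</SharedSetup>
```

Each test case then references the data file it needs through the DataFile RefID element.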

 

 

 

 

Job Definition

Sample XML

<JobDefinition ID='ImportJobDefinition_1'>
    <Operation>Import</Operation>
    <ConfigurationOnly>No</ConfigurationOnly>
    <Truncate></Truncate>
    <Mode>Import async</Mode>
    <BatchFrequencyInMinutes>1</BatchFrequencyInMinutes>
    <NumberOfTimesToRunBatch>2</NumberOfTimesToRunBatch>
    <UploadFrequencyInSeconds>1</UploadFrequencyInSeconds>
    <TotalNumberOfTimesToUploadFile>1</TotalNumberOfTimesToUploadFile>
    <SupportedDataSourceType>Package</SupportedDataSourceType>
    <ProcessMessagesInOrder>No</ProcessMessagesInOrder>
    <PreventUploadWhenZeroRecords>No</PreventUploadWhenZeroRecords>
    <UseCompanyFromMessage>Yes</UseCompanyFromMessage>
    <LegalEntity>DAT</LegalEntity>
    <PackageAPIExecute>true</PackageAPIExecute>
    <PackageAPIOverwrite>false</PackageAPIOverwrite>
    <PackageAPIReexecute>false</PackageAPIReexecute>
    <DefinitionGroupID>TestExport</DefinitionGroupID>
    <PackageName>TestExportPackage</PackageName>
</JobDefinition>

 

Job Definition ID

You can write any unique name here as Job Definition Id.

 

Operation

Here we define the operation type, either import or export.

 

Configuration Only

This controls whether you only want to configure the data project, or configure and execute it. Set the value to “No” to configure and execute, or “Yes” to only configure the project.

 

Truncate

This field is always set to “No”.
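For the export manifest, the job definition follows the same shape; a minimal sketch, assuming the same element names as the import sample above (the ID and names are placeholders):

```xml
<JobDefinition ID='ExportJobDefinition_1'>
    <Operation>Export</Operation>
    <ConfigurationOnly>No</ConfigurationOnly>
    <DefinitionGroupID>TestExport</DefinitionGroupID>
    <PackageName>TestExportPackage</PackageName>
</JobDefinition>
```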

 

 

 

Test Group

We can have multiple test groups in the same manifest file.

Sample XML

<TestGroup name='Set up Financials'>
    <TestCase Title='Import shared set up data package' ID='3933885' RepeatCount='1' TraceParser='off' TimeOut='20'>
        <DataFile RefID='ProjectSetup' />
        <JobDefinition RefID='ImportJobDefinition_1' />
        <EntitySetup RefID='Generic' />
    </TestCase>
</TestGroup>

Name:

A unique name for the test group.

Title:

The data task automation process uses the title to name the data packages it creates during execution.

Repeat Count:

How many times this section should be processed in one execution.

Time-Out:

The maximum time the framework will wait for the package to finish executing.
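Putting the shared setup and the test group together, a complete import manifest skeleton could look like the following (all IDs and names are placeholders; adjust them to your own packages):

```xml
<?xml version='1.0' encoding='utf-8'?>
<TestManifest name='Legal entity configuration'>
    <SharedSetup>
        <DataFile ID='ProjectSetup' name='Package_GeneralJournal' assetType='Data package' lcsProjectId='X1234654'/>
        <JobDefinition ID='ImportJobDefinition_1'>
            <Operation>Import</Operation>
            <ConfigurationOnly>No</ConfigurationOnly>
        </JobDefinition>
        <EntitySetup ID='Generic' />
    </SharedSetup>
    <TestGroup name='Set up Financials'>
        <TestCase Title='Import set up data package' ID='1' RepeatCount='1' TraceParser='off' TimeOut='20'>
            <DataFile RefID='ProjectSetup' />
            <JobDefinition RefID='ImportJobDefinition_1' />
            <EntitySetup RefID='Generic' />
        </TestCase>
    </TestGroup>
</TestManifest>
```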

 

Package Execution on the Source environment

 

Once your export manifest is complete, upload it on the source environment. Data task automation will create tasks and use the data packages you uploaded earlier to LCS to build the data export packages on your source server.

 

In my scenario, I have created and uploaded these packages on LCS.

Packages:

·        Package_AccountPayable

·        Package_AccountReceivable

·        Package_CashBank

·        Package_CreditCollection

·        Package_ExpenseManagement

·        Package_FixedAssets

·        Package_GeneralJournal

·        Package_Inventory

·        Package_ProcurmentSourcing

·        Package_ProjectManagement

·        Package_SysAdmin

·        Package_TaxAuthority

 

 

Reference Screenshot:

 

Generate Data Source

In order to generate the data source files, we need to upload our export manifest file in the Data Task Automation framework.

Pre-Requisite

Before uploading the manifest file to start the process, go to the number sequence screen and change the property that allows user changes from “No” to “Yes”.

Make sure to revert this change in both environments after the activity is complete.

Reference screenshot:


Steps to Generate Data Source

Step-1:

Log in to your source machine, navigate to the Data management workspace, and click the Data task automation tile.

Reference screenshot:



 

Step-2:

Upload the Manifest file on the source machine.

Reference screenshot:


 

After a few seconds, 12 separate export tasks will appear on your screen; if they do not appear automatically, refresh the screen.

Reference screenshot:



Step-3:

Execute the tasks either one by one or all at once. The application will follow the defined sequence, generating and exporting the data in the given order.

The overall export will take 20 to 30 minutes, depending on the volume of data.

Reference Screenshot:



 

Step-4:

Please download all these packages and add the prefix of the legal entity you are planning to move from source to destination.



Data Cleansing

Here we need to perform some manual data cleansing, because some of the entities export cross-company information.

Step-5:

Remove the other legal entities' information from the files one by one (if any); otherwise, data task automation will override the state of those entities' data, which may create problems for us. So it is better to perform the data cleansing at file level.

In my case, I performed the data cleansing from the following files.

·        Account structure activation

·        Account structure allowed values

·        Account structures

·        Accounts for automatic transactions

·        Chart of accounts

·        Fiscal calendar integration entity

·        Fiscal calendar period

·        Fiscal calendar

·        Ledger fiscal calendar period

·        Ledger fiscal calendar year

·        Ledger settlement accounts

·        Ledger

·        Main account

·        Number sequence code V2

·        Number sequence references V2

Data Import Steps

Step-6:

Set the prefix of your legal entity name on every folder you downloaded in step 4.



 

Step-7:

Now upload the data packages to LCS as we did at the start.

Step-8:

Zip the folder again and upload it to LCS.



 

Step-9:

Now log in to the target environment and upload the manifest after making a small change in the XML, such as updating the name in the DataFile section.

Before Change



After change



 

 

Before change



After change



 

Step-10:

Navigate to the Data task automation screen of the destination server and upload the manifest file as we did earlier for the export.

All the tasks will appear on your screen within a few seconds, or you can refresh the screen manually.

The sequence of execution in my case

Please execute the task using the following sequence.

General Journal Module 
Cash & Bank
Tax Authority
Expense Management
Fixed Assets

Product Information (product hierarchies and product categories are global setups, so we don't need to execute this step). We can release the products using the out-of-the-box job, which is faster than the import process.

Credit & Collection
Inventory
Account Receivable
Account Payable
Project Management
System Administration

 

Important

If any new financial dimensions are introduced, I suggest creating them manually on the target system and activating them before starting execution of the packages.


Support Faryal's Cusine


Skip/Bypass validation in Data Entity Import – D365FO

 




I had a scenario where I needed to create the tax exempt number dynamically when importing vendor data.

For instance, if the given VatNum does not exist in the input data, the system should bypass the validation and create the vendor without any error.


So, how do we bypass this validation? Using a Chain of Command (CoC) extension of the persistEntity method.

public void persistEntity(DataEntityRuntimeContext _entityCtx)
{
    // Set the skip flag before persisting, so VatNum validation
    // is bypassed when the record is written
    this.skipDataSourceValidateField(fieldNum(VendVendorsV2, VatNum), true);

    next persistEntity(_entityCtx);
}


