Data Management Project Using X++
Multi-thread Imports in Dynamics 365 for Finance & Operations
There is some confusion about how to multi-thread your imports in Dynamics 365 for
Finance and Operations. The first thing to know is that Microsoft prevents
you from multi-threading certain entities, which can be a good thing. In
AX 2012 you could multi-thread any entity, and if the order of the records
being imported matters, multi-threading is not a good option: if records go
out of sequence because they are imported in parallel, you can corrupt
your data.
The second thing to know is how to set up multi-threading in Dynamics 365FO. To do this
you go to Workspaces > Data management > click
the Framework parameters tile > Entity settings tab > click Configure entity
execution parameters.
In the entity execution parameters form, you define how many threads should be
used when you import an entity in batch. Note that you can
set up the same entity multiple times in this form, as seen below.
You define three things in the form:
Entity – the entity you are setting up multi-threading for.
Import threshold – how many records must be imported before this row is used.
Import task count – how many threads should be used; i.e., how many batch tasks are
created. Example: let's walk through importing the
"Customer definition" entity with 1360 records.
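The allocation above can be sketched as: if the record count meets the import threshold, the framework splits the import into the configured number of tasks. The threshold and task-count values below are hypothetical examples, not product defaults:

```python
# Sketch of how the entity execution parameters drive parallel import.
# Threshold and task count here are hypothetical example values.

def plan_import_tasks(record_count: int, import_threshold: int, import_task_count: int) -> int:
    """Return the number of batch tasks an import would be split into."""
    if record_count < import_threshold:
        return 1  # below the threshold the import stays single-threaded
    return import_task_count

# Example: 1360 "Customer definition" records, threshold 1000, 8 tasks
tasks = plan_import_tasks(1360, 1000, 8)
print(tasks)          # 8 parallel tasks
print(1360 // tasks)  # roughly 170 records per task
```

With a threshold of 1000, the 1360-record import qualifies for parallel execution; a 500-record import would still run on a single thread.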
Next Actions
Create an import project, add the entity you configured earlier in the execution parameters form, and import in batch mode.
To verify multi-threading, go to the Batch job screen, find your job, and open the View tasks screen; you will see multiple tasks for your job.
Support Faryal's Cusine
Data Task Automation In Simple Steps
Data Task Automation
Data task automation lets you easily repeat many types of
data tasks and validate the outcome of each task. Data task automation is very
useful for projects that are in the implementation phase. For example, you can
automate the creation and configuration of data projects.
Scenario: move a legal entity from the golden configuration
environment to production.
One of my previous customers was already live with
several legal entities, and every time they spent a lot of effort configuring a new company on production or any other environment manually.
So I suggested they use data task automation. This
feature is easy to use: you identify the data entities
and their execution sequence once, and then you can execute the same package
on every environment to configure the new legal entity.
Instructions
Identify the Data entities.
Identify their execution sequence
Data packages
Manifest configuration
Package Execution on the Source environment
Package Data Cleansing
Package Execution on Target environment
Identify the Data Entities
We begin by identifying all the
data entities involved in our OOTB/custom setup configuration,
incorporating any missing fields and creating new data entities as
required.
Identify their execution sequence
You need to make sure the execution sequence
of the data entities is correct; otherwise you will have data dependency issues
during execution of the package.
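One way to sanity-check a proposed sequence is to model the entities as a dependency graph and derive an order from it. The entity names and dependencies below are made-up illustrations, not the framework's own logic:

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Hypothetical dependencies: each entity maps to the entities it needs first.
dependencies = {
    "Customer groups": set(),
    "Terms of payment": set(),
    "Customers": {"Customer groups", "Terms of payment"},
    "Sales orders": {"Customers"},
}

# static_order() yields an order where every dependency precedes its
# dependents, and raises CycleError on a circular dependency.
order = list(TopologicalSorter(dependencies).static_order())
print(order)
```

If two setups depend on each other, the sorter raises an error, which is exactly the situation you want to catch before building the packages.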
Data packages
On your
developer server, navigate to the DMF and create the export data packages.
The approach is entirely up to you: you can create a single package with all
data entities, or segregate them into different packages per module or however
you prefer.
Download the
packages you just created and upload them to LCS > Asset library > Data
packages section.
Manifest configuration
The next step is to configure the manifest. In this article we will create two manifests: one
to export the data and a second to import the data.
Sample manifest
<?xml version='1.0' encoding='utf-8'?>
<TestManifest name='Data management demo data set up'>
    <SharedSetup>
        <JobDefinition ID='ImportJobDefinition_1' />
        <EntitySetup ID='Generic' />
    </SharedSetup>
    <TestGroup />
</TestManifest>
Manifest Structure
Shared setup
The shared setup section defines general task parameters and behaviors for all
tasks in the manifest.
Data files
Here
we define the packages information, which we create earlier and uploaded to the
LCS project assets library, we can the data package from Shared asset library.
Shared asset library XML sample
<DataFile ID='SharedLibrary' name='Demo data-7.3-100-System and Shared' assetType='Data package' lcsProjectId=''/>
Project asset library XML sample
<DataFile ID='ProjectSetup' name='Your Package Name Shared' assetType='Data package' lcsProjectId='X1234654'/>
The LCS project ID is the only difference; it tells data task automation whether to pick the file from the shared asset library (blank value) or from a specific project's asset library.
We
can have multiple data files in the same manifest, each targeting a
different package.
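As an illustration, a manifest that targets two packages can simply list two DataFile elements inside the shared setup; the IDs and package names below are invented for the example:

```xml
<SharedSetup>
    <!-- Both entries are hypothetical; a blank lcsProjectId would mean the shared asset library -->
    <DataFile ID='FinanceSetup' name='Package_GeneralJournal' assetType='Data package' lcsProjectId='X1234654'/>
    <DataFile ID='InventorySetup' name='Package_Inventory' assetType='Data package' lcsProjectId='X1234654'/>
</SharedSetup>
```

A test case can then reference either file by its RefID.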
Job Definition
Sample XML
<JobDefinition ID='ImportJobDefinition_1'>
    <Operation>Import</Operation>
    <ConfigurationOnly>No</ConfigurationOnly>
    <Truncate></Truncate>
    <Mode>Import async</Mode>
    <BatchFrequencyInMinutes>1</BatchFrequencyInMinutes>
    <NumberOfTimesToRunBatch>2</NumberOfTimesToRunBatch>
    <UploadFrequencyInSeconds>1</UploadFrequencyInSeconds>
    <TotalNumberOfTimesToUploadFile>1</TotalNumberOfTimesToUploadFile>
    <SupportedDataSourceType>Package</SupportedDataSourceType>
    <ProcessMessagesInOrder>No</ProcessMessagesInOrder>
    <PreventUploadWhenZeroRecords>No</PreventUploadWhenZeroRecords>
    <UseCompanyFromMessage>Yes</UseCompanyFromMessage>
    <LegalEntity>DAT</LegalEntity>
    <PackageAPIExecute>true</PackageAPIExecute>
    <PackageAPIOverwrite>false</PackageAPIOverwrite>
    <PackageAPIReexecute>false</PackageAPIReexecute>
    <DefinitionGroupID>TestExport</DefinitionGroupID>
    <PackageName>TestExportPackage</PackageName>
</JobDefinition>
Job Definition ID
Any unique name that identifies the job definition.
Operation
The operation type, either Import or Export.
ConfigurationOnly
Whether you only want to configure the project, or configure and execute it. Set the value to "No" to configure and execute, or "Yes" to configure the project only.
Truncate
This field should always be set to "No".
Test Group
We can have
multiple test groups in the same manifest file.
Sample XML
<TestGroup name='Set up Financials'>
    <TestCase Title='Import shared set up data package' ID='3933885' RepeatCount='1' TraceParser='off' TimeOut='20'>
        <DataFile RefID='ProjectSetup' />
        <JobDefinition RefID='ImportJobDefinition_1' />
        <EntitySetup RefID='Generic' />
    </TestCase>
</TestGroup>
Name:
Any unique name for the test group.
Title:
The data task automation process uses the title to name the data packages it creates during execution.
RepeatCount:
How many times you want to process this section in one execution.
TimeOut:
The maximum time the framework waits for the package execution to complete.
Package Execution on the Source environment
Once you have
completed your export manifest, upload it on the source environment.
Data task automation will create the tasks and use the data packages you
uploaded earlier on LCS to create the export data packages on your source
server.
In my scenario,
I created and uploaded these packages on LCS.
Packages:
· Package_AccountPayable
· Package_AccountReceivable
· Package_CashBank
· Package_CreditCollection
· Package_ExpenseManagement
· Package_FixedAssets
· Package_GeneralJournal
· Package_Inventory
· Package_ProcurmentSourcing
· Package_ProjectManagement
· Package_SysAdmin
· Package_TaxAuthority
Reference
Screenshot:
Generate Data Source
To generate the data source files, we need to
upload our export manifest file in the data task automation framework.
Pre-Requisite
Before uploading the manifest file to start the process,
go to the number sequence screen and set the property that allows user
changes from "No" to "Yes".
Make sure to revert this change on both environments after completing the activity.
Reference screenshot:
Steps to Generate Data Source
Step-1:
Log in to your source machine, navigate to the Data
management workspace, and click the Data task automation tile.
Reference screenshot:
Step-2:
Upload the Manifest file on the source machine.
Reference screenshot:
After 3 or 4 seconds, 12 separate export tasks will
appear on your screen; if they do not appear on their own, refresh the
screen.
Reference screenshot:
Step-3:
Execute the tasks either one by one, or
execute all of them at once. The application will follow the defined sequence,
generating and exporting the data according to the sequence instructions.
The overall export takes 20 to 30 minutes, depending on
the volume of the data.
Reference Screenshot:
Step-4:
Download all of these packages and add the prefix of
the legal entity you are planning to move from source to destination.
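As a small helper for this step, here is a sketch that prefixes every downloaded package file with the legal entity name. The folder layout, file names, and the "USMF" entity are assumptions for the demo:

```python
import tempfile
from pathlib import Path

def prefix_packages(folder: Path, legal_entity: str) -> list[str]:
    """Rename each downloaded package zip to '<LegalEntity>_<original name>'."""
    renamed = []
    for pkg in sorted(folder.glob("Package_*.zip")):
        target = pkg.with_name(f"{legal_entity}_{pkg.name}")
        pkg.rename(target)
        renamed.append(target.name)
    return renamed

# Demo against a throwaway folder with two dummy package files
with tempfile.TemporaryDirectory() as tmp:
    folder = Path(tmp)
    for name in ("Package_Inventory.zip", "Package_FixedAssets.zip"):
        (folder / name).touch()
    result = prefix_packages(folder, "USMF")

print(result)  # ['USMF_Package_FixedAssets.zip', 'USMF_Package_Inventory.zip']
```

Prefixing by script keeps the naming consistent across all twelve packages instead of renaming them by hand.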
Data Cleansing
Here we need to perform some manual data cleansing,
because some of the entities export cross-company information.
Step-5:
Remove the other companies' information from the files one
by one (if any); otherwise data task automation will override the state of the other
companies' information, which may create problems for us. So it is better to
perform the data cleansing at the file level.
In my case, I performed the data cleansing on the
following files.
· Account structure activation
· Account structure allowed values
· Account structures
· Accounts for automatic transactions
· Chart of accounts
· Fiscal calendar integration entity
· Fiscal calendar period
· Fiscal calendar
· Ledger fiscal calendar period
· Ledger fiscal calendar year
· Ledger settlement accounts
· Ledger
· Main account
· Number sequence code V2
· Number sequence references V2
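The file-level cleansing described in step-5 can be sketched as filtering each exported file down to the target company's rows. This assumes the data is available as CSV with a DATAAREAID-style company column; the column name and sample rows are assumptions for the example:

```python
import csv
import io

def keep_company_rows(csv_text: str, company: str, company_col: str = "DATAAREAID") -> str:
    """Return the CSV with only the rows belonging to the given company."""
    reader = csv.DictReader(io.StringIO(csv_text))
    rows = [r for r in reader if r.get(company_col, "").upper() == company.upper()]
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=reader.fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()

# Hypothetical "Main account" export containing cross-company rows
sample = "MAINACCOUNTID,DATAAREAID\n110110,USMF\n110110,DEMF\n"
cleaned = keep_company_rows(sample, "USMF")
print(cleaned)  # only the USMF row survives
```

Scripting the filter makes the cleansing repeatable if you have to regenerate the packages later.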
Data Import Steps
Step-6:
Set the prefix of your legal entity name on every folder you downloaded in step 4.
Step-7:
Now upload the data packages to LCS as we did at the
start.
Step-8:
Zip the folder again and upload it to LCS.
Step-9:
Now log in to
your target environment and upload the manifest after making a small change in
the XML, such as changing the name in the DataFile section.
Before change
After change
Step-10:
Navigate to
the Data task automation screen of the destination server and upload the manifest file as we did earlier for the export.
All the tasks will appear on your screen within 2 or
3 seconds, or you can refresh the screen manually.
The sequence of execution in my case
Please
execute the tasks in the following sequence.
General Journal Module
Cash & Bank
Tax Authority
Expense & Management
Fixed Assets
Product Information (as product hierarchies and
product categories are global-level setups, we do not need to execute this
step; we can release the products using the OOTB job, which works faster than
the import process).
Credit & Collection
Inventory
Account Receivable
Account Payable
Project Management
System Administration
Important
If any new
financial dimensions are introduced, I suggest creating them manually on the
target system and activating them before starting execution of the packages.