
Pass By Reference and Pass By Value

 

Difference Between Pass By Reference And Pass By Value?


Pass By Reference: 

In pass by reference, the address of the variable is passed to the function. Whatever changes are made to the formal parameter will also affect the actual parameter.

  1. The same memory location is used for both variables (formal and actual).
  2. It is useful when you need to return more than one value.

Pass By Value:

  1. In this method, the value of the variable is passed. Changes made to the formal parameter do not affect the actual parameter.
  2. Different memory locations are created for the two variables.
  3. A temporary variable is created on the function stack, so the original variable is not affected.

In the case of pass-by-value, a change in the sub-function causes no change in the main function, whereas in pass-by-reference a change in the sub-function changes the value in the main function.

Pass by value sends a COPY of the data stored in the variable you specify; pass by reference sends a direct link to the variable itself. So if you pass a variable by reference and then change it inside the block you passed it into, the original variable is changed. If you pass by value, the original variable cannot be changed by the block you passed it into, but you get a copy of whatever it contained at the time of the call.
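To make this concrete, here is a minimal X++ sketch (the Counter and PassDemo classes are hypothetical, for illustration only). In X++, primitive values such as int are passed by value, while class instances are passed by reference:

// Hypothetical class used to demonstrate reference semantics.
class Counter
{
    int value; // member variables are protected by default in X++

    public int parmValue(int _value = value)
    {
        value = _value;
        return value;
    }
}

class PassDemo
{
    static void tryChange(int _i, Counter _c)
    {
        _i = 100;          // changes only the local copy (pass by value)
        _c.parmValue(100); // changes the caller's object (reference semantics)
    }

    public static void main(Args _args)
    {
        int i = 1;
        Counter c = new Counter();
        c.parmValue(1);

        PassDemo::tryChange(i, c);

        // Prints: i = 1, c.value = 100
        info(strFmt("i = %1, c.value = %2", i, c.parmValue()));
    }
}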

What is AOS in AX & D365FO

 

AOS in AX & D365FO


The Microsoft Dynamics AX / D365FO (Finance & Operations) Application Object Server (AOS) is the second-tier application server in the Microsoft Dynamics AX three-tier architecture.

The AX 2012 3-tier environment is divided as follows:

  1. First Tier – Intelligent Client
  2. Second Tier – AOS
  3. Third Tier – Database Server

In D365FO, the N-tier environment is divided as follows:

  1. AOS and batch servers
  2. Database server

In an N-tier solution the database runs on its own tier, while the AOS tier handles the business logic and batch jobs/background processes across multiple instances, and serves the user interface and necessary program logic.

What is AOT in AX & D365FO

 


Define AOT in AX & D365FO



The Application Object Tree (AOT) is a tree view of all the application objects within Microsoft Dynamics AX and D365FO. The AOT contains everything you need to customize the look and functionality of a Microsoft Dynamics AX application.

In D365FO, you can find the AOT objects in the Application Explorer in Visual Studio.

Virtual Company

 

Why We Use Virtual Companies?



Virtual company accounts contain data in certain tables that is shared by any number of company accounts. This allows users to post information in one company and have it available to other companies.

D365FO Interview Question

 

D365FO Interview Questions & Answers

What is Microsoft Dynamics AX?

Answer: Link

What is the difference between Microsoft Dynamics AX2012 & D365FO?

Answer: Link

Difference Between Edit And Display Method?

Answer: Link

What Is An Index?

Answer: Link

What are OOPS concepts?

Answer: The main concepts of OOPS are as follows (see the X++ sketch below the list):

  • Data abstraction: showing only the essential information and hiding the background details.
  • Encapsulation: wrapping data members and methods into a single unit.
  • Inheritance: the flow of properties from a parent class to a child class.
  • Polymorphism: using the same method name to perform different things.
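A minimal X++ sketch of these concepts (the Animal and Dog classes are hypothetical):

// Encapsulation: state is hidden behind methods.
class Animal
{
    str name; // member variables are not visible outside the class

    public void setName(str _name)
    {
        name = _name;
    }

    // Abstraction: callers see what an animal does, not how.
    public str speak()
    {
        return 'generic sound';
    }
}

// Inheritance: Dog receives Animal's members.
class Dog extends Animal
{
    // Polymorphism: the same method name does something different.
    public str speak()
    {
        return 'Woof';
    }
}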

Differentiate Refresh(), Reread(), Research(), ExecuteQuery()?

Answer: Link

Why We Use Virtual Companies?

Answer: Link

What is AOT in AX & D365FO?

Answer: Link

What is AOS in AX & D365FO?

Answer: Link

Interface VS Abstract?

Answer: Link

How many types of keys do tables have?

Answer: Link
  • Replacement Key
  • Alternate key
  • Surrogate key

What is the purpose of the User Interface Builder Class?

Answer: Link

Overloading vs Overriding?

Answer: Link

What is the concept of Extension?

Answer: Link

Define Recordset Operations and types.

Answer: Link

We have three types of set-based (bulk) operations, as the sketch after this list shows:
  • insert_recordset
  • update_recordset
  • delete_from
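A hedged sketch of the three operators, assuming a hypothetical table MyTmpCust with AccountNum and CustGroup fields:

MyTmpCust myTmpCust;
CustTable custTable;

// insert_recordset copies all matching rows in one set-based operation.
insert_recordset myTmpCust (AccountNum, CustGroup)
    select AccountNum, CustGroup from custTable
        where custTable.CustGroup == '10';

// update_recordset changes rows set-based, without a row-by-row loop.
update_recordset myTmpCust
    setting CustGroup = '20'
    where myTmpCust.CustGroup == '10';

// delete_from removes rows set-based.
delete_from myTmpCust
    where myTmpCust.CustGroup == '20';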

Which development environment is used in D365FO?

Answer: Visual Studio IDE.

What is conView()?
Answer: It is a global function (defined on the Global class) that can be used to view the elements of a container in a tree format.
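For example, a minimal sketch:

// Shows the container's elements, including nested containers, in a tree view.
container c = [1, 'two', [3, 'four']];
conView(c);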

What is COC (Chain of Command)?

Answer: In D365FO, Chain of Command (CoC) is mainly used to extend classes whose members have the protected access specifier; it can also be used for public classes. An extension class is declared with the [ExtensionOf()] attribute and the final keyword, and the next keyword calls the wrapped base implementation, as in the sketch below.
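A minimal CoC sketch that wraps the standard insert method of the CustTable table (the extension class name is arbitrary):

[ExtensionOf(tableStr(CustTable))]
final class MyCustTable_Extension
{
    public void insert()
    {
        // Logic that runs before the standard insert.
        next insert(); // calls the wrapped (standard) implementation
        // Logic that runs after the standard insert.
    }
}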

Difference between COC and Event Handler?

Answer: Link

Why do people prefer TempDB over InMemory tables when using them in queries for reports?

Answer: Although both are temporary table types:

TempDB
You can use a TempDB table in an SSRS report with a large amount of data without losing performance.
TempDB tables can be used in joins with other temporary tables or regular tables, as the sketch below shows.

InMemory
Slow performance with a huge number of records.
An InMemory table can't be used in a query with a join.
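A minimal sketch, assuming a hypothetical TempDB-type table TmpCustSummary with AccountNum and AmountMST fields:

TmpCustSummary tmpCustSummary;
CustTable      custTable;

// Populate the TempDB table like any other table.
tmpCustSummary.AccountNum = 'US-001';
tmpCustSummary.AmountMST  = 1000;
tmpCustSummary.insert();

// Unlike InMemory tables, TempDB tables can join regular tables.
select firstOnly custTable
    join tmpCustSummary
        where tmpCustSummary.AccountNum == custTable.AccountNum;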

How to update the cross-reference tool?

Answer: Link

What is a model in Dynamics 365 Finance & Operations?

Answer: A model is a design-time concept, for example, a warehouse management model or a project accounting model. A model always belongs to a package. A package is a deployment and compilation unit of one or more models. It includes model metadata, binaries, and other associated resources.


What is the difference between a package and a model in D365?

Answer: A model is a group of elements, such as metadata and source files, that typically constitute a distributable software solution and includes customizations of an existing solution.

A package is a deployment and compilation unit of one or more models. It includes model metadata, binaries, and other associated resources.


What is Extensible Data Security?

Answer: Link

What is the Sysoperation framework how it is different from the runbase framework?

Answer: Link

What is the difference between the configuration key & the security key?

Answer: Link

What is an intermediate language?

Answer: Link

How to Import & Export Model?

Answer: Link

How to Import & Export Project?
Answer: Link

What is Microsoft LCS?
Lifecycle Services (LCS) for Microsoft Dynamics is a collaboration portal that provides an environment and a set of regularly updated services that can help you manage the application lifecycle of your implementations of the Dynamics 365 Finance and Operations apps.

Difference between List, set, container and Map?
Answer: Link


What Is An Index?

 





A SQL index is used to retrieve data from a database quickly. Indexing a table or view is, without a doubt, one of the best ways to improve the performance of queries and applications.

A SQL index acts as a quick lookup table for records that users need to search frequently. An index is small, fast, and optimized for quick lookups, which makes it very useful for connecting relational tables and searching large tables.

SQL indexes are primarily a performance tool, so they really matter once a database gets large. SQL Server supports several types of indexes, but one of the most common is the clustered index, which is automatically created with a primary key.
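In X++ the kernel normally picks an index for you, but you can suggest one with an index hint. A minimal sketch, assuming the standard AccountIdx index on CustTable:

CustTable custTable;

// Ask the database to use the AccountIdx index for this lookup.
select firstOnly custTable
    index hint AccountIdx
    where custTable.AccountNum == 'US-001';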

Multi-thread Imports in Dynamics 365 for Finance & Operations

  



There is some confusion about how to multi-thread your imports in Dynamics 365 for Finance and Operations. The first thing to know is that Microsoft prevents you from multi-threading some entities, which can be a good thing: in AX 2012 you could multi-thread any entity, and if the order of the imported records is important, multi-threading is not a good option, because if records go out of sequence due to being imported in parallel, you can corrupt your data.

The second thing to know is how to set up multi-threading in Dynamics 365FO. To do this, go to Workspaces > Data management > select the Framework parameters tile > Entity settings tab > click Configure entity execution parameters.



In the entity execution parameters form, you define how many threads should be used when you import an entity in batch. Note that you can set up the same entity multiple times in this form, as seen below.





You're defining three things in the form:

Entity – the entity you are setting up multi-threading for.

Import threshold record count – the threshold tells the system how many records must be imported before this configuration applies.

Import task count – how many threads should be used, i.e., how many batch tasks are created.

Example: let's walk through importing the "Customer definitions" entity with 1360 records. If, say, the threshold record count is 1000 and the task count is 4, then 1360 records exceed the threshold, so the import is split across 4 parallel tasks of roughly 340 records each.

Next Actions

Create an import project, add the entity you configured earlier in the execution parameters form, and run the import in batch mode.

To verify multi-threading, go to the Batch jobs screen, find your job, and open the View tasks screen; you will find multiple threads (tasks) for your job.


Archive inventory transactions D365FO

 

Archive inventory transactions


D365FO Data Archiving


In the first quarter of 2021, Microsoft introduced the much-requested data archiving feature, although it is still not available for all tables and for the time being targets only the InventTrans table.

According to Microsoft FastTrack, archiving is a crucial topic: Microsoft knows very well that over time a client's data grows, and the client starts registering performance issues.

Let's move to the topic:


How can you turn on this feature in your environment?


Navigate to Feature management, and turn on the Inventory transactions archive feature.



Important
Once you enable the feature, you can't disable it again.


Prerequisites

Inventory transactions can be archived only when the following conditions are met:

  • The ledger period must be closed.
  • Inventory closing must be performed on or after the archive's to-period date.
  • The period must begin at least one year before the archive's from-period date.
  • There must be no inventory recalculations in progress.

Archive inventory transactions


Follow these steps to archive inventory transactions.

Go to Inventory management > Periodic tasks > Clean up > Inventory transactions archive.

The Inventory Transactions Archive page appears, displaying a list of processed records that have been archived.



To generate an inventory transaction archive, select Inventory transactions archive on the Action Pane.

Set the following fields on the Parameters FastTab of the Inventory transactions archive dialog box:

  • Choose the earliest transaction date in the closed ledger period to include in the archive.
  • Choose the most recent transaction date in the closed ledger period to include in the archive.



Set up batch processing details as needed on the Run in the background FastTab, following the regular batch job instructions for Microsoft Dynamics 365 Supply Chain Management.

Choose OK.

You get a notification asking you to confirm that you wish to continue. Read the information carefully, and if you want to continue, select Yes.

The Inventory transactions archive page shows your complete archive history. Each row in the grid displays information such as the date the archive was created, the user who created it, and its status.




D365FO Batch Job retries

 

Enable automatic retries on batch jobs


In the April/May release, Microsoft introduced a feature to retry batch jobs. Previously, if the Finance and Operations apps experienced any kind of loss of connection to Microsoft SQL Server, all running batch jobs failed, because on Azure a lost connection is almost impossible to recover from.

So Microsoft introduced an interface that allows a batch job to reset itself in case of connection loss or failure.

How can you implement it in your code?

Here is an example if you are using the RunBaseBatch class in your batch job:

class TestBatchJob extends RunBaseBatch implements BatchRetryable
{
    [Wrappable(true), Replaceable(true)] // Change to meet your customizability requirements
    public boolean isRetryable() // Use final if you want to prevent overriding
    {
        return true;
    }
}

Here is an example if you are using the SysOperationServiceController class in your batch job:

class TestBatchJob extends SysOperationServiceController implements BatchRetryable
{
    [Wrappable(true), Replaceable(true)] // Change to meet your customizability requirements
    public boolean isRetryable() // Use final if you want to prevent overriding
    {
        return true;
    }
}


Important

If you are designing a multithreaded job that adds runtime tasks, you should implement this interface on both the main controller and the task controller, as sketched below.
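A rough sketch of that pattern, assuming a hypothetical worker class TestBatchWorkerTask (a RunBaseBatch that also implements BatchRetryable) and the standard BatchHeader API:

class TestBatchMainTask extends RunBaseBatch implements BatchRetryable
{
    [Wrappable(true), Replaceable(true)]
    public boolean isRetryable()
    {
        return true;
    }

    public void run()
    {
        // Batch header of the currently running job.
        BatchHeader batchHeader = BatchHeader::getCurrentBatchHeader();
        int i;

        for (i = 1; i <= 4; i++)
        {
            // Each runtime task class must implement BatchRetryable too.
            batchHeader.addRuntimeTask(TestBatchWorkerTask::construct(),
                                       this.parmCurrentBatch().RecId);
        }

        batchHeader.save();
    }
}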


If you want to disable retries for a batch job, add your class to the Batch class configuration overrides form.


Overrides setup


Active batch periods

 


Reference link

Almost every client has demanded the ability to execute some batch job every X hours, but only after office hours. As we all know, this feature was not available in any previous version of AX/D365FO. But with the release of Platform update 21, an additional level of control over when batch jobs run is now available.

Scenario covered:

You can configure an hourly batch job 7 days a week and restrict it to execute only within a given time window.

Scenario not covered:

You configure an hourly batch job 7 days a week, but you want it to run for the entire day on weekends or holidays.

Instructions to implement the feature

Go to System administration > Setup > Active periods for batch jobs.

References screenshot



Enter the name of the batch period group, specify the start and end times during which batch jobs will be active, and save the record.

References screenshot


Now go to System administration > Inquiries > Batch jobs and find your targeted batch job.

References screenshot


Find the Active period column, click Edit, select the active period that you want to assign, and then click Save.

References screenshot



Release Product Using X++

 



You can use the code below to release a product to any legal entity using X++ or a custom batch job.

Sample Code

static void ReleaseProducts(Args _args)
{
    EcoResProduct ecoResProduct;

    // Find the product to release
    select firstOnly ecoResProduct
        where ecoResProduct.DisplayProductNumber == "7042"; // Audio system

    // Release the product to the USMF legal entity
    EcoResProductReleaseManagerBase::releaseProduct(
        ecoResProduct.RecId,
        CompanyInfo::findDataArea('USMF').RecId);
}


Data Task Automation In Simple Steps

  

Data Task Automation

Data task automation lets you easily repeat many types of data tasks and validate the outcome of each task. It is very useful for projects that are in the implementation phase. For example, you can automate the creation and configuration of data projects.

 

Scenario: moving a legal entity from the golden environment to production.

A customer at one of my previous companies was already live with several legal entities, and every time they spent a lot of effort manually configuring a new company on production or any other environment.

 

So I suggested they use Data task automation. This feature is very easy to use: you identify the data entities and their execution sequence once, and then you can execute the same package on every environment to configure the new legal entity.

Instructions

 

  1. Identify the data entities
  2. Identify their execution sequence
  3. Data packages
  4. Manifest configuration
  5. Package execution on the source environment
  6. Package data cleansing
  7. Package execution on the target environment

 

 

Identify the Data Entities

            We begin by identifying all the data entities involved in our OOTB/custom setup configuration, incorporating any missing fields, and creating new data entities as per your requirements.

Identify their execution sequence

            You need to make sure the execution sequence of the data entities is correct; otherwise you will run into data dependency issues during the execution of the package.

Data packages

On your development server, navigate to the DMF and create the export data packages. The approach is entirely up to you: you can create a single package with all the data entities, or segregate them into different packages by module or however you prefer.

Download the packages you just created and upload them to LCS > Asset library > Data packages.

Manifest configuration

The next step is to configure the manifest. In this article we will create two manifests: one to export the data and a second to import it.

Sample manifest

<?xml version='1.0' encoding='utf-8'?>
<TestManifest name='Data management demo data set up'>
    <SharedSetup>
        <JobDefinition ID='ImportJobDefinition_1' />
        <EntitySetup ID='Generic' />
    </SharedSetup>
    <TestGroup />
</TestManifest>

 

Manifest Structure

Shared setup

The shared setup section defines general task parameters and behaviors for all tasks in the manifest.

Data files

Here we define the information about the packages we created earlier and uploaded to the LCS project asset library; a data package can also be picked from the shared asset library.

Shared asset library XML sample (an empty lcsProjectId tells the framework to read the package from the shared asset library):

<DataFile ID='SharedLibrary' name='Demo data-7.3-100-System and Shared' assetType='Data package' lcsProjectId=''/>

Project asset library XML sample:

<DataFile ID='ProjectSetup' name='Your Package Name Shared' assetType='Data package' lcsProjectId='X1234654'/>

The LCS project ID is the only difference; it tells the data task automation process whether to pick the file from the shared asset library or from a specific project's asset library.

 

We can have multiple data files in the same manifest, each targeting a different package.


Job Definition

Sample XML

<JobDefinition ID='ImportJobDefinition_1'>
    <Operation>Import</Operation>
    <ConfigurationOnly>No</ConfigurationOnly>
    <Truncate></Truncate>
    <Mode>Import async</Mode>
    <BatchFrequencyInMinutes>1</BatchFrequencyInMinutes>
    <NumberOfTimesToRunBatch>2</NumberOfTimesToRunBatch>
    <UploadFrequencyInSeconds>1</UploadFrequencyInSeconds>
    <TotalNumberOfTimesToUploadFile>1</TotalNumberOfTimesToUploadFile>
    <SupportedDataSourceType>Package</SupportedDataSourceType>
    <ProcessMessagesInOrder>No</ProcessMessagesInOrder>
    <PreventUploadWhenZeroRecords>No</PreventUploadWhenZeroRecords>
    <UseCompanyFromMessage>Yes</UseCompanyFromMessage>
    <LegalEntity>DAT</LegalEntity>
    <PackageAPIExecute>true</PackageAPIExecute>
    <PackageAPIOverwrite>false</PackageAPIOverwrite>
    <PackageAPIReexecute>false</PackageAPIReexecute>
    <DefinitionGroupID>TestExport</DefinitionGroupID>
    <PackageName>TestExportPackage</PackageName>
</JobDefinition>

 

Job Definition ID

You can use any unique name as the job definition ID.

Operation

Here we define the operation type, either import or export.

Configuration Only

This determines whether you only configure, or configure and execute. Set the value to "No" to configure and execute, or "Yes" to configure the project only.

Truncate

This field is always set to "No".

 

 

 

Test Group

We can have multiple test groups in the same manifest file.

Sample XML

<TestGroup name='Set up Financials'>
    <TestCase Title='Import shared set up data package' ID='3933885' RepeatCount='1' TraceParser='off' TimeOut='20'>
        <DataFile RefID='ProjectSetup' />
        <JobDefinition RefID='ImportJobDefinition_1' />
        <EntitySetup RefID='Generic' />
    </TestCase>
</TestGroup>

Name:

Any unique name for the test group.

Title:

The data task automation process uses the title to name the data packages it creates during execution.

Repeat Count:

How many times you want to process this section in one execution.

Time-Out:

The maximum time the framework waits for the package to execute.

 

Package Execution on the Source environment

 

Once your export manifest is complete, upload it on the source environment. Data task automation will create tasks that use the data packages you uploaded earlier to LCS to create the export packages on your source server.

 

In my scenario, I created and uploaded these packages to LCS.

Packages:

  • Package_AccountPayable
  • Package_AccountReceivable
  • Package_CashBank
  • Package_CreditCollection
  • Package_ExpenseManagement
  • Package_FixedAssets
  • Package_GeneralJournal
  • Package_Inventory
  • Package_ProcurmentSourcing
  • Package_ProjectManagement
  • Package_SysAdmin
  • Package_TaxAuthority

 

 

Reference Screenshot:

 

Generate Data Source

To generate the data source files, we need to upload our export manifest file into the Data task automation framework.

Pre-Requisite

Before uploading the manifest file to start the process, go to the number sequences screen and change the "Allow user changes" options from "No" to "Yes".

Please make sure to revert this change on both environments after the activity is complete.

Reference screenshot:


Steps to Generate Data Source

Step-1:

Log in to your source machine, navigate to the Data management workspace, and click the Data task automation tile.

Reference screenshot:



 

Step-2:

Upload the Manifest file on the source machine.

Reference screenshot:


 

After a few seconds, 12 separate export tasks will appear on your screen; if they don't appear automatically, refresh the screen.

Reference screenshot:



Step-3:

Execute the tasks either one by one, or execute all of them at once. The application will follow the defined sequence and start generating and exporting the data according to the given sequence instructions.

The overall export takes 20 to 30 minutes, depending on the volume of data.

Reference Screenshot:



 

Step-4:

Download all these packages and add the prefix of the legal entity you are planning to move from source to destination.



Data Cleansing

Here we need to perform some manual data cleansing, because some of the entities export cross-company information.

Step-5:

Remove the other entities' information from the files one by one (if any); otherwise data task automation overrides the state of the other entities' information, which may create problems for us. So it's better to perform the data cleansing at the file level.

In my case, I performed the data cleansing from the following files.

  • Account structure activation
  • Account structure allowed values
  • Account structures
  • Accounts for automatic transactions
  • Chart of accounts
  • Fiscal calendar integration entity
  • Fiscal calendar period
  • Fiscal calendar
  • Ledger fiscal calendar period
  • Ledger fiscal calendar year
  • Ledger settlement accounts
  • Ledger
  • Main account
  • Number sequence code V2
  • Number sequence references V2

Data Import Steps

Step-6:

Set the prefix of your legal entity name on every folder you downloaded in Step 4.



 

Step-7:

Now upload the data packages to LCS, as we did at the start.

Step-8:

Zip the folder again and upload it to LCS.



 

Step-9:

Now log in to your target environment and upload the manifest after making a small change in the XML, such as changing the names in the DataFile section.

Before Change



After change



 

 

Before change



After change



 

Step-10:

Navigate to the Data task automation screen on the destination server and upload the manifest file as we did earlier for the export.

All the tasks will appear on your screen within 2 or 3 seconds, or you can refresh the screen manually.

The sequence of execution in my case

Please execute the tasks in the following sequence.

  1. General Journal
  2. Cash & Bank
  3. Tax Authority
  4. Expense Management
  5. Fixed Assets
  6. Product Information (as product hierarchies and product categories are global-level setups, we don't need to execute this step; we can release the products using the OOTB job, which works faster than the import process)
  7. Credit & Collection
  8. Inventory
  9. Account Receivable
  10. Account Payable
  11. Project Management
  12. System Administration

 

Important

If any new financial dimensions are introduced, I suggest creating them manually on the target systems and activating them before starting execution of the packages.


Virtual Fields Vs Computed Fields

  Virtual Field: A virtual field in D365FO is a field that doesn't have a direct representation in the database. It's a field that y...