Export to Azure Data Lake Overview

The solution was developed to enable you and your team to connect your F&O environment to a data lake, unlocking insights hidden in the data that comes with your Dynamics 365 subscription.

The only step required to get started is to install the Export to Azure Data Lake add-in in LCS, specifying the parameters associated with your Azure storage account.

The add-in enables a microservice that lets you choose which data to export, either periodically or, optionally, in near real time.



Turning to the advantages of the solution, we should start with the service's ability to automate the export process, which eliminates the vast amount of time usually spent managing and monitoring exports.

Furthermore, it reduces the burden on your F&O workloads, so you don't need to worry about the exports themselves adversely impacting performance elsewhere. Additionally, storing data in the lake costs significantly less than storing it in relational databases.


The interface has been simplified to let users select data by table, which means there is no longer any need to develop custom entities just to export data. If an entity holds data you want to export, all you need to do is select that entity; the related tables are selected automatically in the back end based on your entity selection.


The exported data is organized into a folder structure in Common Data Model (CDM) format, with additional metadata in machine-readable JSON that allows downstream tools to determine the semantics of the data. These tools include modern data warehouse technologies such as Power BI, Azure Synapse Analytics, and more.
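To make the CDM idea concrete, here is a minimal Python sketch (standard library only) that parses a simplified, hypothetical manifest of the kind a CDM folder contains. The entity name, attributes, and partition path below are invented for illustration; real manifests produced by the feature carry more fields.

```python
import json

# A simplified, hypothetical CDM manifest. Real manifests written by the
# Export to Azure Data Lake feature contain additional metadata.
MANIFEST = """
{
  "name": "cdm",
  "entities": [
    {
      "name": "CustTable",
      "attributes": [
        {"name": "AccountNum", "dataType": "string"},
        {"name": "CreditMax",  "dataType": "decimal"}
      ],
      "partitions": [
        {"location": "Tables/CustTable/CUSTTABLE_00001.csv"}
      ]
    }
  ]
}
"""

def list_entities(manifest_text: str) -> dict:
    """Return a mapping of entity name -> list of (attribute, dataType)."""
    manifest = json.loads(manifest_text)
    return {
        entity["name"]: [(a["name"], a["dataType"]) for a in entity["attributes"]]
        for entity in manifest["entities"]
    }

entities = list_entities(MANIFEST)
for name, attrs in entities.items():
    print(name, attrs)
```

This is how downstream tools "determine the semantics" of the exported files: the CSV partitions carry no headers or types of their own, so consumers read the manifest first.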


Once the feature has been adopted, you can transition away from options like BYOD (bring your own database) without incurring major costs, since existing downstream consumption pipelines can, in many cases, be preserved.


Many existing reporting tools work directly with SQL databases, using SQL to read the data, which is good but may not be enough. With this solution, you can create a SQL Server endpoint over your lake through Azure Synapse Analytics. Because Synapse includes serverless (on-demand) SQL capability, the data lake can be queried with the SQL language.
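As a sketch of what querying the lake through a serverless SQL endpoint looks like, the Python snippet below builds a T-SQL `OPENROWSET` query over a CSV file in the lake. The storage account, container, and table path are hypothetical placeholders; substitute the values from your own environment.

```python
# Sketch: build a T-SQL query for an Azure Synapse serverless SQL pool that
# reads CSV files exported to the lake. All names below are hypothetical.

def openrowset_query(storage_url: str, path: str) -> str:
    """Return a serverless SQL query over CSV files in the data lake."""
    return (
        "SELECT TOP 10 *\n"
        "FROM OPENROWSET(\n"
        f"    BULK '{storage_url}/{path}',\n"
        "    FORMAT = 'CSV',\n"
        "    PARSER_VERSION = '2.0',\n"
        "    FIRSTROW = 2\n"
        ") AS rows;"
    )

query = openrowset_query(
    "https://mylakeaccount.dfs.core.windows.net/dynamics365-data",
    "Tables/CustTable/*.csv",
)
print(query)
```

You would run the generated statement against the Synapse serverless SQL endpoint with any SQL client, which is why reporting tools that only speak SQL can keep working.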

Your data integration pipeline might also be able to consume files in Data Lake.
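For a pipeline that consumes the files directly, the sketch below shows one way to discover and read the CSV partitions of an exported table. A local temporary directory stands in for the lake folder here so the example is self-contained; the table and file names are invented.

```python
import csv
import tempfile
from pathlib import Path

# Sketch: a downstream pipeline reading CSV partitions laid out in the
# lake's folder structure. A temp directory stands in for the lake, and
# the table/partition names are hypothetical.

def read_exported_rows(lake_root: Path, table: str) -> list[dict]:
    """Read all CSV partition files for one exported table into dicts."""
    rows: list[dict] = []
    for part in sorted((lake_root / "Tables" / table).glob("*.csv")):
        with part.open(newline="") as f:
            rows.extend(csv.DictReader(f))
    return rows

# Simulate an exported table with a single partition file.
root = Path(tempfile.mkdtemp())
table_dir = root / "Tables" / "CustTable"
table_dir.mkdir(parents=True)
(table_dir / "part-00001.csv").write_text(
    "AccountNum,CreditMax\nC-001,5000\nC-002,12000\n"
)

rows = read_exported_rows(root, "CustTable")
print(rows)
```

In a real pipeline the same enumeration logic would run against the storage account (for example via an ADLS client or a mounted path) rather than a local directory.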

How to Get Started?

Here are the steps required to get started with the feature.

Step 1

Ensure your F&O environment is in a supported region (e.g., the US, UK, Canada, or East Asia) so that you are able to install the Export to Azure Data Lake add-in.

Step 2

The next thing to do is to configure the Azure resources in your subscription, including the Azure Active Directory application, storage account, key vault, and your secrets.
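The outputs of this step are the values the add-in installation later asks for. As a minimal sketch, here is a checklist of those settings as a Python dictionary with a blank-value check; every name and ID below is a hypothetical placeholder, not a real resource.

```python
# Sketch of the settings Step 2 produces, which the add-in installation
# later asks for. All names and IDs here are hypothetical placeholders.
lake_config = {
    "aad_application_id": "00000000-0000-0000-0000-000000000000",
    "storage_account": "mylakeaccount",
    "key_vault_dns": "https://my-keyvault.vault.azure.net/",
    "secret_app_id_name": "app-id",          # secret holding the app ID
    "secret_app_secret_name": "app-secret",  # secret holding the app secret
}

def missing_settings(config: dict) -> list[str]:
    """Return the names of any settings that are missing or blank."""
    return [key for key, value in config.items() if not str(value).strip()]

missing = missing_settings(lake_config)
print("missing settings:", missing)
```

Recording the values in one place like this makes Step 3 quicker, since the installation wizard prompts for them together.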

Step 3

Then, install the add-in that enables the microservice in the chosen environment; during the installation process, you will be asked to specify the Azure resources you configured.

Step 4

Once the add-in is installed, select the data to export; the system then manages the export automatically, on an ongoing basis.

Step 5

Finally, you can join the Yammer Group to stay in touch and ask questions that will help you understand the feature as well as upcoming improvements and enhancements.

Connecting to Dynamics 365

Let's also take a quick look at the steps needed to connect the data lake to Dynamics 365.

To connect the data lake to Dynamics 365, use LCS to select a tier 2 (or higher) environment and configure the add-in with the settings for your storage account and key vault.

Keep in mind that, since this configuration is stored in LCS (not in the environment database), it is independent of any data refresh you might perform.

Once connected, teams can choose data by browsing entities and selecting the required tables through a simplified interface. When the data has been selected, the export is initialized and runs automatically.

Modern Analytics Architecture in Azure

Traditional intelligence capabilities, including an existing library of SSRS reports, the beloved Excel, embedded Power BI, and electronic reporting, remain useful but look a bit outdated compared to the new solution.

The Export to Azure Data Lake feature provides a much easier way to export data to your lake. Once exported, the data is automatically organized in CDM format along with various pieces of metadata, such as the entity shapes. You can then combine it with data from other sources, legacy databases, and so on.