How-to set up BYOD
BYOD uses the export services of Dynamics 365 for Finance and Operations (D365 F&O), found in the environment’s “Modules > System Administration > Data Management” workspace.

The first step is to configure the entities to export to your external database. I will use an Azure SQL database as the destination for the BYOD setup. This step involves defining a connection string to the Azure SQL database and selecting the entities to “publish”. Publishing here means that the “create table” command is executed on the Azure SQL database.
Step 2 is to export the data, where you will have the option to set up recurring exports on a schedule.
Let’s begin and walk through the steps to set up BYOD.
Step 1: set up configuration to Azure SQL DB
Click “Modules > System Administration > Data Management”:

Click “Configure Entity export to database”:

Click “New” to define a connection string to the Azure SQL database:

Enter a Source Name and Description for this configuration.
The Type should be defaulted to “Azure SQL DB”.
Enter a Connection string, and set “Create clustered column store indexes” to True only if your Azure SQL database is on a Premium tier that supports columnstore indexes; otherwise, creating this configuration will fail. Click Validate to ensure you can connect to the Azure SQL database from within D365FO.

Note the format of the Connection string must be as follows (we are using a local SQL account, hence the value of “False” for Integrated Security followed by a User ID and Password):
“Data Source=xxxxxxxx.database.windows.net,1433; Initial Catalog=xxxxxx; Integrated Security=False; User ID=xxxxxxxx; Password=xxxxxxxxxx”
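Before pasting a connection string into the configuration form, it can help to sanity-check that it contains every key the format above requires. The following is a minimal Python sketch of such a check; the server, catalog, user, and password values in the example are placeholders, not real credentials.

```python
def parse_connection_string(conn_str: str) -> dict:
    """Split a 'Key=Value; Key=Value' style connection string into a dict."""
    parts = {}
    for segment in conn_str.split(";"):
        segment = segment.strip()
        if not segment:
            continue
        key, _, value = segment.partition("=")
        parts[key.strip()] = value.strip()
    return parts


def check_byod_connection_string(conn_str: str) -> list:
    """Return a list of problems found; an empty list means the string looks OK."""
    required = ["Data Source", "Initial Catalog", "Integrated Security",
                "User ID", "Password"]
    parts = parse_connection_string(conn_str)
    problems = [f"missing key: {k}" for k in required if k not in parts]
    # With Integrated Security=False (SQL authentication), both credentials
    # must be present and non-empty.
    if parts.get("Integrated Security", "").lower() == "false":
        if not parts.get("User ID") or not parts.get("Password"):
            problems.append("SQL authentication requires both User ID and Password")
    return problems


example = ("Data Source=myserver.database.windows.net,1433; "
           "Initial Catalog=byod_db; Integrated Security=False; "
           "User ID=byod_user; Password=s3cret")
print(check_byod_connection_string(example))  # → []
```

A check like this catches a missing key before the Validate button does, which is handy when the string is assembled by a script.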
Once the connection string is validated, you can Hide it (if you prefer). Now you will need to Publish the entities. Click “Publish”.
Note: you can exit the connection string workspace, then highlight the connection, and click “Publish” there, as shown:

Both ways will take you to a workspace titled “Target entities”, where you can scroll through all entities or search for specific ones. Highlight the entities you want to export and click “Publish” again.
There is also the option to enable CHANGE TRACKING for each entity. If you enable this feature, you will be able to export incrementally; if you do not, you can only do full exports. This feature must be enabled here, for each entity. Exporting data to those entities is covered in step 2 below.

As mentioned earlier, publish here means that the “create table” command is executed on the Azure SQL database.
You will see a popup that indicates your job has been published:

Next, you will receive a success notification if things went well, or you will see errors/warnings in the notifications that will need your attention before you can proceed.
At this point, if all went well, you should see the entity table(s) in the Azure SQL database. The next step is to export the data into those tables.
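One way to confirm the publish step worked is to query the Azure SQL database for the tables it should have created. The sketch below builds an `INFORMATION_SCHEMA` query for that purpose; the entity table names are illustrative, and the actual connection (shown commented out) requires pyodbc, the ODBC driver, and network access to the server, so it is left as an assumption.

```python
def published_tables_query(table_names):
    """Build a parameterized T-SQL query listing which expected tables exist."""
    placeholders = ", ".join("?" for _ in table_names)
    return (
        "SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES "
        f"WHERE TABLE_TYPE = 'BASE TABLE' AND TABLE_NAME IN ({placeholders})"
    )


# Illustrative entity table names -- substitute the entities you published.
expected = ["SalesInvoiceHeaderV2Staging", "CustCustomerStaging"]
sql = published_tables_query(expected)
print(sql)

# With a live connection (hypothetical; requires pyodbc and server access):
# import pyodbc
# conn = pyodbc.connect(connection_string)
# found = {row.TABLE_NAME for row in conn.execute(sql, expected)}
# missing = set(expected) - found  # should be empty after a successful publish
```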
Step 2: export data to Azure SQL DB
To export data, you will need to return to the “Modules > System Administration > Data Management” workspace and click “Export”:

You will now need to define an export job. Enter a Name, choose a Target data format (the configuration created in step 1), and select an Entity name, one at a time (choose only entities published in step 1), as shown:

Note that you also have the option to choose a Default refresh type (“Incremental” or “Full push only”). Choose “Incremental” only if you enabled Change Tracking in step 1; otherwise, select “Full push only”.
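The decision rule above is simple enough to state as a one-line helper, which can be useful if you script the creation of export jobs. This is just a sketch of the rule, not any D365FO API.

```python
def default_refresh_type(change_tracking_enabled: bool) -> str:
    """Pick the refresh type an entity can safely use in a BYOD export job."""
    return "Incremental" if change_tracking_enabled else "Full push only"


print(default_refresh_type(True))   # → Incremental
print(default_refresh_type(False))  # → Full push only
```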
Click Add entity, and you will see the following message:

You will then see the entity listed on the right side, under “SELECTED FILES AND ENTITIES”, as follows:

Repeat this process to add as many entities to this export job as you’d like (as long as those entities were published in step 1). Finally, click “Save” in the top left to save this export job and run it:

You will be able to monitor the progress of the job run and see data being exported to the Azure SQL database:

Once completed, you will see the Execution summary for the job, as shown:

And a notification of success will also appear:

Finally, back in the “Modules > System Administration > Data Management” workspace, you will see your newly created and run job appear under “Data projects” and “Job history”:


How-to search system logs for BYOD (and other) errors in D365 F&O
In this article I will show how to search the system logs for errors that occur when running BYOD jobs. The obvious place to look for error messages is in the Job history of the BYOD job, but sometimes the message there is not detailed enough to understand what caused your job to fail.
Example
Here, I have a BYOD job containing 2 entities, one of which failed. In order to see the error message, I click on “Execution details” in the Job history and then “View execution log”:

In the execution log, there are a few things to look for, the first obviously being the error message itself. However, the error message may be too generic or even non-existent.

You will notice that the Staging status is ‘Ended’, meaning that staging the data for the BYOD job ran without issue, but the Target status is ‘Error’. This means that writing the data into the destination caused the job to fail. The Log text for the Source (i.e. staging) is displayed and tells us that “‘154316’ ‘Sales transactions’ record(s) inserted in staging”.
Then, by selecting the Target, you can see that the Log text for the target part of the run does not give any details:

So, what do we do now? It is apparent that the Job history does not give us enough detail (in this case) to resolve the issue. And we cannot simply do nothing, because the error will occur again in the next run of the BYOD job.
Let’s log on to LCS (Lifecycle Services) and look through the raw log files. You may also hear this log referred to as the ‘telemetry log’.
How-to search system logs for BYOD (and other) errors
Log on to LCS and select the implementation project for your company:

Click “Full details” for the environment in which the error occurred:

Click on “Environment monitoring” in the bottom half of the screen:

Next, click on “View raw logs”:

Enter a Start date, an End date, and any Search terms (comma separated), then click “Search”:

Note: In the drop down box for Query name, you will see a listing of events through which you can search:

Note: If you click “Show options”, you will be able to select how many rows to bring back and make the search time non-UTC:

After you click “Search”, you will see a message that the log files are being searched for your Search terms:

Once the search results are returned, you will see a listing of activities with many columns of information. Scroll to the right until you see the column titled LevelName, which contains values such as Information and Error:

Click on the column header LevelName (or alternatively, scroll downwards until you see “Error”):

Scroll to the right until you see the column titled exceptionMessage, which will contain the message associated with that error:

You can copy & paste this message into another file/document to help you decipher its meaning and to determine steps to rectify the issue.
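If you export the search results rather than scroll through them, sifting out the error messages is a small scripting exercise. Here is a minimal sketch, assuming each row is a dict keyed by the column names described above (LevelName, exceptionMessage); the sample rows and their messages are made up for illustration.

```python
def error_messages(rows):
    """Return the exceptionMessage of every row whose LevelName is 'Error'."""
    return [
        row.get("exceptionMessage", "")
        for row in rows
        if row.get("LevelName") == "Error"
    ]


# Illustrative rows -- real exports will have many more columns.
sample_rows = [
    {"LevelName": "Information", "exceptionMessage": ""},
    {"LevelName": "Error",
     "exceptionMessage": "Timeout expired while writing to target"},
]
print(error_messages(sample_rows))  # → ['Timeout expired while writing to target']
```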
When finished, click the X in the top right corner to close this search.
